Technology

Open Source, GIS and Data Visualization?

Posted by Cliff
from the revolutionizing-how-we-look-at-data dept.
Mubarmij asks: "A lot of people, including the ex-Vice President, think that Terrain Visualization and the Georeferencing of all kinds of data is the next big thing. Given the broad applications (sims, education, games, GIS, virtual tourism, etc.) that can be derived from such technology, I would tend to agree that if this is not actually the NBG(tm), then at least it is very close. Like the internet, this technology has taken its time in reaching its current level of sophistication. However, there is huge potential here that has yet to be tapped, despite the fact that it currently fills a huge niche market. I once read that NASA spends more than 70% of its resources on space imaging and visualization-related activities (unfortunately I have lost the link to the article that mentions this, but remember that the major goal of most space satellites is to take multihued pictures of Earth and other planets, and you will see that this is not an exaggeration), which is quite a lot of money." Open Source has provided several frameworks for GIS from which a "killer app" may spring. Read more on the various Open Source GIS projects below, and feel free to share your thoughts on where this technology may head in the future.

"There are quite a few web sites, commercial and non commercial that tend to this technology. However, it seems like the early nineties, where people are just starting to get aware of the internet, but are still awaiting for the killer app to make this thing fly.

There are two open source projects I am aware of that deal with this area. The first, VTP, is a real open source project attempting to create a real terrain visualizer. The second, OpenSkies, is not really open source yet (despite its claim)... but it is interesting in that it allows networked people to fly or drive through virtual worlds that are reality based.

Here are a few other questions:

  • Do you think that this technology will remain a niche market (albeit a big one)? If not, when is this likely to change?
  • Are you aware of any open source projects other than the two mentioned above that deal with this area?"
Interested readers will also want to check out Drawmap and the longstanding Open Source GIS, GRASS.
  • by Anonymous Coward
    ...but none of the current stuff will make it big. You guys are missing the big picture. There's something that will eventually grow and will require MASSIVE GIS support: mobile computing. Here in Europe, location-based services are springing up every day. As soon as the location of user agents is widely supported by network operators, GIS businesses will be on their way to providing data to all the application providers. They will provide the infrastructure for the new mobile internet, and that is a massive market.
  • by Anonymous Coward
    Check out vis-sim.org [vis-sim.org], a visual simulation 'portal', and search for 'open source'...
  • by pb (1020)
    VTP looks really cool.

    Source Download Is Here [vterrain.org]

    I'll tell you if I can build it. :)
    ---
    pb Reply or e-mail; don't vaguely moderate [ncsu.edu].
  • by pb (1020)
    Sorry, Ben; I couldn't find an easy way to get to the source, but google could.

    Perhaps in the future you might consider using a robots.txt file to help restrict this as well?

    Since it doesn't look like I'll be able to build this on Linux easily, you won't have to worry about my copy of the source. :)
    ---
    pb Reply or e-mail; don't vaguely moderate [ncsu.edu].
  • You cannot just sit in your room and hack those out.

    No, but you can start with US Census TIGER data, and work with dozens of other sources of free data.

    I'm currently writing a program that generates databases for CoPilot [xcski.com], a free Palm Pilot flight-planning program. Getting the data and writing scripts to import it is the hard part, partly because new data comes out every 28 days. The only problem I have right now is that the FAA "order form" for digital data doesn't give any feedback, so when, two weeks after you filled in the form, you still don't see any data, you don't know whether they didn't get the order, they're just really slow processing orders, or they're waiting until the next data update cycle. So I have some older, less complete data from another source for now, but I think I know where to get the latest and best data cheap.

  • The really sad thing is that I worked for ESRI's big competition (GeoVision) in the 1987-1992 time-frame. Back in 1991-2 I got interested in Linux, and I actually considered stealing a copy of the source code from work to try to port it to Linux (which I would have given back to the company). It wouldn't have been a big deal, since it ran on SunOS, Ultrix and AIX. The only problem at that time was the lack of a decent SQL database for Linux. Mostly our stuff ran on Oracle, but it also ran on a few others. The other major problem was that the only way I could transfer all the source code for a huge GIS application from work to my home PC was on 3.5" floppies. That would have been more painful than installing SLS 1.03 from 5.25" floppies, which I had also done.

    GeoVision's product is still around, after going through a series of owners. It's still a damn good GIS. And it would probably be dead easy to port it to Linux now, since Oracle is on Linux. Heck, I think even Oracle MultiDimension, a product I worked on at Oracle, is available for Linux.
  • There is an application called Karte [sourceforge.net] that I have been working on that allows a user to scan a collection of maps and georeference them. Then the user can trace routes and set waypoints for download to a GPS.

    The application is currently under heavy development, but most of the features for maintaining a database of maps exist. The application can be had from CVS at SourceForge.

  • Has anyone seen the well developed, supported, and implemented OpenGIS [opengis.org]?


  • And it implements at least portions of the OpenGIS consortium Web Map Server spec, although it hasn't passed its validation tests. This means that it just might interoperate with everyone's favorite GIS 900 lb Gorilla, ESRI. It already reads shapefiles and talks to ESRI's Spatial Database Engine.

    This is based on the claims on their web page; I (coincidentally) found it just the day before I read this and haven't had a chance to download and play with it. The ability to talk to SDE could be hugely significant for the projects I am involved with.

  • hey gleam. whats up. oh...aerial mapping :P hehehehe
  • http://mapserver.gis.umn.edu/ is truly open source; it supports Linux/Apache, most Unices, and the Windows family too. -- andika
  • by unitron (5733)
    Next Big Game?

    Or was that supposed to be NBT (Next Big Thing)?

  • If you have used Arc/Info 8 or above, you know that everything now uses COM. They have also moved all the personal geodatabases to use JET, the most awful piece of trash DB interface/engine. Microsoft has given them lots of money to make sure nothing but Windows NT/2k gets supported from here on out. They are even dropping support for Irix, of all things. Fuck ESRI. BTW - has anyone noticed they really are following in Microsoft's footsteps? Arc/Info 8 is the most crash-ridden, buggy package they've put out to date! And with the ultra-stable underpinnings of a Microsoft OS and API, it can only get worse!
  • by MAXOMENOS (9802) <maxomai AT gmail DOT com> on Friday April 13, 2001 @03:39PM (#292934) Homepage
    I'm not sure where you're getting this idea that data visualization is a matter of wanting things to be too easy. The fact is that a picture does a great deal more to help us build a model of how a phenomenon works than a whole shitload of numbers in a table.

    Human beings are pattern recognizers. Our hardware is wired best to look/listen/feel/taste and match that stimulus to some other previously established pattern. We put a lot of effort into data visualization because it is easier for us to look at a picture and create a pattern than to look at raw data and create a pattern. Even the most difficult mathematical phenomena (noncommutative groups) make better sense when you give someone a concrete or highly visual example (Rubik's Cube, arrangements of books, etc.)

    Sometimes the most brilliant of mathematical work comes from building a simple model that greatly simplifies the work (think Feynman diagrams).

    This can even apply to theology: when God intersects in our lives, we try to create a pattern to explain what God is, even though God is unfathomable. And thus we end up with many people who end up believing in Jesus or Allah or Nirvana, and then some people who end up believing in the UFO behind the comet. This also uncovers the problem with pattern matching: sometimes the patterns we find are dead fucking wrong (Aristotle). Fortunately we also have scientific method and reason to weed out self-contradicting, nonsensical, or otherwise false patterns.

    ObJectBridge [sourceforge.net] (GPL'd Java ODMG) needs volunteers.

  • Although it is difficult for many of us to see things, such as The Lord, the fact is that after some proper imagination and hard work, anything is possible.

    You made a Hell of an interesting point. Exactly a Hell of it. As there is a WHOLE set of testimonies of visions of Hell. And a few of Heaven. Now wouldn't it be interesting to MODEL them? Maybe they would help some people to see what is waiting for them.... And help morale 'round here on Earth.

    Anyway, that's not for me. I, like every old Hacker who lived through the end of the XXth century, will go to Heaven...

    "Ok my son, you are in Heaven, what do you wish?"
    "Infinite munition/all weapons... God Mode... and send me to HELL!!!!"
  • I know exactly what you're talking about. In a previous job I had, they'd spent about $2 million on a GIS up to that point. I worked in the plan room, where we maintained the old paper & mylar drawings, including maps.

    The maps the GIS people produced were supposed to replace the old ones eventually, but had lots of problems. For one example: on their screens, they used lots of colors and even printed them in color, but when copied they looked really bad and light colors disappeared.

    Also, the only access to the GIS was in their office, and they were the only ones allowed to use it. They had some gee-whiz demos, but nothing very useful. That was just my first exposure to GIS though, about 7 years ago.

    I think the situation has improved since then, especially since many GIS products can use a web browser as the client. There are many opportunities for open source/free software to commoditize the GIS backend & use web browsers as the frontend.

    Now I've gone back to school for civil engineering, and I'm thinking of building a GIS for some municipality as my senior project, using all open source tools.
  • Your comments are right on the mark. Too many people simply think that GIS = "visualization". GIS is not so much about visualizing geographic data (although that is important, it is also the "easy" part) as it is about collecting, organizing, managing, and analyzing spatial (geographic) data. There are all sorts of crucial issues involving spatial and temporal accuracy (too often neglected by uninformed users of such systems), not to mention issues involving the properties being measured. Creating/updating data is expensive. And don't forget the metadata records to keep track of everything! Spatial databases can be far more complex than ordinary relational databases because of varying spatial relationships. Analyzing such data is also much more involved than most basic statistical methods allow for. It is not too far from the truth that many users of commercial GIS systems don't have much of a clue regarding many of these issues. Certainly very few open source projects deal with all of them. GRASS sort of does (it has hooks to various databases and to S-plus), but in many ways it is a software dinosaur (but hey, it works!)
  • The "Open" in Opengis should not be confused with the same meaning as "Open" source community or even "open" to the public. This is a closed organization (paid membership only) that works on standards behind closed doors. As a result, an "industry" bias is to be found in the specifications. "Open" only refers to be possibility of data interchange between vendor's systems (sort of like Microsoft claiming that its published APIs are "open".) Sure, they eventually publish some standards after the fact - but how many vendors are really implementing them?
  • Hey - don't forget that other open source Java-based visualization system VisAD: http://www.ssec.wisc.edu/~billh/visad.html [wisc.edu]
    This one is more raster-based (e.g. topo and satellite data) rather than vector-based. But nonetheless, quite impressive software.
  • I don't disagree - I think it would be great if there were open source implementations of these various interop GIS standards, an end-run around the vendors as you mention. However, this cannot be done if you don't have all of the standards :-) And it would be very helpful to understand the evolving arguments and logic that go into the various proposals and developing standards - thus my complaint that OpenGIS is not very "open". Considering that OpenGIS has been around for nearly 8 years now, it really hasn't made much of an impact - which is unfortunate (and I certainly don't blame the original organizers of OpenGIS - in my view, the commercial GIS industry has been dragging its feet, which is perhaps not surprising). These views about OpenGIS are hardly mine alone - there have been similar complaints published in the trade literature over the years. It would also appear that how much gets published from the various OpenGIS subcommittees is highly dependent upon the views of the individuals/organizations on those subcommittees (i.e., some are more "open" than others).

    Now indeed, various groups/individuals have been attempting to implement these OpenGIS standards as best they can given what is available. You might have a look at:
    geotools [sourceforge.net]
    OGR Simple Feature Library [velocet.ca]

    While not OpenSource, the following is an interesting implementation of an OpenGIS compatible map server:
    DEMIS World Map Server [demis.nl]

    (This OGC Web Map Server spec can be found at: 00-028.pdf [opengis.org])

  • Well, firstly that gawd-awful "shapefile" format was created (and since modified) solely by ESRI.
    An "industry standard" perhaps - as ESRI controls 60-80% of the GIS market (depending upon what segment you look at). Secondly, a shapefile is not the standard way in which Arc/Info stores its data - these binary formats (e.g., the "e00" files) are entirely proprietary. There have been some attempts to try and reverse engineer this format - use google to look up "e00 format".
  • >Really, I wish the GRASS project were moving faster (or moving at all).
    It is moving, there is even a Cygwin version [uni-hannover.de] nearing the beta stage. You just have to watch the mailing list [uni-hannover.de], also the European website [uni-hannover.de] is usually more up to date than the North American one.
    >So if you want to open one of those, now standard file types, better be prepared to pony up some real cash.
    There are libraries and utilities which allow you read and write native ArcInfo file formats. The best place to start looking at these is on FreeGIS.org [freegis.org] in the conversion tools [freegis.org] and libraries [freegis.org] sections. -matt
  • ...the technology is easy, the data is hard.
    Better said: the technology, relative to the data, is easy, from our current place in GIS development. {Aside} As we refine our GIS technologies and move into areas focused more on the relationships between data nodes/collections, the technology gets much more complicated. However, in the main, I agree with you: the data is hard. {/aside} The data side of GIS is even harder for those who live outside the US and New Zealand. Most countries have placed cost-recovery principles at the forefront of their geospatial data distribution policies. From an economic perspective this seems to make a lot of sense; building quality GIS data is *expensive*, and recovering some of that expense is an attractive idea. However, in practice it doesn't really work, at least not here in Canada, the area I am familiar with. What happens in reality is that only big corporations and government have deep enough pockets to afford the data, so the smaller organizations either attempt to recreate it themselves (at varying levels of quality) or do without it. Then another project comes along which needs both datasets, and the data has to be re-worked *again* in order to make the two datasets match each other. Of course so much money and effort goes into this data massage that nobody wants to give all that work away... and so the cycle repeats, bleeding money with every spin. The other arm of this cost-recovery pincer is the use and distribution license terms. In order to keep charging for the data, users must not be able to redistribute the data.
So if I use a watercourse described in the Canadian National Topographic Database digital base as a boundary polygon for my own data, under the terms of the NTDB license agreement I would not be able to distribute that boundary to anybody (with or without a monetary transaction) without paying a royalty to Geomatics Canada, since the end user would be able to extract the coordinate pairs from my poly boundary and recreate the original (copyrighted) NTDB watercourse. Add to this recipe that the premier users of GIS data are governments, and it means that, in the Canadian case, tax dollars paid for the data collection in the first place, and then tax dollars are used again to allow other governments, and other branches of the same government, to use the data. The result is that no cost recovery takes place. Anyway, this is all an introduction to why the Canadian Free Geospatial Data Committee [home.net] was created. There is a petition [home.net] to help Geomatics Canada change their distribution policies, as well as a comments [home.net] page. If this issue interests you, please visit the site and share your views. -matt
  • GIS technologies have many applications beyond the arena of geographic information systems. Additionally, it should be noted that GIS itself has benefited from the convergence of many different fields that have traditionally been open. GIS is essentially a combination of a number of different areas of study that grew out of needs outlined by the DOD and NASA to study planets remotely and to study the earth from airborne platforms and, more recently (starting with Corona), space. Mathematics, geometry, and statistics (all open source at heart) are at the roots of GIS. Where future development of GIS technologies will go is anyone's guess, but it should not be limited to traditional areas of GIS application, and realizing the potential benefits of open source paradigms can help a variety of fields whose fairly esoteric uses can have a huge impact upon society. For example, consider that image classification algorithms (both supervised and unsupervised) can be applied to many areas of bioscience to determine crop health, or to determine tumor potential in both plants and animals. We use GIS tools, specifically unsupervised classification routines such as k-means and ISODATA, to determine the biochemical phenotypes of tissues at cellular resolution, and the whole field of genetic microarrays has recently started using these same clustering algorithms to mine gene data. Other examples include ophthalmology, which has benefited along with (obviously) dermatology and computer vision (in hundreds of applications from food safety to manufacturing), political science, geology, etc., etc., etc....
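The k-means routine mentioned above is simple enough to sketch in a few lines. A toy, pure-Python version of the clustering step (deterministically seeded from the first k points for clarity; the classification tools the poster uses are of course far more elaborate):

```python
def kmeans(points, k, iters=10):
    """Minimal k-means sketch: repeatedly assign each point to its
    nearest centroid, then move each centroid to the mean of its
    assigned points. Points are tuples of coordinates (or band values)."""
    centroids = [tuple(p) for p in points[:k]]  # seed from the first k points
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # squared Euclidean distance to each centroid
            nearest = min(range(k),
                          key=lambda i: sum((a - b) ** 2
                                            for a, b in zip(p, centroids[i])))
            clusters[nearest].append(p)
        for i, members in enumerate(clusters):
            if members:  # keep the old centroid if a cluster empties out
                centroids[i] = tuple(sum(c) / len(members)
                                     for c in zip(*members))
    return centroids, clusters
```

ISODATA works on the same principle but additionally splits and merges clusters between iterations, which is why the two are usually mentioned together.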
  • I would agree that implementation of OpenGIS standards (before their web mapping work which is widely supported) has been very disappointing.

    However, even where not widely implemented as defined in the specification, early efforts like "OpenGIS Simple Features" have had a broad influence on the industry.

    Furthermore, recent ESRI offerings (ArcGIS 8.1) include support for the Simple Features OLEDB interface, and AutoCAD MapGuide supports SF OLEDB as a data source.

    I have made extensive use of the OGC "Well Known Text" format for defining coordinate systems, and have also implemented some simple-features work as shown at:

    http://gdal.velocet.ca/projects/opengis/ [velocet.ca]

  • One open source project not mentioned is CAVOR ("Creating A Vision Of Reality", if you insist it stand for something). (See the Cavor project page at SourceForge [sourceforge.net].)

    This is primarily the engine for a GIS, the database, graphics display, user interface, scripting, etc, that developers or end-users can further tailor to their specific application. The same engine can be used for other applications that include both a spatial/graphic part and a database part (eg CAD, UML diagrams, PERT charts, etc).

    It's still in development -- the display list manager is working (and used in a spin-off, 'cvv'), the database interface and scripting language is in progress.

    The architecture is based loosely on GeoVision/SHL/AutoDesk's high-end "VISION*" system. (In its time -- before bankruptcy and the transfer of the technology to SHL and then AutoDesk -- GeoVision pioneered several of the technologies widely used in GIS systems today, such as storing spatial data in a relational database.)

    The home page is at http://www.cavor.org although I'm not sure the server is up to a slashdot effect.
  • GIS just involves a lot of data

    Yep, and it used to be that that data was very expensive -- involving lots of hand labor digitizing maps and aerial photos. That's still true to a certain extent, but much less so.

    I've gone from one elegant-but-buggy GIS product (VISION*) to just doing "GIS" by connecting my CAD maps (Microstation) to databases,

    I'm curious as to when you were using VISION* and in what context. It's been nearly a decade since my involvement with it (I did a lot of the requirements analysis and design for 2.0 -> 2.1).
    And connecting CAD to a database is a nightmare -- I was once involved in converting a hybrid AutoCAD+dBase system to VISION*'s forerunner. Yuck!

    Oh, and PS, that bodes ill for Open Source GIS software outside the academic world, because big organizations have a positive fondness for the "Microsofts" of any industry they buy from, and an aversion to "unsupported" products.

    Sure, AutoDesk (who currently owns VISION*) isn't going to be worried about CAVOR [sourceforge.net]. Were I a major telco or power company or large regional government, I'd go with them for the support, training, handholding, etc. (Heck, when I was with GeoVision we wouldn't talk to anyone with less than a quarter million to spend.)

    But there's still a niche for open source GIS. Heck, look at the "travel map" software out there -- that's a rather simplistic GIS, to be sure, but it shares common elements. Think of what, say, real estate agents could do with a GIS system.

    And beyond that, think of the other application domains that share characteristics with GIS. You mention trying to tie a CAD system and a DB together. How about a CAD system with a built-in DB interface? Or any number of other applications that have both a spatial or graphical aspect and a database aspect -- software diagramming tools (UML, other CASE tools), project management (PERT charts), various sorts of CAD, and so on. (This is part of what the CAVOR project is all about -- I always felt that GeoVision never quite understood the potential of their technology. I once prototyped a CASE tool (DFD's, ERD's and the like) on VISION* in about a day.)

    Give folks a free toolkit like that and the uses and applications will come, and they won't all (by a long shot) require gigabytes of data. It's when the tool cost is high that it's only used by those with a lot of data to manage.
  • Hi Paul!

    Take a look at Cavor (http://sourceforge.net/projects/cavor/). It isn't exactly VISION*, but the overall architecture is similar and that's my model (as much as I can remember, anyway).

    Cheers,
    -- Alastair

    (Another ex-GeoVisioner -- and yes, the architecture beat heck out of what ESRI was peddling back then. I haven't been close enough to it lately to know about now.)
  • by LL (20038) on Friday April 13, 2001 @03:23PM (#292949)
    This is one of my pet peeves, so bear with me for a while. Non-experts routinely consider satellite-draped imagery etc. as visualisations. Strictly speaking, data visualisation is *INSIGHT*, not graphics. It comes down to how efficient the encapsulation of high-level knowledge is. Take, for example, the difference between a sat-photo of a hurricane and a meteorological map. The first is visually rich (color-enhanced, digitally sharpened, etc.) but the second is more useful, as it codifies the fronts, pressure zones, wind direction, etc. The processed information is specifically designed to remove clutter and enable an expert to quickly determine key features and artifacts. The same principle applies to GIS. People are not interested in pretty pictures; they are interested in the rate of contamination flow, the distribution of traffic density, the classification of vegetation regrowth, etc. GIS is an enabling technology, and yes, it does require lots of graphics processing, but the real work is in preparing the data, not in the visualisation. One Natural Resource Management group I worked with said that 80% of the work is in cleaning up the data. Visualisation is just a tool to help them accelerate the process of finding bad data.

    As for the field of displaying quantitative information, the recommended books are Tufte [yale.edu]. It is actually quite hard to create intuitively understood data visualisations, because our eye-brain can only measure simple things like intensity, distance, etc. That's why things like pie charts, where the angle is directly proportional to the proportion, work. All the other data visualisation techniques (parallel coordinates, tensors, etc.) require a fair amount of training and patterning before you can pick up the meaning. A geologist (or someone in a related discipline) would be able to look at a contour map and "see" in eir mind's eye the slope and elevation. Lesser mortals would probably require a pan of a 3D VRML model, and even then have difficulty recalling specific features. Adding extra layers or texture maps might be aesthetically pleasing (cough*QUAKE*cough) but doesn't really add any extra information.

    LL

  • by Xthlc (20317) on Friday April 13, 2001 @02:49PM (#292950)
    It's mainly for the folks who need a 2D, Java-based solution. It's pretty darn useful; lots of great projection math utilities, a nicely structured architecture, support for a wide variety of GIS products out there, and has transparent support for delivering imagery over a network. Plus the platform of choice for primary developers (BBN) is Linux. :) http://openmap.bbn.com
  • I agree, ESRI is worse than MS. Government agencies all over the world are now all migrating to ESRI as ESRI locks up the market. ESRI now owns the default file standard (just as MS owns the Word .doc standard and can keep messing with it to make life difficult for competitors). But in ESRI's case, a seat at ESRI's ArcInfo costs about $18,000. So if you want to open one of those, now standard file types, better be prepared to pony up some real cash. My small town government can't afford this, and the county is even more financially strapped, but they feel they must in order to keep up to date, technologically.

    Really, I wish the GRASS project were moving faster (or moving at all). It was well positioned some years ago, but has lost badly to ESRI in the past year or two.
  • I have seen companies save thousands, nay millions, of dollars through the use of GIS. Saturn, a telco in New Zealand, is one example.
  • With an open moving-map and route-planning program, the market would be for the data you mention. Sure, TIGER is a good starting point for the USA. Would you be willing to pay a few bucks more for map updates, with food/gas/hotel additions? Many people would.
  • I will mention that there is a moving-map program for Linux which seems to be almost ready. I won't give a link so as to not distract the cathedral builder.

    There is also an open routeplanner [sourceforge.net] project.

  • ESRI HAS competition. For example, Smallworld [swldy.com] GIS [swldy.com] is available for Linux. Smallworld GIS really beats ESRI in all aspects except performance and price. It has a Java-like portable virtual machine and a Smalltalk-like programming language, which probably scares lots of potential users.
  • Amen ;-)

    The grandparent post makes no sense, how is using computers to visualize terabytes of data "cheating" or somehow being "Un-Jesus like"?

    and why Jesus figures into a conversation about GIS is way beyond me...

  • I don't understand why this has been moderated up to 4. While I admit it might be an acceptable philosophical opinion for some people, it's completely off topic. I have noticed there is a serious flaw in the way people moderate and post (as some off-topic subjects are brought to people's attention by higher moderation); this makes the discussion go in directions which have nothing to do with the initial subject, and that is really annoying.

    Please don't consider this a troll; I just consider the question asked of slashdot very interesting. Can we please stick to just "Open Source GIS"? Please don't bring in vaporous philosophical noise.
  • by q[alex] (32151) on Friday April 13, 2001 @03:04PM (#292958) Homepage
    everyone remember the challenger explosion? anyone remember that a couple of NASA scientists had done a bunch of research on o-rings and had determined that under specific conditions, that they wouldn't work as they were supposed to? but the challenger launch wasn't stopped, because the research was presented with such poor visualization that the managers of the projects (who didn't really understand the science) couldn't make informed decisions. part of visualization is helping really, really smart people show off their ideas to the rest of us.
  • You have a point. Perhaps the Einsteins of the world shouldn't use advanced graphics. However, for the rest of us, it helps a whole lot. So, this is not an effort to help the best and brightest achieve their best, they'll do that no matter what. Rather, this is an effort to help everyone else.

    And you really don't have to bring your religion into things to make a point...
  • It's not clear what you mean. OpenGIS is a standards organization, not a product. Perhaps you are referring to their Web Map Server specification, probably the first meaningful standard they've ever produced. It makes it fairly easy to make maps available on the web in a vendor-neutral way, so that they can be overlaid with maps from other independent servers on the fly.
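To give a flavor of what that spec standardizes: a GetMap request is just a URL with well-known query parameters. A hypothetical helper (the endpoint, function name, and defaults are invented for illustration; the parameter names are the ones the WMS spec defines):

```python
from urllib.parse import urlencode

def getmap_url(base, layers, bbox, size,
               srs="EPSG:4326", fmt="image/png", version="1.1.0"):
    """Build an OGC WMS GetMap request URL against a server at `base`.

    `layers` is a list of layer names, `bbox` is (minx, miny, maxx, maxy)
    in the given SRS, and `size` is (width, height) in pixels.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": version,
        "REQUEST": "GetMap",
        "LAYERS": ",".join(layers),
        "STYLES": "",                                # default styling
        "SRS": srs,
        "BBOX": ",".join(str(v) for v in bbox),      # minx,miny,maxx,maxy
        "WIDTH": size[0],
        "HEIGHT": size[1],
        "FORMAT": fmt,
    }
    return base + "?" + urlencode(params)
```

Because every compliant server answers the same URL shape, a client can fetch the same bounding box from two independent servers and overlay the images, which is the interoperability the parent comment describes.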
  • Also, the only access to the GIS was in their office, and they were the only ones allowed to use it. They had some gee-whiz demos, but nothing very useful. That was just my first exposure to GIS though, about 7 years ago.

    7 years ago? Yes, things have changed.

    I think the situation has improved since then, especially since many GIS products can use a web browser as the client. There are many opportunities for open source/free software to commoditize the GIS backend & use web browsers as the frontend.

    Sure! The problem is that there STILL isn't a really good GIS system for the internet. GIS is dynamic, not static. Half the solutions are based on server side scripting which requires a new download for every view change, addition of layer, etc. The client-based controls/plugins aren't that great either. Most of them just use the same mechanism that the server-side stuff uses. Painfully slow.

    The only solution that I have seen that has some REAL promise is a guy at Montgomery Watson who has created flash-based GIS systems. Data is dynamically downloaded. Just don't expect to see it on store shelves anytime soon. :(

  • Arc until about rev7 was great, but ESRI are progressively turning their product line into crap. They are also committed to tying their product line inexorably into COM/DCOM.

    There are far better people to be courting, people who are actually doing more than lip service to OpenGIS standards (AFAIK ArcIMS *still* doesn't do OGC-compliant queries, and Geography Network looks like their own duplicated "standard" for the same functionality).

    And let's not start on the Shape file format; that thing is an abomination compared to GML and should not become a de facto standard for spatial data formats.

    Xix.

    P.S. WTF was this posted over Easter when I was away from net access??? Will anyone actually read a /. article more than 12 hours old???
  • It is either a "crutch" or a method of enhancement. Depending on the level of material one is studying, I'd ask, "what's wrong with using a crutch?" Two girls dropped out of Solid State Physics last semester because (primarily) they were having trouble visualizing the crystal arrays well enough to express analytically the relationships necessary for solving homework problems. 3D arrays are not easy to put on a chalkboard. Fourier Transforms, while simple enough (or at least familiar enough) in 1D, were new to them in 2D, and 3D blew them away. Inverse transforms on optimized K-space surfaces? Huh? It would help to be able to see LOTS of examples in an interactive environment.

    Also consider that for every analytic definition or theorem there exists a geometrical interpretation. These geometric interpretations aren't proof elements, but they *are* comprehension elements. I would suggest that historically the geometric interpretations have led to the abstractions. To suggest that abstraction transcends geometry is true. It also transcends understanding.

    "The proof works".

    "What does it prove?"

    "I don't know, but every statement is a true statement and it procedes from our axioms to the collection of symbols we were shooting for!"

    "So it doesn't really mean anything?"

    "We trancended meaning last semester."
  • I would be remiss if I didn't at least mention my company's product World Construction Set [3dnature.com].
    Sounds like something you might have a use for. We're not Open Source (and we would be starving and the product wouldn't exist, if it were ;) but we try to avoid all the traps and tricks of proprietary software vendors that irritate us all. Sorry, no Linux version yet, just Mac and Windows.
  • Ten years ago I went looking at GIS systems for my employer, and the slogan you used to hear was "If you have to ask the price, you can't afford it." The reason, as many on /. have pointed out, is that the cost is the data, not the software.

    So how do you get a lot of data quickly and cheaply? It's another case for the cornucopia of the commons - distributed data collection.
    Just as Napster turned the net into the celestial jukebox, and SETI@home harvested the computer power of the net, how could we turn everyone into geographical data collectors?

    Two devices that could play a part : (Global or equivalent) positioning in mobile phones. And digital cameras.

    People are scared of being tracked by the position of their mobile, but anonymous tracking could be really useful. Think of the number of people who walk down the same streets. Drive down the same highways. Now imagine that the company who ran the mobile phones also recorded the trajectories of the phones, averaged them, and picked out the trends. What you'd end up with is a pretty accurate map of most of the heavily used roads and pedestrian walks within big cities, plus the major highways outside.

    Now imagine putting position recording into other devices. Digital cameras which position- and orientation-stamp every picture they take, and also, perhaps, make an estimate of their height when the picture is taken. When you upload your images to myPhoto.com, behind the scenes, the elevation of these points is being captured.

    The key here is anonymity. No one wants their devices tracking them personally. But the chips which do the data capture in the device can be openly designed, so any expert can check that they aren't storing a unique ID number, and signed, so that you can test that the chip in your device is legit (and not a government spook chip).

    phil
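    A toy sketch of the anonymous-aggregation idea above (grid size and coordinates invented): each fix is snapped to a grid cell with no device ID attached, and the heavily travelled cells trace out the roads.

```python
from collections import Counter

CELL = 0.001  # grid resolution in degrees, very roughly 100 m (assumption)

def cell(lat, lon):
    """Snap a position fix to a grid cell; no phone ID is kept."""
    return (round(lat / CELL), round(lon / CELL))

def usage_map(fixes):
    """Count anonymous fixes per cell. Cells crossed by many
    different trajectories end up with high counts, tracing out
    the heavily used roads and walkways."""
    return Counter(cell(lat, lon) for lat, lon in fixes)

# Two anonymous trajectories sharing one street corner:
a = [(45.5000, -122.6000), (45.5010, -122.6000)]
b = [(45.5000, -122.6000), (45.5000, -122.6010)]
counts = usage_map(a + b)
```

    Averaging over millions of such fixes is what would turn noise into a street map.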
  • HOWEVER, what OGC *is* working on, and in a fairly fractious and "open" manner, is standards for consistent handling of geospatial data. I can speak to some degree about datums, ellipsoids, geoids and coordinate transformations, and the infusion of new blood (Royal Dutch Shell have added a really talented geodesist) has been marvelous.

    At least in this subcommittee, the intent is a consistent handling of transforms, something missing in the industry save among academics and folks like the National Geodetic Survey, where doing it "right" is more important than having a new "tool" to do it differently.

    Further, the publication of GML-1 and the current efforts to make GML-2 a standard are significant events in facilitating datasharing. Real significant.

    No, they're not dedicated to developing open source software. But the idea is open data standards so data can be distributed amongst the various packages represented, and some of the governmental agencies represented DO have open source in mind for their apps.

    --gerry
  • Rumor has it that enough of us HAVE requested ArcView and ArcIMS for Linux that they are going to be in Beta sometime Q2 or Q3. I'll see what I can do to confirm this next week...
  • All this info about GIS is nice, but it seems to me to be a little high-end: it's not aimed at the needs of the average user.

    What I would like to see is somebody making an application like Delorme's AAA MapNGo or Microsoft's Streets and Trips available.

    However, those sorts of programs are difficult to do under the Free Software model. The code isn't hard (no harder than a browser or game engine) but the program is worthless without data. And, unlike a program which you can stay in your little room and write, data requires you to have detailed maps of street locations and interconnections, locations of attractions, hotels, restaurants, and gas stations. You cannot just sit in your room and hack those out.

    Now, if we could just persuade Delorme that the Free Software community is a good market...
  • I've been working in municipal GIS for several years. The very first thing every consultant told us, was also the first thing you hear from every organization that's run big GIS projects: the technology is easy; the data is hard.

    GIS just involves a lot of data - and a lot of problems cross-referencing separate but related datasets. In our case, it's the (say) water, sewer, parks, roads and legal maps (and 22 others) that are all drafted separately but must register on top of one another correctly to create synergy.

    I've gone from one elegant-but-buggy GIS product (VISION*) to just doing "GIS" by connecting my CAD maps (Microstation) to databases, to a second run at GIS with the Microsoft of the GIS world (ESRI). And they were right; the technology was never my big problem. Staying on top of the ever-mutating dataset is 98% of the work.

    I see GIS having big effects in the government and industrial world but almost none at the consumer level, because you have to be a big organization to manage enough *data* to need a GIS to understand it all.

    There will be consumer products and services - your PDA will show you the direction to the nearest Thai restaurant - but the only technology you'll need to get this from some remote GIS engine will be a browser.

    Oh, and PS, that bodes ill for Open Source GIS software outside the academic world, because big organizations have a positive fondness for the "Microsofts" of any industry they buy from, and an aversion to "unsupported" products. Hate to be the messenger, but it's just a corporate-culture fact of life.

  • Good hello.

    I'm the instigator of the planet-earth [planet-earth.org] project. I'm a mere interface designer myself, but we do have real hackers involved as well. We have a CVS running at sourceforge, and are starting to document our architecture. More info is, of course, at the website [planet-earth.org].

    The main insight driving our project is derived from the data-is-difficult problem noted in another thread. We distribute the data-collection task among all our users. This is, of course, the only approach known to scale. Everyone knows a little about their local area. Handheld GPS units are becoming widespread, which makes things even easier. The readings from these things are superseding survey data these days.

    Even if you don't know your lat/long, you can still contribute to the planet-earth database: it's an immersive 3D navigable representation, with landmarks, waypoints etc. - so just zoom around until you find your house, and attach some metadata to it - This is My House! or This is My Local Cafe and it has Very Good Coffee but lousy service. If there's no geometry there, then upload a VRML file, or choose from the library of available archetypes.

    You don't have to navigate it in 3D if you don't want to; data is separated from representation, in other words, we do XML.
    We allow multiple conflicting data and geometry at the same point in geospace/time, no problem - it's up to the user's filter set to decide which version to display, based on their preferences. Dynamic filter queries, good stuff.

    There's plenty more to this, of course. Ask me in this thread, or email (address at the website).

    V.


    [ hypermedia | virtual worlds | human interface | truth | beauty ]
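    A hypothetical sketch of the filter-set idea described above (all field names invented): conflicting annotations can coexist at one point in geospace, and each user's filters decide what is displayed.

```python
# Several users annotate the same point; nothing is discarded,
# and each viewer's filter set picks what to display.
annotations = [
    {"point": (51.5, -0.1), "author": "alice", "rating": 4,
     "text": "My Local Cafe - Very Good Coffee"},
    {"point": (51.5, -0.1), "author": "bob", "rating": 1,
     "text": "Lousy service"},
]

def visible(annotations, point, min_rating=0, authors=None):
    """Apply a user's filter set to the annotations at a point."""
    return [a for a in annotations
            if a["point"] == point
            and a["rating"] >= min_rating
            and (authors is None or a["author"] in authors)]

# One user filters out low-rated metadata; another sees everything.
shown = visible(annotations, (51.5, -0.1), min_rating=3)
```

    The database keeps every version; "which version to display" is purely a client-side preference.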
  • Hey there Doc.
    planet-earth [planet-earth.org] uses GeoVRML [geovrml.org] as the basis for our main data representation. See the thread around 4 below entitled Remember 'Earth' from SnowCrash? for more info; you may have to reduce your threshold to see it, it's currently rated (Score:2, Insightful).
    Note that there's an open source GeoVRML implementation, and planet-earth itself is GPL.
    [ hypermedia | virtual worlds | human interface | truth | beauty ]
  • We at eParka.com have been thinking about how to manage our in-house GIS code. Dunno if we will open-source our software or our proprietary databases (we have the entire country online in our own format). It is an interesting field to watch, but there is a lot of junk out there.

    -Moondog
  • When I first read the headline I thought it was about Geek in Space. oops.
  • I considered making GenesisII [geomantics.com] - our Landscape Visualization System - open source just over a year ago and posted about it to various newsgroups. I decided against it because there was very little response.

    Landscape Visualization has been the next big thing on and off for about 10 years now. I still think we're about 2 chip generations behind it being for real, plus a *lot* of work needs doing on basic 'real world' algorithms.

  • by GMOL (122258)
    Brought to you by the good folks at http://www.kitware.com/ - yes, it's open source.

    VTK has been used in a lot of data viz.

    I didn't get around to using it in my project because it *seems* (could just be my lack of experience) more difficult to visualize dynamic data with, and things like VAS are more suitable.
    But it's a really nice piece of software (wrapped in many languages, including Python!)
  • I cannot argue with any of your points about ESRI. I am in an awkward business position, though, and really would like them to "see the light" through a little bit of positive reinforcement. The ESRI staff is pretty unimaginative on Open Source, and maybe Jack just needs some more info. Thanks, G
  • I have tried for four years to get the GIS industry leader Environmental Systems Research Institute (ESRI) to port its products to Linux. They have expressed interest, but so far, have ported zero product. The problem is that they have not only leadership, but actual dominance in the industry. You can send email to jdangermond@esri.com. Jack Dangermond is the president of the company, and a reasonable man. He has stated in the past that product ports are driven by user demand. Please be polite, ask for the specific product, and refer him to this slashdot site. I think that if he sees that the userbase is considering jumping to Open Source, it might help his decision. Thanks Greg
  • Anyone remember the "Earth" application from Snowcrash (Neal Stephenson)? That seems to be the ideal point towards which all of this is evolving.. check out the Planet-Earth [planet-earth.org] project which is directly inspired by Snowcrash as well as H2G2 [h2g2.com] and Everything2. [everything2.com]

  • by tcc (140386)
    You're right, it's the next big thing - so big that when all the mp3s on the planet have been leeched, and all the existing porn seen, DEMster will come out and we'll be able to share DEMs and visualisation maps of the planet. I can hardly wait to cut off my neighbours' bandwidth to share these cool and huge files :)

  • this is for 3d representation of gis data
    http://www.ai.sri.com/geovrml/

    very cool!
    the http://www.parallelgraphics.com
    cortona vrml plugin has support for it too

  • Actually, the libraries for the ArcSDE (Spatial Database Engine) C API are available for RedHat. The client libraries also.

    ArcSDE sits on top of an RDBMS and adds spatial datatypes (and rules... unlike Oracle Spatial) to your data.

    But I agree, it would be nice to get more.
  • What do you mean by "owns the default standard?"

    The ESRI shapefile technical description is available here [esri.com].

    That should tell you everything you need...

  • And geotools, which does the same sort of thing and is GPL: http://geotools.sourceforge.net/
  • Free GIS URLs
    http://freegis.org/
    http://opensourcegis.org/
    http://www.opengis.org/

    Interoperable Webmapping
    80% of the cost of mapping projects is finding basemaps and integrating with them. There are huge amounts of GIS data around, and being able to store current versions of all the GIS data you require is often unobtainable.
    Consequently the Open GIS Consortium are developing an architecture where data custodians look after the data and serve it to the web; application builders then access these data layers (from multiple servers), merge them, and present them to a user.
    This is all being made possible by the development of standards for a series of Network Addressable Services.
    Cameron Shorter - Web Mapping Manager
    Social Change Online http://webmap.socialchange.net.au
  • I currently use GIS data to build 3d trail maps, "fly-by" style video, and expansion visualization for ski resorts. There is a lot that can be accomplished with geo data, but I don't necessarily think it'll change the way we all live.
  • Mapserver, written by Steve Lime, Minnesota DNR
    http://mapserver.gis.umn.edu/ [umn.edu]

    Excellent program!
  • There is one really great open source solution to doing GIS on the web: mapserver, written by Steve Lime, http://mapserver.gis.umn.edu [umn.edu]. It uses Freetype and proj, and has a Perl module, mapscript. I have used it for small projects and it works very well - much, much better than ESRI's ArcIMS. Mapserver is much faster at drawing maps and easier for the end user, because you can configure custom interfaces.

    That all being said, is GIS the future? For the web, I don't think it is. Maybe the distant future. I mean, the concept is good. Put in your address, get a map, turn on layers of data to see the things you want to see. But when you actually try doing something like this you find out it is really hard. I work with an organization which has some GIS data. The data is not real clean - getting good clean data with well-built shape files is hard. Most of the data they have is point-source data - really ideally suited to having some static maps drawn up. Instead, this coming Tuesday I have a meeting with my boss and others because there is a huge push in my company to do a full-blown web/GIS front-end for every bit of data in the house. It's nuts!

    ...But I tell you, nothing sells to suits better than those dopey zoomable layered maps. I can show her simple text query forms that return text data and a map at the end all day long; she couldn't care in the least. But at an ESRI demo her eyes lit right up! She has no concept of the idea that we as a company are years and million$ away from having enough good, correct data to do GIS right. Getting good data is very hard. Storing, organizing, moving around, and making available to employees, good data is also very hard. GIS chews up megabytes of storage almost as well as video production! :-)

    Does the public want some graphic map they have to figure out, click on multiple times and zoom around in just to find out there are three toxic waste dumps within ten miles of their house? Really, I think the public wants a list. A list is easy to understand, it is not critically dependent on having a perfect set of UTMs and perfect GIS data and it doesn't use multi-hundred meg shape files so queries that produce lists seem much faster.

    Who knows, maybe someday we will all have really big pipes coming into our houses and thanks to Mr. Moore, servers will be even faster and maybe GIS will be a practical front end to a data heavy web site. I just don't think that time has come yet - and won't for a while.
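    For anyone curious what configuring Mapserver looks like: layers and rendering are described in a plain-text "mapfile". This fragment is a rough, hypothetical sketch (layer names, extent and file paths invented; check the Mapserver documentation for the exact directives your version supports):

```
MAP
  NAME "demo"
  SIZE 400 300
  EXTENT -97.5 41.6 -82.3 49.4   # map bounds, in the layers' units

  LAYER
    NAME "lakes"
    TYPE POLYGON
    STATUS DEFAULT
    DATA "lakes.shp"             # an ESRI shapefile
    CLASS
      COLOR 0 0 255
    END
  END
END
```

    Roughly speaking, the CGI program reads a file like this on each request, draws the requested extent, and returns an image - which is why a "custom interface" is mostly just an HTML form wrapped around it.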

  • Mark, I was wondering if someone would call me on my relatively imprecise language. You are - of course - correct that opengis.org is a standards-setting organization rather than an organization dedicated to open source. My point is that the corps behind it have been slow in implementing their own specs, and that the open source community can do an end-around and implement them more fully than the participating organizations. Also, I think that it is a bit short-sighted to believe that opengis.org is some sort of monstrous MS-in-waiting. In fact, they open up specs for public comment, etc.
  • Opengis.org [opengis.org] is the best organizational reference for open GIS standards. They have an international consortium of business and government agencies behind them. They have been around since 1993 and have developed several standards for developing a true open framework for GIS delivery. In fact, GIS is one of the rare applications that demands a very open approach since having geographic data is only useful as it relates to other geographic data.

    Opengis.org has done a good job of speccing out systems that are truly interoperable because they achieve GIS nirvana: separating content from visualization. Reading GIS content from multiple servers and displaying it through a single user interface is the heart of open GIS. Amazingly, no major commercial vendors (ESRI, Bentley) are aggressively pursuing this vision. IMHO, this is an opportunity for the open source community to make a mark on a major emerging industry! If you are interested in working on developing an open source version of the server spec that Opengis has released, please contact me!

  • Every ground pattern must be further encapsulated into a map entry, probably stored either in a hashtable or a DB, once the load becomes too great for a normal memory mapping to map effectively to the physical display. This is a key stumbling block for GIS, and GIS won't be able to go mainstream until it's overcome - people don't want to see slow, chunky terrain in polygons.
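    One common way to make that concrete is to key terrain patches by level of detail and quantized position, so only the patches near the viewer ever reach memory. A hypothetical sketch:

```python
terrain_cache = {}  # (lod, row, col) -> terrain patch

def patch_key(lat, lon, lod):
    """Key a ground patch by level of detail and quantized position.
    Higher lod = finer patches; patch spans halve with each level."""
    span = 180.0 / (2 ** lod)           # patch size in degrees
    return (lod, int((lat + 90) // span), int((lon + 180) // span))

def fetch_patch(lat, lon, lod, load):
    """Return the patch covering (lat, lon), loading it on a miss."""
    key = patch_key(lat, lon, lod)
    if key not in terrain_cache:
        terrain_cache[key] = load(key)  # e.g. read polygons from a DB
    return terrain_cache[key]
```

    With a scheme like this the renderer only pays for the patches in view, at the detail level the viewpoint demands.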
  • Hey Andy,

    I'm working on this project too (albeit slowly), and the primary problem with 99% of GIS software is that it's only good for visualization, not print media. GRASStep will support WYSIWYG printing with small print files.

    There is also talk on the Quesa project about creating a PDF renderer plug-in which would also serve this purpose for 3D stuff.
    ---
    >80 column hard wrapped e-mail is not a sign of intelligent
  • i wish this article came out *before* i wrote my paper on evolved habitat preferences and virtual environments. . .
  • Start Here.
  • The basic TIGER files that are used for street mapping are free from the census bureau. Unfortunately, they are not as up to date or as detailed as those from the value-added companies that invest in improving the maps, and then sell them. Originally, the TIGER maps were created in a top-down manner by the Feds, and updated every ten years for the census. We are moving towards a bottom-up compilation process, where all construction projects would be submitted to county government in a CAD/GIS format, and the change would then percolate up and quickly be released to the public. So Free(tm) personal road map software is possible, and will be more competitive with the proprietary products. It will still be monster sized, especially if images are included, but look for new and improved soon.
  • Why give props to MS for their flightsim, when there is a great open flightsim [flightgear.org] available? Flightgear generates its scenery from terrain maps generated by the U. S. Geological Survey.
  • by Tappah (224124) on Friday April 13, 2001 @05:30PM (#292996)
    Over the past several years, I can't count the number of times I've seen people gush on and on about GIS and all the wonderful things it can do. I've watched dozens of Companies and Cities sink millions into attempts to develop functional GIS systems. In fact, most failed miserably. Although in some limited cases GIS can be a workable system, in many cases, GIS is simply a scam touted by consultants, and the computer-ignorant. The most frequent victims of this scam, have been City and County governments.

    The reason, is simple.

    Most beginning projects focus on the purchase of GIS software, like ESRI's ArcInfo, the purchase of sexy machines to run it, and perhaps some initial staffing to set it up. They almost uniformly massively underestimate the amount of time and money required to actually get a functional landbase into the system, attach data to it, and ensure the necessary level of accuracy of the basemaps.

    "It'll be easy, we have DB2 running on an as/400 and lots of maps" - should qualify as one of the most expensive single instances of "famous last words" in computer history.

    In reality, very few maps are accurately digitized. Most GIS systems get sold on the premise of scanning and vectorizing dead-tree bluelines, or other formats - from paper. The problem is that geographic maps are finicky about scale, and the maps you *thought* were accurate inevitably need to be rectified to whatever coordinate system you chose for the system. It's a tedious and enormously time-consuming process that simply doesn't lend itself to any sort of automatic processing. Then there's the problem that most maps simply aren't accurate to begin with, and you see the beginnings of a problem. "I just bought half a million dollars of GIS software and equipment, and now I discover it's going to cost me three or four million to get my maps in shape to actually be useful".

    Picture your average City government. Municipalities are where government really is. About 90% of the civil services provided to you by the sum of all government influence on you comes from the local level. Cities have huge quantities of maps for things like building plats, subdivision maps, building blueprints, deed records, thoroughfare maps, land-use plans, etc etc etc. And that, coincidentally, is where GIS appears to be the most useful. Managing all that data is hard for cities - and GIS certainly looks easy in the presentations companies and consultants give - just point and click! What could be easier?

    But the numbers of abandoned systems started by cities who bought into these sales pitches are huge - on the order of billions of dollars worth.

    GIS companies and consultants know this - yet never warn purchasers, that in most cases, the cost of the software and systems amounts to only a tiny fraction of the actual costs of developing a working GIS.

    So when you hear all the gushing success stories, and gee-whiz ideas presented as though GIS was a wonder cure for all sorts of problems, try to bear in mind - GIS works best with single-source maps, and data which can easily be applied to pre-existing points. If you hear the words "scan" or "address-range" - you'll know a bullshit artist is at work. Because for every working system involving simple maps with pre-rectified sources, there are probably 15 that were simply abandoned because of enormous unexpected costs and (salesman provoked) unrealistic expectations. A lot of money has been pissed away on GIS over the years. Real computer professionals know the golden rule - flashy graphics doesn't equal useful purpose.
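    The rectification step described above usually comes down to a least-squares fit through hand-picked control points; the solve itself is trivial, which is exactly why the expensive part is picking and verifying the points. A minimal sketch with a six-parameter affine transform (all coordinates invented):

```python
import numpy as np

def fit_affine(pixel_pts, ground_pts):
    """Least-squares fit of ground = A @ pixel + t from control
    points matched by hand between the scan and the survey."""
    px = np.asarray(pixel_pts, float)
    gd = np.asarray(ground_pts, float)
    # Design matrix rows [x, y, 1] solve E = a*x + b*y + c and
    # N = d*x + e*y + f in a single lstsq call.
    A = np.c_[px, np.ones(len(px))]
    coeffs, *_ = np.linalg.lstsq(A, gd, rcond=None)
    return coeffs  # 3x2: columns are (easting, northing) params

def apply_affine(coeffs, pixel_pts):
    """Map scanned pixel coordinates into ground coordinates."""
    px = np.asarray(pixel_pts, float)
    return np.c_[px, np.ones(len(px))] @ coeffs

# Three or more control points with known ground coordinates:
pixels = [(0, 0), (1000, 0), (0, 800), (1000, 800)]
ground = [(500000, 4500000), (501000, 4500000),
          (500000, 4499200), (501000, 4499200)]
coeffs = fit_affine(pixels, ground)
```

    The residuals of such a fit are also how you discover the map "you *thought* was accurate" isn't.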

  • I have to respectfully disagree... it already changes the way we live, although most folks don't know it. I work for a firm that specializes in aerial mapping (Digital Terrain Models, Planimetric Data Maps, and Digital Orthophotos) and most of our clients are state DOTs (departments of transportation) and city planners. The type of GIS data we deal in is used for things like road planning, city growth tracking, landfill tracking, microwave transmission evaluation... etc. Almost any urban-development-related subject has ties to the GIS field. Now if we could just find some surveyors who could read a GPS unit correctly 3 out of 4 times, we'd really have something... :)
  • It's the link placings that really does it I think- over the word "huge" for example :)
  • VTK has some nice wrappers for various languages.
    I spent some time playing with the TCL/TK side of it, and it was great for quickly trying out ideas.
    Another package that comes to mind is VIS5D.
    See Somewhere here [kachinatech.com]
  • by antarctican (301636) on Friday April 13, 2001 @02:54PM (#293000) Homepage
    This might be slightly off exactly what's being asked for, but one data visualization product that's now on the market is called VisualNet (http://antarcti.ca). [antarcti.ca] It's designed to help users navigate large stores of information in a more meaningful manner.

    The other cool thing about it is that the protocols and data formats it uses to transfer information in both its 2D and 3D modes are publicly available. The idea is that, in the same way Palms are popular because anyone can develop for them, anyone can develop for this product.

    In fact, the 3D client is Open Source!

    I hope this might be some where close to the kind of tool you're looking for.


    antarctican at trams dot ca

  • Check out GRASStep [mac.com], a new project to bring GRASS GIS to Mac OS X and GNUStep [gnustep.org]. My motivation behind this project is to get out from under ESRI's bloatware thumb (mind you, I have just about everything they put out at academic price: $600 instead of closer to $30k+ for the sucker on the street).

    I also have some slick cartograms [unl.edu], which make even boring economic/demographic data seem cool.

    andy a.

  • WTF is wrong with the mods these days? For crying out loud...

    Ok now, listen. Use some Christian Humility and consider the notion that perhaps you're the one who's missing something. Ever since Archimedes, Leonardo, Newton, and Einstein, scientists and techies have been using pictures to understand and communicate ideas and further their thinking. Whether they do it on parchment or on a computer is irrelevant. Your "problem" does not exist: pictures and imagination go hand in hand.

    I think you are confusing mysticism with an altogether different brand of thinking.

  • Please don't use direct links into the VTP site!

    Use the proper page [vterrain.org] to request a VTP distribution.

    Thanks,
    Ben Discoe
    Project Manager, VTP
  • Actually, Yes. The fossil fuel industry is the biggest proponent and developer of underground terrain visualization. They have huge, multi-million dollar visualization facilities dedicated to interactive 3d rendering of subterranean geological features.

    If Dubya has half a clue, he should be just as enthusiastic as Gore about virtual terrain.

    -Ben
  • DEMster? Yes, well, here are the possibilities: Global Data Sharing/Referencing [vterrain.org]

    Actually Napster would be a great analogy to follow; most of the terrain data in the world is proprietary and expensive, tightly controlled by the governments and big companies that produce it. Not that I'm suggesting anything, but P2P would be an interesting way around that situation. The data only has to leak once for the genie to get out of the bottle..

    -Ben

  • Even though your average chemical engineer can "visualize" the structure of a molecule from its shell valences and member atoms, having 3D haptic control of molecule "docking" is way cooler (and more useful in the long run, by allowing for "intuition"). The nifty thing about this kind of project is that it gives cross-disciplinary access to a whole metric butt-ton of data. I am envisioning something along the lines of the "EARTH" service in Neal Stephenson's Snow Crash, where you can have as much detail, or as little, as you want, and slice multi-dimensional slabs of realtime data (like the price of pimentos, the weather, and the GNP of Zimbabwe).
  • It seems to me that people are missing something here. 'Visualisation' is all very well and good, but the problem with advanced graphics technologies to enable data visualisation is that it kills imagination.

    When Einstein developed his theories of relativity, he did so by using his mind to visualise data.

    By doing this he was forced to understand the data. The difficulty was the aid.

    Although it is difficult for many of us to see things, such as The Lord, the fact is that after some proper imagination and hard work, anything is possible.

    The problem with society today is that we expect things to be too easy. Things should be hard. When Jesus went into the wilderness, he did it without oxygen tanks, PDAs, GPS devices, or thermos flasks. He wore only a robe.

    We should have a similar attitude. We may have to bear the pricks and cuts of hardship in doing so, but it is better to wear a Crown of Thorns than one of gold.

  • You're going to have to try harder than that.
  • We definitely need to learn to be better thinkers. A lack of imagination is always filled with a surplus of visualization. Visualization is a crutch for analytical thought. The brain atrophies and becomes dependent on visualization. Look at how many Americans watch TV every day. We're bred to be mindless drones by corporate society. When was the last time you came up with a joke that wasn't off TV, or said something that wasn't based on some visual imagery you saw in a movie? Abstract, original thoughts have been replaced with media imagery.

    Of course, what I said above isn't original at all. Huxley, Orwell, Chomsky, adbusters.com, etc. will all regurgitate the same thing. But at least in the days before TV and computers, people had to form their own visualization of what they read in books and heard on the radio...


    You finding Ling-Ling's head?
    Someone come into yard, kill dog.
  • I've spent the last year developing a high volume web-based map server using Open Source tools. Note that the emphasis of this project has been vector based data (streets, rivers, shorelines, etc.), not image or raster data, so this skews my views somewhat. Sorry for the length of this post, but this really only scratches the surface of this topic.

    The two best free tools I found for manipulating map data and producing maps are GRASS (www.geog.uni-hannover.de/grass/ [uni-hannover.de]) and OpenMap (openmap.bbn.com [bbn.com]). GRASS certainly wins hands-down for its ability to read various file formats (including ESRI Shape Files and E00 Files), but its interface is somewhat ... odd ... and I've found it very buggy when dealing with vector products. OpenMap is a very nice Java application and library that can do some very slick graphics and handle many different projections. However, because it is written in Java, its ability to scale to the level that I needed (random-access street-level maps produced in several seconds) is practically non-existent. Nevertheless, if you are looking for a "higher level" of mapping tool, OpenMap is probably the tool you are looking for.

    I've also looked (somewhat superficially) at the major commercial mapping programs, produced by ESRI (www.esri.com [esri.com]) and MapInfo (www.mapinfo.com [mapinfo.com]). At prices starting at around $25,000 and rapidly going up, you'll certainly need a lot of money to get into this game.

    On the data side, there's a lot of data available on the net, some of it very good, and some not so good. Finding it is tricky, but it can be done. The "Digital Chart of the World" (DCW) is available from (HREF) and provides vector outlines for all the countries in the world (circa the early '90s). Its North American utility is somewhat limited, as the lat/lon points used in the vector outlines are based on NAD27, rather than the more popular NAD83 datum. The TIGER Line Files (HREF) are an excellent source of street-level data (and state and county outlines, and much more) for the United States and various territories. Once the format of this data is understood, it's fairly easy to convert the data to a more usable format. Unfortunately, there doesn't appear to be much in the way of tools out there on the net for working with this data.
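    The NAD27/NAD83 mismatch the post mentions is fixable in code. As a sketch (using pyproj, a modern library rather than anything the poster used; EPSG:4267 is NAD27 and EPSG:4269 is NAD83), DCW-style coordinates could be shifted onto the newer datum like this:

    ```python
    # Hedged sketch: shift (lon, lat) pairs from NAD27 to NAD83 with pyproj.
    # Assumes pyproj is installed; datum-grid accuracy depends on local setup.
    from pyproj import Transformer

    # always_xy=True keeps the (lon, lat) ordering common in vector files
    nad27_to_nad83 = Transformer.from_crs("EPSG:4267", "EPSG:4269",
                                          always_xy=True)

    def reproject(points):
        """Convert a list of (lon, lat) NAD27 pairs to NAD83."""
        return [nad27_to_nad83.transform(lon, lat) for lon, lat in points]
    ```

    In North America the NAD27-to-NAD83 shift is on the order of tens of metres, so the numbers change only in the third or fourth decimal place of a degree, but that is enough to misalign street-level data.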

    There's very little free street-level data available outside the United States. This is an area crying out for an Open database, as the non-free data sources are really expensive and generally involve nasty royalties.

    I have been working on the Onamap.com project for the last year. The primary purpose of Onamap is to provide a "where is it" tool for the Internet in which anyone can enter location data, commentary, etc. The components of this project are:
    - a common text-based file format for vector data;
    - tools for converting TIGER, ESRI Shape Files, and DCW to this common format;
    - the location web server, written using Apache, mod_python, and MySQL; and
    - a high volume map server, written using Apache, jserv, Java, and C++.
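    To make the first component concrete: the real Onamap format is not public, so the layout below (one feature per block: a NAME line, lon/lat pairs, then END) is invented purely for illustration, but it shows the kind of simple text format that conversion tools for TIGER, Shape Files, and DCW could all target.

    ```python
    # Hypothetical sketch of a "common text-based file format" for vector
    # data -- this block layout is an assumption, not the actual Onamap format.

    def parse_vector_file(text):
        """Parse blocks like:
             NAME Lake Ontario
             -79.50 43.60
             -79.00 43.55
             END
           into a dict mapping feature name -> list of (lon, lat) tuples."""
        features, name, coords = {}, None, []
        for line in text.splitlines():
            line = line.strip()
            if line.startswith("NAME "):
                name, coords = line[5:], []
            elif line == "END" and name is not None:
                features[name] = coords
                name = None
            elif line and name is not None:
                lon, lat = map(float, line.split())
                coords.append((lon, lat))
        return features
    ```

    A plain-text intermediate like this trades file size for the ability to inspect, diff, and hand-edit the data, which matters when several converters with different quirks all feed the same renderer.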

    The map server is fairly new technology, and the components written in Java (mainly the rendering engine utilizing Java2D) need to be rewritten in something more efficient. If you want to see samples of the maps that this engine can produce, go to http://www.onamap.com/sample [onamap.com].

    The plan is to release this software to the public (down to the source level) in the next few months, after the code is cleaned up, debugged and documented. If anyone is interested in this project, please feel free to mail me at dpjanes@sympatico.ca [mailto].

  • Take a look at www.geographynetwork.com. ESRI (I like to call them the Microsoft of GIS) is trying to be the Napster of GIS. Of course you have to buy their software, but they are putting a lot of stock in the P2P model. And they are thinking one step further: why bother downloading the stuff? Just use it as is, right off the web. I think the fact that a thread on spatial data is on Slashdot just goes to show how ready geospatial data is to explode. I don't know about visualization, but maps will become pervasive in all types of applications, and not just at realtor.com, mapblast.com, or your county assessor's web page, but anywhere the location of your data is valuable information.
  • But I tell you, nothing sells to suits better than those dopey zoomable, layered maps.

    Isn't that a lot of what this industry is about? Why did we go from mainframe to Windows? So we could point and click and scroll and look at nice graphics. I'm not so sure the public really wants a list. The challenge will be in designing a map-centric site that is intuitively navigable and provides all the information needed. Oh yeah, and they will want to add their own data to it without knowing how: just push a button and voilà, instant map (I guess that's where web agents will come in). Even though a list would give them the same information.
