The GIS software and analysis tools that an individual, group or corporate body chooses to use will depend very much on the purposes to which they will be put. There is an enormous difference between the requirements of academic researchers and educators, and those with responsibility for planning and delivery of emergency control systems or large scale physical infrastructure projects. The spectrum of products that may be described as a GIS includes (amongst others):
•	highly specialized, sector-specific packages: for example civil engineering design and costing systems; satellite image processing systems; and utility infrastructure management systems
•	transportation and logistics management systems
•	civil and military control room systems
•	systems for visualizing the built environment for architectural purposes, for public consultation or as part of simulated environments for interactive gaming
•	land registration systems
•	census data management systems
•	commercial location services and Digital Earth models
The list of software functions and applications is long and in some instances suppliers would not describe their offerings as a GIS. In many cases such systems fulfill specific operational needs, solving a well-defined subset of spatial problems and providing mapped output as an incidental but essential part of their operation. Many of the capabilities may be found in generic GIS products. In other instances a specialized package may utilize a GIS engine for the display and in some cases processing of spatial data (directly, or indirectly through interfacing or file input/output mechanisms). For this reason, and in order to draw a boundary around the present work, reference to application-specific GIS will be limited.
A number of GIS packages and related toolsets have particularly strong facilities for processing and analyzing binary, grayscale and color images. They may have been designed originally for the processing of remotely sensed data from satellite and aerial surveys, but many have developed into much more sophisticated and complete GIS tools, e.g. Clark Labs’ Idrisi software; MicroImages’ TNTmips product set; the ERDAS suite of products; and ENVI with associated packages such as RiverTools. Alternatively, image handling may have been deliberately included within the original design parameters for a generic GIS package (e.g. Manifold), or simply be toolsets for image processing that may be combined with mapping tools (e.g. the MATLAB Image Processing Toolbox). Whatever their origins, a central purpose of such tools has been the capture, manipulation and interpretation of image data, rather than spatial analysis per se, although the latter inevitably follows from the former.
In this Guide we do not provide a separate chapter on image processing, despite its considerable importance in GIS, focusing instead on those areas where image processing tools and concepts are applied for spatial analysis (e.g. surface analysis). We have adopted a similar position with respect to other forms of data capture, such as field and geodetic survey systems and data cleansing software — although these incorporate analytical tools, their primary function remains the recording and georeferencing of datasets, rather than the analysis of such datasets once stored.
For most GIS professionals, spatial analysis and associated modeling are infrequent activities. Even for those whose job focuses on analysis, the range of techniques employed tends to be quite narrow and application focused. GIS consultants, researchers and academics, on the other hand, are continually exploring and developing analytical techniques. For the first group and for consultants, especially in commercial environments, the imperatives of financial considerations, timeliness and corporate policy loom large, directing attention to: delivery of solutions within well-defined time and cost parameters; working within commercial constraints on the cost and availability of software, datasets and staffing; ensuring that solutions are fit for purpose, meet client and end-user expectations and agreed standards; and in some cases, meeting “political” expectations.
For the second group of users it is common to make use of a variety of tools, data and programming facilities developed in the academic sphere. Increasingly these make use of non-commercial wide-ranging spatial analysis software libraries, such as the R-Spatial project (in “R”); PySal (in “Python”); and Splancs (in “S”).
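The kind of computation these libraries automate can be illustrated with a short, self-contained sketch. The example below (plain Python, with hypothetical sample data) computes Moran’s I, a standard measure of spatial autocorrelation, for values on a small grid using rook-contiguity weights; PySal and R-Spatial provide optimized and far more general implementations of this and many related statistics.

```python
# Moran's I for values on a small grid, rook (edge-sharing) contiguity.
# Illustrative sketch only: PySal's esda.Moran offers a tested, general version.

def rook_neighbors(rows, cols):
    """Binary rook-contiguity weights as a dict: cell -> list of neighbors."""
    w = {}
    for r in range(rows):
        for c in range(cols):
            nbrs = []
            if r > 0:        nbrs.append((r - 1, c))
            if r < rows - 1: nbrs.append((r + 1, c))
            if c > 0:        nbrs.append((r, c - 1))
            if c < cols - 1: nbrs.append((r, c + 1))
            w[(r, c)] = nbrs
    return w

def morans_i(values, rows, cols):
    """values: dict mapping (row, col) -> attribute value."""
    w = rook_neighbors(rows, cols)
    n = rows * cols
    mean = sum(values.values()) / n
    s0 = sum(len(nbrs) for nbrs in w.values())          # sum of all weights
    num = sum((values[i] - mean) * (values[j] - mean)
              for i, nbrs in w.items() for j in nbrs)   # cross-products
    den = sum((v - mean) ** 2 for v in values.values())
    return (n / s0) * num / den

# A strongly clustered (hypothetical) surface: value rises with column index
grid = {(r, c): float(c) for r in range(3) for c in range(3)}
print(round(morans_i(grid, 3, 3), 3))  # → 0.5, positive spatial autocorrelation
```

Positive values of I indicate that similar values cluster in space; a random arrangement gives an expected value close to −1/(n−1).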
Sample software products
The principal products covered in this latest edition of the Guide are listed on the accompanying website’s software page. Many of these products are free whilst others are available (at least in some form) for a small fee for all or selected groups of users. Others are licensed at varying per-user prices, from a few hundred to over a thousand US dollars per user. Our tests and examples have largely been carried out using desktop/Windows versions of these software products. Different versions that support Unix-based operating systems and more sophisticated back-end database engines have not been utilized. In the context of this Guide we do not believe these selections affect our discussions in any substantial manner, although such issues may have performance and systems architecture implications that are extremely important for many users. OGC compliant software products are listed on the OGC resources web page: http://www.opengeospatial.org/resource/products/compliant. To quote from the OGC: “The OGC Compliance Testing Program provides a formal process for testing compliance of products that implement OpenGIS® Standards. Compliance Testing determines that a specific product implementation of a particular OpenGIS® Standard complies with all mandatory elements as specified in the standard and that these elements operate as described in the standard.”
Suppliers should be able to provide advice on performance issues (e.g. see the ESRI web site, "Services" area for relevant documents relating to their products) and in some cases such information is provided within product Help files (e.g. see the Performance Tips section within the Manifold GIS help file). Some analytical tasks are very processor- and memory-hungry, particularly as the number of elements involved increases. For example, vector overlay and buffering is relatively fast with a few objects and layers, but slows appreciably as the number of elements involved increases. This increase is generally at least linear with the number of layers and features, but for some problems grows in a highly non-linear (i.e. geometric) manner. Many optimization tasks, such as optimal routing through networks or trip distribution modeling, are known to be extremely hard or impossible to solve optimally and methods to achieve a best solution with a large dataset can take a considerable time to run (see Algorithms and computational complexity theory for a fuller discussion of this topic). Similar problems exist with the processing and display of raster files, especially large images or sets of images. Geocomputational methods, some of which are beginning to appear within GIS packages and related toolsets, are almost by definition computationally intensive. This certainly applies to large-scale (Monte Carlo) simulation models, cellular automata and agent-based models and some raster-based optimization techniques, especially where modeling extends into the time domain.
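The non-linear growth described above can be made concrete with a toy example: a naive vector intersection pass must test every feature against every other, so the number of geometric tests grows quadratically with the feature count. The sketch below (plain Python, hypothetical random segments) counts brute-force bounding-box comparisons; production GIS engines use spatial indexes such as R-trees or quadtrees precisely to avoid this quadratic cost.

```python
# Brute-force pairwise testing: the comparison count grows as n*(n-1)/2,
# which is why overlay operations slow sharply as feature counts rise.
import random

def bbox(seg):
    """Axis-aligned bounding box of a segment ((x1, y1), (x2, y2))."""
    (x1, y1), (x2, y2) = seg
    return (min(x1, x2), min(y1, y2), max(x1, x2), max(y1, y2))

def boxes_overlap(a, b):
    return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

def brute_force_comparisons(segments):
    """Count pairwise bounding-box tests, and how many pass the filter."""
    tests = hits = 0
    for i in range(len(segments)):
        for j in range(i + 1, len(segments)):
            tests += 1
            if boxes_overlap(bbox(segments[i]), bbox(segments[j])):
                hits += 1
    return tests, hits

random.seed(42)  # hypothetical data: 200 random segments in the unit square
segs = [((random.random(), random.random()),
         (random.random(), random.random())) for _ in range(200)]
tests, hits = brute_force_comparisons(segs)
print(tests)  # 200 * 199 / 2 = 19900 pairwise tests for just 200 features
```

Doubling the number of features roughly quadruples the work, which is the “geometric” growth referred to above.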
A frequent criticism of GIS software is that it is over-complicated, resource-hungry and requires specialist expertise to understand and use. Such criticisms are often valid and for many problems it may prove simpler, faster and more transparent to utilize specialized tools for the analytical work and draw on the strengths of GIS in data management and mapping to provide input/output and visualization functionality. Example approaches include: (i) using high-level programming facilities within a GIS (e.g. macros, scripts, VBA, Python) — many add-ins are developed in this way; (ii) using wide-ranging programmable spatial analysis software libraries and toolsets that incorporate GIS file reading, writing and display, such as the R-Spatial and PySal projects noted earlier; (iii) using general purpose data processing toolsets (e.g. MATLAB, Excel, Python’s Matplotlib, Numeric Python (NumPy) and other libraries from Enthought); or (iv) directly utilizing mainstream programming languages (e.g. Java, C++). The advantage of these approaches is control and transparency; the disadvantages are that software development is never trivial, is often subject to frustrating and unforeseen delays and errors, and generally requires ongoing maintenance. In some instances analytical applications may be well-suited to parallel or grid-enabled processing — as for example is the case with GWR (see Harris et al., 2006).
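As a small illustration of approach (iv), the sketch below implements the standard ray-casting (even-odd) point-in-polygon test directly in Python — the kind of well-understood analytical primitive that can be simpler to code and inspect than to configure inside a full GIS. The polygon and test coordinates are hypothetical.

```python
def point_in_polygon(x, y, polygon):
    """Even-odd ray-casting test; polygon is a list of (x, y) vertices."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does a horizontal ray cast rightward from (x, y) cross this edge?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside  # each crossing flips inside/outside
    return inside

# Hypothetical polygon: the unit square, listed counter-clockwise
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(point_in_polygon(0.5, 0.5, square))  # True  (inside)
print(point_in_polygon(1.5, 0.5, square))  # False (outside)
```

The transparency point made above applies directly: every geometric decision the routine makes is visible and testable, at the cost of having to handle edge cases (vertices on the ray, holes, geodesic coordinates) that a mature GIS engine already covers.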
At present there are no standardized tests for the quality, speed and accuracy of GIS procedures. It remains the buyer’s and user’s responsibility and duty to evaluate the software they wish to use for the specific task at hand, and by systematic controlled tests or by other means establish that the product and facility within that product they choose to use is truly fit for purpose — caveat emptor! Details of how to obtain these products are provided on the software page of the website that accompanies this book. The list maintained on Wikipedia is also a useful source of information and links, although it is far from complete or independent. A number of trade magazines and websites (such as Geoplace and Geocommunity) provide ad hoc reviews of GIS software offerings, especially new releases, although coverage of analytical functionality may be limited.