The Obama administration named two information technology developers and one IT company on Thursday to receive the nation's highest honor for inventors.

The administration will award the National Medal of Technology and Innovation to the winners -- five total -- during a White House ceremony on Oct. 7. Traditionally, the White House does not announce the winners' contributions until close to the ceremony. But IBM, which will receive a medal, says it is being honored for developing its Blue Gene line of supercomputers, which typically rank among the fastest and most powerful computers in the world. Two inventors, John Warnock and Charles Geschke, also will be honored for founding the software firm Adobe Systems. Both IT companies have made their mark with tools that the federal government increasingly depends on for collaboration across agencies.

Federal information technology executives view the awards as part of Obama's innovation agenda, an effort to use technology to improve government performance. "I see a lot coming on the innovation agenda. The first sign of that is the appointment of a chief technology officer and a governmentwide chief information officer," said Alan Chvotkin, executive vice president and counsel at the Professional Services Council, a contractor industry group. Federal Chief Information Officer Vivek Kundra and federal Chief Technology Officer Aneesh Chopra are filling roles that had not previously existed at the White House level.

"Of course, Vivek is speaking everywhere to everybody," he added, referring to the CIO's numerous appearances at conferences and on the White House Web site to promote collaborative Web applications aimed at transforming the way the government operates. The tools include Data.gov, a depot of federal statistics that allows third-parties to download and manipulate information to meet customers' needs, and Apps.gov, an online storefront where agencies can purchase subscriptions to Web services or cloud computing applications.

But many of the initiatives have yet to trickle down to the agency level, Chvotkin noted. For example, guidance on President Obama's plan to shift agency IT environments to the Web, or the cloud, will come in the fiscal 2011 budget.

Unlike Kundra, Chopra has yet to publicize his work. "He doesn't say as much about what he's doing out there publicly, but I know he's been actively involved in the internal discussions," Chvotkin said. "If you ask me three things that he's done, I couldn't tell you."

The 2009 technology medal winners have been instrumental in the effort to change how government does business, according to the honorees. IBM's speedy computers model scenarios that help protect the nation's nuclear arsenals and predict climate trends. The company says the Blue Gene systems also are eco-friendly because they perform at processing speeds that otherwise would require a dedicated power plant large enough to supply electricity to thousands of homes.

Warnock and Geschke founded Adobe, a major federal supplier, in 1982. They met in the late 1970s while researching graphic systems and printing at the Xerox Palo Alto Research Center. To bring their ideas to market, Warnock and Geschke launched a company committed to transforming text and images on a computer screen into exact and attractive print reproductions. The firm released the Adobe Portable Document Format (PDF), which enabled users to exchange images of digital documents across computing platforms. The format is now the default standard for many government agencies. Federal workers use other Adobe desktop and server products to collaborate with colleagues and the public.

The other two technology honorees are Forrest Bird, who developed mechanical breathing devices, and Esther Sans Takeuchi, who researches micropower sources at the University at Buffalo.

Obama also named nine researchers on Thursday as recipients of the National Medal of Science, which honors scientists and engineers.

A new report released by the American Library Association (ALA), "Libraries Connect Communities 3: Public Library Funding & Technology Access Study 2008-2009," shows that public libraries, even as they aim to increase access to computers and the Internet, are struggling to keep up with surging demand during the economic downturn.

Broadband funds from the recent stimulus package may provide a boost, but the authors of the report also encourage libraries to do a better job explaining to the public the services they offer.

“The public library community needs new models for deploying and managing public access technology—especially around broadband. Increasing broadband at the front door may not always provide significant increases at the workstation,” said Charles McClure, co-principal investigator for the study and director of Florida State University’s Information Institute. “Strategies may include expanding the role of consortia and increasing community and government partnerships to leverage economies of scale and meet community needs in concert.”

In addition to ALA, the study was conducted by the Center for Library & Information Innovation at the University of Maryland (UMCP) and the Information Institute at Florida State University (FSU). It was funded by the ALA and the Bill & Melinda Gates Foundation.

Crucial role
With more than 71 percent of all libraries (and 79 percent of rural libraries) serving as the only source of free Internet access in their communities, libraries remain crucial to those seeking jobs and needing to connect to e-government—indeed, at least five states require that those filing for unemployment insurance do so online.

The study says that 14.3 percent of public libraries decreased their operating budgets in FY2009. Only 38 percent of libraries report budget increases at or above the rate of inflation. More than half (53 percent) of the state library agencies that provide state funding to public libraries report declining state funding in FY2009, according to questionnaire responses from the Chief Officers of State Library Agencies (COSLA).

Capacity challenges
The report finds that more than 81 percent of libraries say they have insufficient availability of workstations some or all of the time. More than 94 percent of libraries have imposed time limits on their workstations.

Some 77.4 percent say that cost factors limit their capacity to add public access workstations or laptops, while 75.9 percent cite space limitations and 34 percent report inadequate building infrastructure.

Some 38.2 percent of public libraries don't have a replacement or addition schedule for their public access computers. While 42.3 percent of libraries—and 72.2 percent of urban ones—support their IT with system-level IT staff, only 28.7 percent of rural libraries have access to such support.

Connection speed
Nearly 60 percent of libraries say Internet connection speeds are insufficient to meet needs at some point in the day—a slight increase from 57.5 percent in the previous survey, released last year.

Some 44.5 percent report Internet connection speeds greater than 1.5 Mbps, a significant rise from 2007–2008 (25.7 percent), but the growing use of high-bandwidth applications continues to strain libraries. Moreover, about one-third of rural libraries have connection speeds below 1.5 Mbps, compared with 7.1 percent of urban libraries and 16 percent of suburban public libraries.

While 23 percent of libraries say cost kept them from improving bandwidth, even more (26 percent) say higher-speed service simply isn't available.

More than 76 percent now offer free Wi-Fi access, up from 66 percent last year.

Hardware questions
For the first time in the history of the multi-year survey, libraries reported a decline in new (less than one-year-old) public access computers, and 61 percent of libraries reported no plans to add computers in the coming year.

A decline in technology spending anticipated for FY2010 could mean that the decrease in computers recorded in fall 2008 (when the survey was completed) will continue into next year, the report says.

Looking ahead
Data from the next study will be collected this fall, as the recession continues to limit budgets and as libraries likely get a piece of the $7.3 billion in broadband stimulus funds, notably the $200 million—at minimum—set aside for public access centers, including public libraries.

The authors suggest a number of actions to help improve the public library’s public access computing and information technology infrastructure. They advise libraries to:

* Document the range and extent to which public access computing services, resources, and programs are used.
* Increase local community awareness of the importance of the public library and Internet-based services.
* Engage in a carefully developed assessment of broadband capacity needs and develop a plan to obtain and use additional capacity.
* Establish a plan to document the impacts and outcomes from [stimulus-funded] broadband capacity increases.
* Rethink delivery and organization of public access computing services, resources and programs.

“This rethinking process includes expanding the role of consortia and increasing collaborations and partnerships that can better leverage economies of scale, while maintaining or increasing the quality of network-based services,” the report states. “Examples include cooperative broadband purchasing or a statewide e-government Web portal of resources, services, training and related programs. Such a Web portal could be jointly developed among public libraries, state and local government that would be available to all public libraries in the state, rather than developed piecemeal by individual libraries."

Ílhavo, Portugal – 18/09/2009 – SADIF Investment Analytics announces a new summary due diligence report covering Wiz Information Technology Co., Ltd. (38620). The report uses SADIF's powerful StockMarks™ stock rating system and contains important analysis for any current or potential Wiz Information Technology Co., Ltd. investor.

Report Summary: Wiz Information Technology Co., Ltd. is an above-average-quality company with a neutral outlook. It has strong business growth and is run by efficient management. When compared to its closest peer, Kukdong Oil & Chemicals Co., Ltd., Wiz Information Technology Co., Ltd. shows greater undervaluation and is equally likely to outperform the market.

The 8-page report breaks down the Total StockMark into its three components – business, management and price – performing an in-depth analysis of Wiz Information Technology Co., Ltd. for long-term investors.

The report has been distributed to Reuters and forwarded to Yahoo Finance and FT.com. It is available under 'Analyst Reports' on these websites, from multiple professional platforms including Reuters Knowledge, TheMarkets.com, Thomson Research and Capital IQ, or directly from SADIF-Investment Analytics at:
www.sadifanalytics.com/stockmarks/company.php?ticker=38620&cod_c ..

About SADIF-Investment Analytics:
SADIF-Investment Analytics is an independent investment research company covering sixteen different markets and over 12,000 companies. The StockMarks™ system is based on proven investment principles and is designed to drive long-term shareholder returns.

President Obama's address to Congress on health-care reform overlooked one of the most important issues: the poor state of health information technology.

Last week, a 62-year-old woman, whom we will call Mrs. B, came into our office complaining of shortness of breath. She also mentioned a history of severe hypertension, coronary artery disease and dialysis-dependent kidney failure. We discovered that she had been admitted several times in the past year to five different area hospitals. Beyond these bare facts, we had no other information. We had no reliable details of her recent testing, treatment or medications. She could not recall the names or dosages of her sixteen pills, and she knew that she was severely allergic to a certain heart medicine, but she couldn't remember its name, either. We were understandably reluctant to prescribe new medications or therapies without obtaining her recent records.

Mrs. B's situation is all too common. Information is fragmented and not readily accessible. Even the most prepared patient carrying copies of previous medical records is handicapped by the difficulty in deciphering handwriting and medical notations. It is common for duplicate tests to be ordered, increasing health-care costs by perhaps 15 percent or more.

Most currently available electronic medical record software is unwieldy and difficult to navigate quickly, and there is still no vehicle for the timely exchange of critical medical data between providers and facilities. The stimulus bill included $50 billion to promote uniform electronic record standards, but it will be difficult and costly to construct new systems ensuring interoperability of all current hospital software.

A cheaper and more effective solution is to adopt a standard electronic record-keeping system and ask that all health information software interface with it. In fact, a proven system already exists: the Veterans Health Information Systems and Technology Architecture (VistA), developed by the Veterans Affairs Department. VistA requires minimal support, is absolutely free to anyone who requests it and is far more user-friendly than its counterparts, and many doctors are already familiar with it.

Numerous scientific studies have concluded that using the system leads to fewer medication and allergy errors, increased utilization of cost-effective preventive-care measures, and decreased duplicate testing. In fact, VistA has been so successful that it is now used by numerous organizations, states and 13 countries worldwide.

If VistA had been available in our office, Mrs. B could have mentioned her previous treatment providers and given her express permission to access the records. We would have contacted her previous institutions electronically and obtained a complete medical background within seconds, rather than spending several hours questioning her and calling her previous providers.

The Obama administration's push to create an electronic patient record for every American has gained steam in Washington, with billions of dollars expected to be spent over the next five years.

But in Maryland, the process is ahead of schedule.

That's because Maryland's three largest hospital systems and a large retirement community operator are building a statewide information exchange network that could be up and running before any federal network. The exchange - Chesapeake Regional Information System for Our Patients, or CRISP - was approved for $10 million in start-up state funding. Its purpose: to let hospitals, insurance providers and health care professionals freely and securely share information about the patients who come through their doors.

"For doctors who don't have a prior record, it could be real helpful to get the discharge summary from the hospital down the street, which can bring them up to speed very quickly on a patient," said David Horrocks, president of CRISP.

A piece of the pie
The focus on health information technology is creating a boon for technology companies nationwide who are seeking a piece of the multibillion-dollar pie. In Maryland, several companies have expressed interest in helping to build the state's network, according to officials familiar with the process.

Proponents of moving to an electronic record format say it makes sense for the patient, whose records and treatment history could theoretically be accessed at any hospital or doctor's office. Electronic medical records can be more efficient for medical staff and patient tracking and billing, helping to reduce the clerical work needed to maintain large filing systems.

For one, hospitals and insurance companies hope that easily accessible records will eliminate the need for duplicative and costly diagnostic tests.

"Health care represents some of the most advanced digital technology humankind has ever created," said Todd Johnson, president of Fells Point-based Salar Inc. "But the information flow is often very choppy and obsolete…. Hospitals are more and more ready to tackle some of these hurdles."

With nearly 20 employees and a 10-year track record, Salar makes software that enables physicians, nurses and other medical staff to input their notes directly into a database that essentially creates "electronic paper" that's easily managed by its users. The software fulfills the electronic physician documentation requirement that, at the national level, is scheduled to take effect in 2013.

"On the one hand, that's four years from now," said Johnson, whose company's revenues are up 30 percent in the past year and has been hiring recently. "On the other hand, it's right around the corner."

But building such a system, particularly one that's accessible nationally, involves at least two big hurdles: cost and security. Historically, doctors and hospitals have been reluctant to spend money on electronic systems with no immediate benefit in sight. And the need for tight online security of electronic patient records is of paramount concern for the public.

As part of the $787 billion economic stimulus plan, about $19 billion is being set aside to help the nation's hospitals and physicians' offices make the transformation by 2014. Last month, Vice President Joe Biden announced a $1.2 billion early injection into the effort. Half of the funding will go toward establishing centers that will help hospitals manage the technical aspects of upgrading their systems, while the rest will facilitate information-sharing between hospitals across the U.S.

Before billions of dollars can be spent nationwide, the federal government and industry groups are still hammering out a consensus on what such a national network will look like, how it will function and how secure it will be. Building a national electronic patient record exchange network is a major challenge, fraught with varied technology hurdles and steep financial costs, and requiring uniform standards across a patchwork of states, industries, and public and private medical providers.

According to a study published in the New England Journal of Medicine in March, only 1.5 percent of nearly 3,000 non-federal hospitals in a survey have a comprehensive electronic health record system. Estimates range widely as to the expected annual savings if patient records move completely online, but it could be as high as $77 billion.

In Maryland, the climate appears ripe for building a network that would address the needs of patients, doctors and hospitals in the state. A House bill passed this year sets up a framework for giving hospitals and physicians financial incentives to implement electronic patient records systems by 2015. The Maryland Hospital Association, which represents 67 hospitals, advocated for the bill, which would require state-regulated insurers to pass incentives on to hospitals and physicians so they can upgrade from paper to electronic records.

"Ultimately, electronic health records are going to enhance patient care, and bottom line, that's what hospitals want," said Jessica Jackson, a spokeswoman at the association.

Maryland's three largest hospital systems - Johns Hopkins Medicine, MedStar Health and the University of Maryland Medical System - along with Erickson Retirement Communities, are leading the effort to build CRISP, which will use its $10 million in state funding to build the exchange over the next two to five years.

Horrocks, the CRISP president, said the exchange will likely be built before any federal network is built. Nevertheless, the goal is to make the state network compatible with the federal one.

President Obama Honors IBM's Blue Gene Supercomputer With National Medal of Technology and Innovation

Eighth time IBM has received nation's most prestigious tech award; Blue Gene has led to breakthroughs in science, energy efficiency and analytics

WASHINGTON, Sept. 18 /PRNewswire-FirstCall/ -- President Obama recognized IBM (NYSE: IBM) and its Blue Gene family of supercomputers with the National Medal of Technology and Innovation, the country's most prestigious award given to leading innovators for technological achievement.

President Obama will personally bestow the award at a special White House ceremony on October 7. IBM, which earned the National Medal of Technology and Innovation on seven other occasions, is the only company recognized with the award this year.

Blue Gene's speed and expandability have enabled business and science to address a wide range of complex problems and make more informed decisions -- not just in the life sciences, but also in astronomy, climate, simulations, modeling and many other areas. Blue Gene systems have helped map the human genome, investigate medical therapies, safeguard nuclear arsenals, simulate radioactive decay, replicate brain power, fly airplanes, pinpoint tumors, predict climate trends and identify fossil fuels -- all without the time and money that would have been required to physically complete these tasks.

The system also reflects breakthroughs in energy efficiency. With the creation of Blue Gene, IBM dramatically shrank the physical size and energy needs of a computing system whose processing speed would otherwise have required a dedicated power plant capable of supplying power to thousands of homes.

The influence of the Blue Gene supercomputer's energy-efficient design and computing model can be seen across the information technology industry. Today, 18 of the top 20 most energy-efficient supercomputers in the world are built on IBM high-performance computing technology, according to the latest Supercomputing 'Green500 List' announced by Green500.org in July 2009.

About IBM
For more information, visit http://www.ibm.com/smarterplanet

Contacts:

Michael Loughran
IBM Media Relations
914-945-1613
mloughra@us.ibm.com

Steven Eisenstadt
IBM Media Relations
914-766-8009
saeisens@us.ibm.com


SOURCE IBM

FAU has received a five-year grant from the NSF to create a site for the Center for Advanced Knowledge Enablement (CAKE), which will provide a framework for interaction between university faculty and industry to pursue advanced research in information technology. Research carried out by CAKE will fortify other areas of science and technology and provide new capacity for economic productivity.

The main objective of CAKE is to develop long-term partnerships among industry, academia and government. The center will feature high-quality industry research, strong industrial support and collaboration in research and education, and direct transfer of university-developed ideas, research and technology to U.S. industry. FAU's center will operate jointly with the FIU center formed last year.

“This is a win-win situation for both our university, and our industry and government partners,” said Dr. Borko Furht, chair of the department of computer and electrical engineering and computer science and director of CAKE at FAU. “This grant provides FAU with the opportunity to conduct industrially relevant research, receive additional seed funding, and moreover, benefit from the recognition and prestige of being an NSF research center.”

Affiliation with and membership in FAU’s CAKE is open to industry, government agencies and others with research needs. The center will provide its partners with numerous benefits, including early access to research innovations and opportunities to interact and work with faculty, students and industry peers. In addition, the center provides a platform to leverage research and development investments with multi-university centers renowned for their innovative research capabilities. There are currently 50 NSF-sponsored industry/university cooperative research centers in the U.S.

“The new center represents the combined efforts of FAU and FIU researchers, and now has the critical mass to serve the information technology industry in South Florida and help it to mature into the top tier,” said Dr. Naphtali Rishe, director of FIU’s I/UCRC-CAKE and Inaugural Outstanding University Professor.

The research agenda for FAU’s CAKE includes new technologies for various Web-based applications, video compression and communication, next-generation hardware/software development techniques and tools for mobile devices, RFID-based automation systems, and others. Several companies have already made a commitment to join the Center, including LexisNexis, ProntoProgress, RealNetworks, PartnerCommunity, Motorola iDEN Mobile Devices, Vivaja Technologies, Ingenious Software, Ineoquest and Evolux Transportation.

“This center is a great opportunity for South Florida’s high-tech community,” said Jaime Borras, former corporate vice president of Motorola and current chairman of Mobile Technology Consortium, which consists of 50 high-tech companies in South Florida. “The objective to bridge academia, industry, and government by providing state-of-the-art and economic applied research through the center is something desperately needed in this region.”

FAU’s center is catalyzed by an investment from the NSF that will range from $307,000 to $750,000 over five years; the center will be supported primarily by its members. The NSF will assume a supporting role in the development and evolution of the center and its members.

Members will have access to and benefit from the results of all of the Center’s projects funded by the membership fees. “The structure and format of the membership program creates a multiplier effect,” said Furht. “For example, a member paying a fee of $24,000 will be able to benefit from at least $300,000 worth of research.”
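
The arithmetic behind that multiplier is straightforward pooling: each member's fee funds shared projects whose results every member can use, so the value available to any one member scales with the size of the membership. The sketch below works through the numbers in Python; the pooled-fee model, the even five-year split of the NSF award and the member counts are illustrative assumptions, while the $24,000 fee and the roughly $300,000 benefit figure come from the article.

# Back-of-the-envelope illustration of the pooled-fee "multiplier effect"
# quoted above. The pooled model, the even five-year split of the NSF award
# and the member counts are assumptions; only the $24,000 fee and the
# roughly $300,000 benefit figure come from the article.

MEMBER_FEE = 24_000                # annual membership fee cited by Furht
NSF_SEED_PER_YEAR = 307_000 / 5    # low end of the NSF award, assumed spread evenly over five years

def pooled_research_value(member_count: int) -> float:
    """Research funding one member can draw on in a year if all fees and the
    NSF seed money flow into shared projects open to every member."""
    return member_count * MEMBER_FEE + NSF_SEED_PER_YEAR

for members in (5, 10, 13):
    value = pooled_research_value(members)
    print(f"{members:>2} members -> ${value:,.0f} of shared research "
          f"({value / MEMBER_FEE:.1f}x a single member's fee)")

At roughly ten members under these assumptions, the shared pool already exceeds the $300,000 figure Furht cites.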

Through an intra-government transfer mechanism facilitated by the NSF, FAU’s CAKE can accept funds from any government agency and will only charge a nominal overhead fee. Funds can be directed to any aspect of CAKE’s research, including work that may be beneficial to its industrial members. Furthermore, funds can also be used for collaborative research with industrial members who serve as sub-contractors.

The research that will be conducted by CAKE is applicable to many fields, including defense, homeland security, healthcare and biomedical science, environmental science, finance and technology services.

“Through partnerships and collaborative research at our center, we can examine, identify and address the root of various problems or issues which are of interest to multiple industry members,” said Furht.

Co-principal investigators of FAU’s CAKE include Drs. Hari Kalva, associate professor; Abhijit Pandya, professor; Shihong Huang, assistant professor; Ankur Agarwal, assistant professor; and Ionut Cardei, assistant professor, in the department of computer and electrical engineering and computer science at FAU.

For more information, contact Dr. Borko Furht at bfurht@fau.edu or (561) 297-3486.

- FAU -

About Florida Atlantic University:
Florida Atlantic University opened its doors in 1964 as the fifth public university in Florida. Today, the University serves more than 27,000 undergraduate and graduate students on seven campuses strategically located along 150 miles of Florida's southeastern coastline. Building on its rich tradition as a teaching university, with a world-class faculty, FAU hosts ten colleges: College of Architecture, Urban & Public Affairs, Dorothy F. Schmidt College of Arts & Letters, the Charles E. Schmidt College of Biomedical Science, the College of Business, the College of Education, the College of Engineering & Computer Science, the Harriet L. Wilkes Honors College, the Graduate College, the Christine E. Lynn College of Nursing and the Charles E. Schmidt College of Science.

Local Tech Wire

RALEIGH, N.C. – North Carolina has a new chief information officer.

Gerald Fralick, a veteran technology executive from Chapel Hill, is replacing George Bakolia, Gov. Bev Perdue announced Thursday.

Bakolia, the state’s CIO since 2002 and an appointee of former Gov. Mike Easley, will serve as senior deputy CIO, Perdue said in a statement. That deputy post had been vacant.

“Jerry and George have years of experience and North Carolina will continue to be a technology leader under their direction,” Perdue said.

Fralick, a former CIO at the Office of Justice Programs for the U.S. Department of Justice, is president and owner of JFCS. The company, a service-disabled veteran-owned business, provides marketing services to information technology firms.

A graduate of LeMoyne College in New York with a degree in economics, Fralick earned a Master’s degree in computers and management of IT at American University.

Bakolia was CIO of the North Carolina Department of Justice before taking the state CIO spot.

The N.C. CIO directs the N.C. Office of Information Technology Services.

Did you think the economic crisis of the last 18 months was going to pass without impacting the geospatial marketplace? Was there some element of naiveté that perhaps led us to expect that geospatial would be left unscathed by such disruptive financial upheaval, and that somehow growth would continue unabated? And, if one wanted to assess the effect of the vicissitudes of the economy, how accurate would those measurements be?

While there has been modest growth by geospatial technology solution providers, according to at least some recent market research reports (see the Daratech GIS/Geospatial Survey), we may be entering a new era in which it is impractical to take the pulse of GIS because of an increasingly bifurcated market of sellers and buyers and inadequate market definitions. Where once we had the big four in geospatial (ESRI, Intergraph, MapInfo, Autodesk), there are now so many choices, including developer platforms (e.g., APIs) and open-source end-user and developer options, that it is difficult to keep score. The needs of the information technology community have moved from buying point solutions to a more customized approach to software, data and services. Specifically, Web 2.0 business models mandate new technology delivery platforms, such as mashups and cloud computing, that accelerate development cycles, deliver solutions faster and reduce the total cost of ownership.
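
In this context a "mashup" simply joins a hosted geospatial service with local business data instead of standing up a full GIS stack. The short Python sketch below illustrates the pattern; the feed URL, the GeoJSON property names and the CSV layout are hypothetical placeholders, not any particular vendor's API.

# Minimal mashup pattern: pull point features from a hosted geospatial service
# and join them with local business data. The endpoint URL, the GeoJSON
# property names and the CSV layout are hypothetical, for illustration only.
import csv
import json
import urllib.request

FEED_URL = "https://example.com/stores.geojson"   # hypothetical hosted feed

def load_store_locations(url: str) -> dict:
    """Return {store_id: (lon, lat)} from a GeoJSON FeatureCollection."""
    with urllib.request.urlopen(url) as resp:
        features = json.load(resp)["features"]
    return {f["properties"]["store_id"]: tuple(f["geometry"]["coordinates"])
            for f in features}

def load_revenue(path: str) -> dict:
    """Return {store_id: revenue} from a local CSV with columns store_id,revenue."""
    with open(path, newline="") as fh:
        return {row["store_id"]: float(row["revenue"]) for row in csv.DictReader(fh)}

def mashup(url: str, csv_path: str) -> list:
    """Join the two sources so revenue can be mapped or analyzed by location."""
    locations = load_store_locations(url)
    revenue = load_revenue(csv_path)
    return [{"store_id": sid, "lon": lon, "lat": lat, "revenue": revenue.get(sid)}
            for sid, (lon, lat) in locations.items()]

if __name__ == "__main__":
    for row in mashup(FEED_URL, "store_revenue.csv"):
        print(row)

The point is the delivery model rather than the code itself: nothing here requires a desktop GIS license or a local spatial database, which is why such integrations tend to shorten development cycles and lower the total cost of ownership.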

How, then, do we measure the market for geospatial technology and location intelligent solutions? Is geospatial technology considered merely part of the broader business analytics sector? If not, how are the geospatial technology providers perceived and measured by those conducting market research for the broader enterprise computing sectors? Is it even possible to take the pulse of this increasingly fragmented market?

At Home Among the Other IT Giants?
For these answers, let's look at some recent reports from the leading market research firms IDC, Daratech and Ventana Research, which are familiar with the geospatial and location intelligence technology sectors. IDC recently ranked companies in the business analytics market (source: Intelligent Enterprise, Aug. 24, 2009). The top ten included Teradata, Informatica and MicroStrategy, names familiar to many in the data warehouse and business intelligence markets. "The top 10 vendors account for 66% of the software revenue from business analytics, leaving the remaining 34% a competitive battleground for hundreds of independent software vendors worldwide," IDC reported. But both ESRI and Intergraph were ranked just below the top ten. This serves as a data point: perhaps geospatial technology, as an ever-increasing necessity in decision making and in measuring business performance, has found favor with the market researchers watching the industry, not to mention those doing the actual IT procurement.

"IDC has included spatial information vendors in its broad business analytics practice for several years," said IDC analyst David Sonnen when asked about the IDC report. "While companies like ESRI, Intergraph and PBBI do a lot more than just analytics, some of their software does include sophisticated analytic capabilities. So, IDC tracks spatial companies as part of the overall business analytics market. Also, most business analytics companies include spatial capabilities in their analytic platforms. It's hard to track exact revenue figures for features in a software platform, but IDC does think it's important to recognize the role that spatial capabilities play in enterprise markets." Spatial analytics is just one of IDC's nine business analytics segments, according to Sonnen. "IDC uses ‘business analytics' as a broad category that includes analytic applications, business intelligence tools, data warehousing platform software and spatial information analytics tools. We divide some of those categories into subsegments, like financial analysis software and CRM."
.
Describing how IDC is able to develop a consistent methodology of reporting on the spatial analytics market Sonnen said, "All the analyst firms have their own taxonomies for the markets they cover. Those taxonomies are somewhat arbitrary, but they give the analyst firms a consistent structure for tracking revenue and trends."

Numbers
The market growth numbers tell the real story of why geospatial is more than a blip on the radar screens of researchers. Daratech recently released its GIS/Geospatial Industry Report, in which the company reported: "GIS/Geospatial industry worldwide growth is forecast to slow to 1%, down from 11% in 2008 and a whopping 17.4% in 2007... However, industry CEOs interviewed by Daratech were unanimous in their belief that growth consistent with the robust 11% CAGR (compounded annual growth rate) of the past six years would return in 2010." While these numbers reflect the current economic malaise, do they capture the market in total? Do they account for service revenue from those companies that have built applications with the aforementioned APIs, and do they accurately assess the impact of another big four: Microsoft, Oracle, IBM and Google?
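
For readers unfamiliar with the term, the CAGR figure Daratech cites compounds rather than adds. The small Python sketch below makes the arithmetic explicit; the starting index value is an arbitrary assumption, and this is an illustration of the formula, not Daratech's methodology.

# What an 11% compounded annual growth rate implies over six years.
# The starting value of 100 is an arbitrary index; only the 11% rate and
# the six-year span come from the article.
CAGR = 0.11
YEARS = 6

start = 100.0
end = start * (1 + CAGR) ** YEARS      # compounding, not simple addition
print(f"Index after {YEARS} years: {end:.1f} "
      f"({(end / start - 1) * 100:.0f}% cumulative growth)")

# The rate can be recovered from the endpoints the same way:
recovered = (end / start) ** (1 / YEARS) - 1
print(f"Recovered CAGR: {recovered * 100:.1f}%")

Compounded over six years, an 11% rate implies a market roughly 87 percent larger than at the start of the period, which is why a forecast slowdown to 1% stands out so sharply.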

A Revolution in Geospatial for the Enterprise and the Masses
The "new" big four are now key players in geospatial analytics. In addition, there are so many consumer mapping products on the Web and on the shelves of retailers that these products have, in the words of Dave DiBiase of Penn State, "made geography...ordinary, which is the most extraordinary thing of all." This gang of four has revolutionized, popularized and educated business professionals in location intelligent solutions. The question now: Just how much has their influence changed the geospatial market and added to the overall market growth? The growth numbers associated only with sales of geospatially related technology products (e.g. Oracle Spatial) are hard to come by since those companies do not segment their geospatial lines of business.

Moreover, while traditional geospatial markets in local and state government as well as the regulated markets of telecommunication and utilities still buy complete GIS solutions, business analytics players like Information Builders, Netezza and others can also just as easily provide targeted solutions for whatever small part of the workflow requires a more location intelligent view of the world. While spatial is still "special" to many, solutions that provide geospatial analysis and data management are not relegated to just a few software companies anymore. So, can we still rely on the standard definitions of "traditional GIS" to accurately capture the size of the overall market?

"The scope of software that is location aware has grown geometrically in the past five years," says Charles Foundyller of Daratech, author of the most recent report issued by the company. "Nevertheless, there is still a need to understand the market presence, on a consistent basis, of the underlying technologies as represented by what we call Traditional GIS. In short, Traditional GIS are products and services related to computerized database management systems used for the capture, storage, retrieval, manipulation, analysis and display of spatial (or locationally-defined) digital data capable of responding to geo-relational queries."

In a Web 2.0 world that "mashes up" and leverages the "cloud" for storage, retrieval and management of geospatial data, the so-called traditional GIS vendors might find total disruption if they continue to follow the traditional model, as defined by Daratech. The new model that is emerging, especially with the proliferation of APIs, is one that rapidly develops targeted solutions on the fly and eschews thick client applications. "The use of geography and location related information and technology is a contributor of business innovation and breakthrough performance," said Mark Smith, Ventana Research's executive vice president of research, in a 2008 report. The research in which Ventana has been engaged over the last several years has found "viral use of consumer mapping technology like Google in business, but organizations are challenged in utilizing it to deal with the volumes and frequency of data that needs to be integrated." It's that "viral use" that is fueling geospatial technology growth, and managing geospatial information is, for now, largely handled by the database providers.

We are at the threshold of a new business environment, both economically and technologically, that requires new ways to measure the growth of the market. The "viral" expansion of location-enabled solutions, coupled with new data sources (see news about the upcoming DigitalGlobe WorldView-2 launch), makes market research a near-impossible task. Though our market research friends must continue to work within their own frameworks and "taxonomies," the market's fluidity and cast of new players create challenges. Revenue from services is increasing as we see fewer desktop solutions delivered and more cloud computing solutions offered, for example. As such, some of the recent market research numbers may be incomplete. The scope of market research must expand if it is to capture the true breadth of the geospatial technology marketplace.