Apple’s weathered the economic downturn like almost no other company, pulling in massive profits without having to resort to deep price cuts. But what’s its secret?
Over the past few weeks I’ve been asking a group of Mac/Apple fans why they thought Apple was doing so well. Here are four possible reasons for Apple’s success:
* Constantly improving, innovative product line
Apple doesn’t have a huge product line, but the company is constantly tweaking it. Most of the product updates are well thought out and offer the end user extra value. This is attractive to both new and existing customers.
* Advertising
Apple has a massive ad budget, which lets it spread the word to a huge audience. When it comes to Macs, this allows Apple to go aggressively after disillusioned PC users. Those funny Mac vs. PC ads are serious business.
* Customer satisfaction
Apple consistently scores very high in customer satisfaction surveys (the company usually tops the list). Happy customers not only re-buy, but tell others that they are happy, therefore generating further sales.
* Vista sucked
Vista was widely considered a failure by consumers, and that handed users to Apple on a plate. For anyone who was sick of Vista, or simply wanted to give the OS a wide berth, Apple was the obvious choice.
Google now gives users access to a subset of Google Voice features if they want to simply use the service without getting a brand-new “Google Number”. Mobile phone users can now forward incoming calls to their Google Voice mailbox rather than the one hosted by their carrier.
The features available to these users basically include voicemail and the ability to make low-priced long-distance phone calls, according to the official Google Voice blog:
* Online, searchable voicemail
* Free automated voicemail transcription
* Custom voicemail greetings for different callers
* Email and SMS notifications
* Low-priced international calling
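Under the hood, sending missed calls to Google Voice instead of the carrier's mailbox is typically done with standard GSM conditional call forwarding. As a rough sketch (the service codes below are the standard 3GPP MMI codes; the exact steps Google recommends vary by carrier, and the phone number is made up):

```python
# Build GSM supplementary-service dial strings for conditional call
# forwarding (service codes from 3GPP TS 22.030). Forwarding the "busy",
# "no answer", and "not reachable" cases to a Google Voice number sends
# missed calls to Google's voicemail instead of the carrier's.
FORWARD_CODES = {
    "busy": "67",
    "no_answer": "61",
    "not_reachable": "62",
    "all_conditional": "004",  # covers all three conditions at once
}

def forwarding_string(condition: str, target_number: str) -> str:
    """Return the MMI string dialed to register and activate forwarding."""
    return f"**{FORWARD_CODES[condition]}*{target_number}#"

# e.g. forward every conditional case to a (hypothetical) Google Voice number
print(forwarding_string("all_conditional", "14045551234"))  # **004*14045551234#
```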
For those of you who want to use Google Voice to its full potential — you’re going to have to take the plunge and get a brand new number. The features, if you choose to go this route, are much more interesting and useful:
* One number that reaches you on all your phones
* SMS via email
* Call screening
* Listen In
* Call recording
* Conference calling
* Call blocking
Once Google fully supports number portability, perhaps they will be able to allow users to simply use one of their existing phone numbers with the full service — but I’m not going to hold my breath for that just yet.
Will you be using the new voicemail service from Google Voice?
There is a common saying that only lawyers and solicitors need to get acquainted with the various aspects of the law, and that they must be well informed or their profession would come to an end. But can this be accepted as true? Shouldn't ordinary people be familiar with the law as well?
What is Information Technology Act 2000 (ITA-2000)?
Well, if you live in India and are something of a PC freak, you should be aware of the Information Technology Act 2000 (ITA-2000). It is an Act of the Indian Parliament (No 21 of 2000), notified on October 17, 2000. It is worth mentioning that the United Nations General Assembly, by resolution A/RES/51/162 dated 30 January 1997, adopted the Model Law on Electronic Commerce prepared by the United Nations Commission on International Trade Law, referred to as the UNCITRAL Model Law on E-Commerce. What is more, the Information Technology Act 2000 Amendment Bill 2006 was passed by the Indian Parliament on December 23, 2008.
Is there any new proposal?
It has to be stated that the Government of India proposed major amendments to ITA-2000 in the form of the Information Technology (Amendment) Bill, 2006, which was passed by the Cabinet Committee of the Government of India and prepared for discussion before the Indian Parliament.
Nevertheless, substantial developments have taken place since then, and the bill is now known as the Information Technology (Amendment) Bill, 2008. Although some see similarities with the 2006 Bill, it is a different Bill altogether and has been approved by both the Rajya Sabha and the Lok Sabha. Is there any remaining hurdle? The Bill still awaits the President's assent and formal notification.
As noted, many changes have been included, and the Bill incorporates the recommendations made by the Parliamentary Standing Committee. There is another wrinkle, however: in the Indian system, until a Bill is formally notified by the Executive, it remains a mere Bill. For this reason, until the Government of India notifies it, the old Information Technology Act, 2000 will continue to govern Indian cyber law.
Has there been any criticism?
What surprises many is the lack of media attention the amendment has received. Moreover, the amendment was passed in 26 minutes on December 22, 2008, along with 4 other bills, and another 8 bills were passed in just 17 minutes the next day. That suggests there was hardly any debate on what should have been highly contentious laws. Even Karnika Seth, the renowned cyberlawyer and chairperson of the Cyberlaws Consulting Centre in India, finds little cause for optimism in her detailed analysis of the recent amendments to the IT Act, 2000.
And I don’t just mean for geeks. I mean a real, viable alternative to Windows for many users despite the apparent quality of both Windows 7 and Server 2008.
About a year and a half ago, ZDNet’s Adrian Kingsley-Hughes asked, “Is Ubuntu becoming the generic Linux distro?” and concluded that “the evolution of Ubuntu into the generic Linux distro isn’t a bad thing”. Fair enough, but Canonical’s Mark Shuttleworth took this idea a bit farther during a press conference call yesterday:
“We’ve already done a lot of work in developer ecosystem and we’re now increasingly interested in the non-developer consumer ecosystem, so that’s what all the OEM work is about,” Shuttleworth said, declaring that his focus was on “making sure that Ubuntu gets pre-installed and Ubuntu is available from Dell.com and others and making sure that Ubuntu is the default alternative to Windows.”
He didn’t mention Apple, which, to many consumers, is the only alternative to Windows. For all its buzz in the tech world, Linux (or Ubuntu) is hardly a household word. Competing with Apple, though, which already has an impressive ecosystem of hardware and is the reigning king of usability, doesn’t make sense anyway and this ad from Novell would never fly outside of the tech community:
So how can I be so confident that Shuttleworth’s vision of becoming the “default alternative”, and not just the default Linux for those geeky enough to try it, will become a reality? Because he very clearly tied it to a vision of platform. If Ubuntu can work well on every device users encounter (including non-Intel smartbooks and other new classes of portable devices that will be emerging in the next couple of years, displacing notebooks for many consumers), then name recognition will follow.
Obviously, the PC space is dominated by Windows. Yet no matter how spiffy Windows 7 is (and even Shuttleworth acknowledged that it was a good OS, worthy of competing with Ubuntu), Vista taught us all a lesson (consumers and techies alike). There are alternatives to the latest and greatest from Microsoft, even if that’s Windows XP. We don’t have to upgrade.
This “PC space” is changing, though. Windows Mobile stinks. Microsoft has no plans to develop Windows on ARM platforms. The cloud is here, not because of the economy, but because of the value businesses perceive in it. Canonical is actively developing in all of these spaces, and its latest, highly polished OS (available Thursday) shows off many of these technologies.
What forced Microsoft to crank out its best OS in years (some might say its best ever, and certainly the most stable prior to a service pack or two)? Competition. Competition from Apple, certainly, but also a growing awareness of open source concepts in general. Many artists are releasing DRM-free music (and still making money). Books are widely and freely available. Content is everywhere, much of it for free. Something that you pay for, then, like Windows, had better be a heck of a lot better than its free alternatives. Competition is our friend, whether we're consumers, pro users, or CIOs.
Microsoft may very well continue to dominate the desktop PC space. However, a quick look around at the variety of ways people access online content and cloud-based resources suggests that the importance of the desktop PC as we know it is diminishing. Ubuntu is ready to capitalize on that in ways that the average consumer won't recognize until he or she finds him or herself using Ubuntu on a MID, a netbook, a kiosk, a phone, a virtualized OS, or a smartbook. Can Apple, Microsoft, or any other Linux distributor say that? Competition might be our friend, but a ubiquitous platform is the friend of developers who can start creating the next generation of killer apps, easily ported to whatever screen we might be using.
It’s no secret that there are millions of digital photos that never see the light of day. They languish on hard drives, flash memory cards, photo CDs, and other digital media, never to be printed or shared. And though some lucky shots get distributed via photo sharing and social networking sites, the days of snapshot prints that you can pass around to your friends and family are dwindling. Sony Electronics is trying to stem the tide with its new S-Frame DPP-F700 digital photo frame.
The DPP-F700 is an all-in-one digital photo frame that not only displays photos on a 7-inch widescreen LCD, but also prints out 4×6-inch snapshots, using a built-in dye-sublimation printer. And when Sony says all-in-one, it means all-in-one: you can use the device to do some basic photo editing, such as enlarging, reducing, cropping, and adjusting sharpness, brightness, contrast and hue, as well as print out calendars and other predefined image templates.
An automatic sensor rotates portrait- or landscape-format images appropriately and offers multiple playback options, such as single images, thumbnails, or slideshows with 10 built-in transitions.
The frame accepts most flash memory formats — including SD, SDHC, MMC, CompactFlash, xD-Picture Card, and of course Memory Stick Pro and Memory Stick Pro DUO cards — as well as USB input from your PC.
And if you actually just want your digital snaps to languish, there’s a gigabyte of internal storage that automatically downsizes your photos to store up to 2,000 images.
The frame/printer will sell for about $200 when it ships in January and is compatible with SVM-F series photo paper packs for Sony Picture Station printers. The cost of consumables per print varies from about 50 cents a print with the SVM-F40P pack (which includes 40 sheets of 4×6 paper and a printer ribbon for $19.99) to about 30 cents per print with the SVM-F120P (which includes 120 sheets of 4×6 paper and two printer ribbons for $34.99).
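The quoted per-print costs check out against the pack prices, and the internal-storage claim implies roughly half a megabyte per downsized photo. A quick sanity check:

```python
# Sanity-check the quoted cost-per-print figures for the two paper packs.
def cost_per_print(pack_price: float, sheets: int) -> float:
    return pack_price / sheets

svm_f40p = cost_per_print(19.99, 40)    # ~$0.50 per print
svm_f120p = cost_per_print(34.99, 120)  # ~$0.29 per print
print(f"SVM-F40P:  ${svm_f40p:.2f}/print")
print(f"SVM-F120P: ${svm_f120p:.2f}/print")

# The 1 GB of internal storage holding ~2,000 downsized photos implies:
avg_photo_bytes = 1_000_000_000 // 2000  # 500,000 bytes, i.e. ~500 KB each
```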
The White House announced the move in an Associated Press story that somewhat clumsily tried explaining, "the programming language is written in public view, available for public use, and able for people to edit." Debugging and upgrading the site's code "now...can be done in the matter of days and free to taxpayers."
Well, sort of. First of all, Drupal is a program, not a programming language, and second, just because software is available for free doesn't mean that using it is free. It takes time and expertise to install, configure, and maintain software. Indeed, Drupal and Acquia founder Dries Buytaert said in a blog posting announcing the White House's use of Drupal that companies involved in the Web site switch included not just his but also General Dynamics Information Technology, Phase2 Technology, Akamai, and Terremark Federal Group.
And although open-source software in general can offer a tight feedback loop between the programmers creating the software and the people using it, there's no guarantee that debugging and security patches automatically arrive faster or that software is easier to maintain than with proprietary software.
This move is just the sort of thing that can lead to a lot of misunderstandings about the idea of openness, a term that's up there with motherhood and apple pie these days when it comes to values everybody wants to embrace. Don't confuse the fact that Drupal is cooperatively created and debugged in public with the openness of the present administration's government.
This line in the AP story in particular raised my hackles: "Aides joked that it doesn't get more transparent than showing the world (the) code that their Web site is based on."
That's just silly. Drupal-powered blogs and forums can enable online information sharing and public participation in discussions, but that sort of thing can be accomplished with proprietary software as well. Likewise, it's perfectly possible to use open-source software in a system that's locked-down and closed.
That's not to pluck the feather out of Drupal's cap--or indeed out of the caps of Red Hat's Linux operating system, the Apache software hosting the Web site and powering its search, and the MySQL database, all of which also are used in the White House project, according to publisher, tech pundit, and open-source fan Tim O'Reilly.
It's not without reason that open-source software is very popular to power Web properties, including plenty of high-powered ones such as Google and Facebook. The White House's move is an endorsement that could help others--notably the many customers in the federal government itself--feel more comfortable with open-source software.
Today Windows 7 hits the GA or general availability milestone. That means that you’ll be able to pick up a PC with the OS pre-installed on it, or pick up a disc from your favorite virtual or bricks-and-mortar outlet. To celebrate, here are my top 7 favorite Windows 7 features.
#1 - Performance boost
Without a doubt the top Windows 7 feature for me is performance. On every system that I’ve put Windows 7 on, from monster quad core rigs to humble netbooks, I’ve seen a performance boost.
Putting numbers on this performance hike, on key metrics such as boot-up, video encoding and gaming frames per second, the boost over Vista is, on average, in the region of 10%.
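To make that ~10% figure concrete, here is a small sketch; the Vista baseline numbers are hypothetical, chosen only to illustrate how an average 10% improvement plays out across the three kinds of metrics mentioned:

```python
# Hypothetical Vista baselines; only the ~10% average boost is from the post.
def apply_boost(baseline: float, pct: float = 0.10,
                higher_is_better: bool = False) -> float:
    """Apply a pct improvement: times go down, frame rates go up."""
    return baseline * (1 + pct) if higher_is_better else baseline * (1 - pct)

boot_seconds   = apply_boost(60.0)                         # boot drops to ~54 s
encode_minutes = apply_boost(20.0)                         # encode drops to ~18 min
game_fps       = apply_boost(40.0, higher_is_better=True)  # fps rises to ~44
```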
In my opinion, this performance boost is Windows 7’s strongest selling point.
#2 - More restrained UAC
The User Account Control (UAC) experience under Vista was a little like being shot in the face with a shotgun filled with dialog boxes. A single simple action could unleash a barrage of warnings that left many users feeling confused, bewildered and angry.
Under Windows 7, UAC is a little more restrained, limiting prompts to specific actions. Some might argue that this decreases the security it offers, but I think that the “dialog blindness” that the old UAC caused is worse.
Another good side of the new, improved UAC is the fact that users can customize the level of warnings they receive and so set up the system to best suit their needs.
#3 - 64-bit becomes the new default
Microsoft, along with the big OEMs, is pushing the 64-bit flavor of Windows harder than ever. Given that hardware, software and driver support for 64-bit is now at an all-time high, there's very little reason for users not to migrate to 64-bit.
Within a few years I expect the Windows 7 effect to start to erode 32-bit's stranglehold on PCs.
#4 - Improved troubleshooting tools
When users hit a problem, what they want to do is find a solution or fix and get on with their day. To help users accomplish this, Microsoft has incorporated numerous troubleshooters into Windows 7.
No troubleshooter is perfect, but the work that Microsoft has done in Windows 7 will help many users fix problems for themselves without having to resort to tech support or trawling the web for answers.
#5 - UI improvements
No one can say that the Windows 7 UI is revolutionary, but the evolutionary changes that Microsoft has made in this new OS are almost all steps in the right direction. There are two aspects of the UI that have been tweaked:
- Helping users find the applications and documents they want to work with
- Once the user has found what they want, the UI fades into the background and allows the user to get on with things
#6 - Touch support
It’s going to be a while until the built-in touch support incorporated into Windows 7 really takes off, but there’s no doubt that Microsoft’s inclusion of support for touch-screens right into the Windows 7 OS will encourage OEMs to offer more systems with this cool feature.
#7 - XP Mode
I’m not a huge fan of the XP Mode feature available in some editions of Windows 7 that allows users to run XP within a virtual machine from the desktop. However, for those folks with specific bits of software that won’t work on a later OS, then XP Mode does offer a lifeline.
Bonus favorite feature - It’s not Vista. ‘Nuff said!
So, is it a good OS? Yes. In fact, I agree with Ed Bott when he calls the OS “impressive.” However, that said, I can't see any really compelling reason to rush adoption. Take your time; Windows 7 will still be there waiting for you 6 months or a year down the line. Over that time it'll get better, and hardware/driver support will get better, so everyone's a winner.
So, join the party and upgrade now, or wait and upgrade later. Or stick with what you are already running. Or go with a Mac or Linux … The choice is yours.
Microsoft is really turning up the consumer-focused volume on the Windows 7 launch on October 22 — despite the fact that the company makes a lot of money from selling Windows to business users, via volume-license agreements and various other channels.
Why so much attention on retail — where Microsoft admittedly garners the least amount of Windows revenues? Microsoft officials believe if they can win over consumers with Windows 7, these consumers will push their workplaces to move to Windows 7 more quickly. There’s also, undoubtedly, a large helping of Apple envy/fear that’s part of Microsoft’s consumer push.
To kick off the launch activities, Microsoft unveiled on October 21 a number of retail deals for the product that it has forged with some of its PC partners and retail outlets. Microsoft is calling the promotion “7 Days of Windows 7.” Company officials said to expect more Windows 7 deals to be added throughout the coming week and to check back on Windows.com if you're in the market for new hardware, upgrades, support, etc.
Day 1 (October 22) offers include:
* Best Buy: full home technology remodel, handyman included (the Best Buy PC Home Makeover)
* HP laptop, netbook, desktop and monitor package with Windows 7. Geek Squad wireless home network with router and new PC setup is included for $1,199.00
* Dell Studio XPS 13. Save more than $100.00
* Acer AZ5610-U9072 23″ Touch All-in-One (with Windows Touch) for $880.00
* With the Buy a PC, Get a Discounted Upgrade offer, customers who buy a new PC running Windows 7 Home Premium can upgrade a Windows XP- or Windows Vista-based PC they already own with a discounted box copy of Windows 7. This offer will run through Jan. 2, 2010.
* The Windows 7 Family Pack is available tomorrow in select countries while supplies last. With this offer, consumers can buy three Upgrade licenses of Windows 7 for one price.
* The Student Offer begins tomorrow. For a limited time, the Windows 7 Student Offer gives college and university students in the U.S. and select markets worldwide the opportunity to purchase Windows 7 Home Premium Upgrade or Windows 7 Professional Upgrade for a discount.
I’m weighing which Windows 7 machine to buy and am open to suggestions. I’m looking for something that’s lightweight and very durable. (I’m actually considering buying both a netbook and a laptop, making the laptop my primary machine and the netbook what I take on the road.) I don’t care about running games. I don’t want or need touch. I do value battery life and don’t need anything flashy (though something with a little personality would be nice). Any suggestions out there?
One other note: If you're in the New York City area on October 22, feel free to come by our post-launch party. It's at the Antarctica Bar on Hudson and will start around 5 p.m. Lots of Microsoft bloggers — including Ed Bott of ZDNet, Paul Thurrott of the Windows SuperSite, Tom Warren of NeoWin and more — will be there. We're also expecting Most Valuable Professionals, testers, a few brave/crazy Softies and other hangers-on to show up to sample the seven beers on tap. Hope to see you there!
Today is Windows 7 launch day. Here are some launch day offers that might be of interest to you.
Microsoft is introducing a limited-time series of offers known as “7 Days of Windows 7” with amazing deals on hardware, upgrades, support and other options. Watch for new offers to be released daily on Windows.com.
To kick off Day 1 tomorrow, Microsoft is offering PCs that are targeted at simplifying consumers’ lives. There are a host of offers available for customers, including:
* Best Buy. Full home technology remodel, handyman included (the Best Buy PC Home Makeover)
- HP laptop, netbook, desktop and monitor package with Windows 7. Geek Squad wireless home network with router and new PC setup is included. $1,199.00
* Dell Studio XPS 13. All the speed you'll need.
- Simply put, everything you do on your PC will be easier with a fast, high-performing laptop. Save more than $100 on a Dell Studio XPS 13.
* Acer AZ5610-U9072 23″ Touch All-in-One (with Windows Touch)
- Touch capabilities and all-in-one elegance with an integrated PC and monitor
- This stunning all-in-one PC with Windows Touch incorporates intuitive multi-touch technology to bring exceptional high-definition (HD) entertainment to your fingertips. $880.00
With the Buy a PC, Get a Discounted Upgrade offer, customers who buy a new PC running Windows 7 Home Premium can upgrade a Windows XP- or Windows Vista-based PC they already own with a discounted box copy of Windows 7. This offer will run through Jan. 2, 2010.
Other offers available include:
* The Windows 7 Family Pack is available tomorrow in select countries while supplies last. With this offer, consumers can buy three Upgrade licenses of Windows 7 for one low price.
* The Student Offer begins tomorrow. For a limited time, the Windows 7 Student Offer gives college and university students in the U.S. and select markets worldwide the opportunity to purchase Windows 7 Home Premium Upgrade or Windows 7 Professional Upgrade for a significant discount.
The new BlackBerry 9700 from T-Mobile comes as no surprise given all the leaks. I was actually in a T-Mobile store a couple of months ago, and a salesperson mentioned in an offhand manner that it was coming. T-Mobile made it official today, though, and announced that the BlackBerry Bold 9700 would be coming in “time for the holidays”. This is T-Mobile USA's first 3G BlackBerry and it looks sweet. It also has a speedy 624 MHz processor that should really fly.
Other specs include 256MB of Flash memory with a microSD card slot, Bluetooth, WiFi (with support for VoIP calls over the T-Mobile HotSpot @Home network), GPS, 3.2 megapixel camera, touch sensitive trackpad (no trackball on this device), 3.5-mm headset jack, 360 x 480 resolution display and it runs the new BlackBerry OS 5.0. I see on the T-Mobile site there is also a new service called PrimeTime2Go that offers TV viewing on the go.
The BlackBerry Bold 9700 will be available for $199.99 with a 2-year contract.
Windows 7 is impressive. That word is rarely used in the same sentence as “Microsoft” and “Windows” – certainly not in recent years. But it fits here.
Unlike its predecessors, this Windows version feels as if it were designed and built by a single, coordinated team instead of being assembled from interchangeable parts. In daily use, Windows 7 feels graceful and often (but not always) elegant. Although it builds on elements that debuted in Windows Vista, it fixes many usability sins and adds consistency and polish to an interface that had too many rough edges. And some very impressive new capabilities, especially the grossly underrated Libraries feature, offer rewards for digging deeper.
Windows 7 runs smoothly and efficiently on even modest hardware. Remarkably, it reverses the longstanding trend to make Windows bigger. From a standing start, Windows 7 uses less memory, runs fewer services, and consumes less disk space than its predecessor, Windows Vista, and in the 64-bit version it can address about five times more RAM than you can actually stuff onto a single motherboard. This year, anyway.
I’ve already covered the features in Windows 7 extensively. Little in Windows 7 has changed since I wrote What to expect from Windows 7 back in May. If you review the screenshot gallery I assembled for that post, you’ll have a very good idea of how Windows 7 looks and acts today (the sole exception is Windows XP Mode, which has changed significantly from the beta release I looked at in May).
When Windows Vista was released in January 2007, I suggested that most businesses of even modest size and complexity would be wise to heed conventional wisdom and avoid it until Service Pack 1 was ready. I don’t feel compelled to offer that same advice here. The development process for Windows 7 has been steady and deliberate. The Release Candidate code that Microsoft made public last May was arguably more stable and reliable than most recent official Windows releases. As I wrote in What to expect from Windows 7 nearly six months ago:
From a features and capabilities point of view, Windows 7 is essentially done. It’s all over but the process of hunting down bugs, many of them associated with OEM hardware and drivers. In a bygone era, code this stable and well tested might have been released as a 1.0 product, followed six months later by a service pack. Not this year. Microsoft is treating Windows 7 as the world’s most ambitious shareware release ever.
I’m told that 8 million people have been running the Windows 7 Release Candidate. That’s four times the number of people who registered as Windows Vista beta testers during its development process. My gut feeling is that the number of people actually using Windows 7 in recent months is at least an order of magnitude higher than the corresponding head count in the runup to Windows Vista. And based on everything I’ve heard, the overwhelming majority of those who try Windows 7 like it.
So, who should upgrade? And who shouldn’t? As always, I don’t believe in one-size-fits-all recommendations. But for a few categories, the choice is simple:
* If you’re running Windows Vista and gritting your teeth over it, you should upgrade as soon as possible. The relief will be immediate.
* If you’re shopping for a new PC, get one with Windows 7 on it. And if it doesn’t run properly on Day 1, return it and find another. OEMs that do a good job of matching PC hardware to Windows should be rewarded. Those who didn’t learn from the Vista experience deserve to be punished.
* If you’re perfectly happy with the performance of XP and don’t want to relearn established habits, stay put.
* For anyone relying on mission-critical Windows-based apps or specialized hardware, testing trumps any desire to have the latest OS, no matter how well it’s been reviewed.
And if you’re feeling gun-shy about switching, it’s OK to wait. Most people forget that the venerable Windows XP was unpopular and unloved for its first two years in the marketplace. And Windows Vista has matured into a solid, if forgettable OS after many reliability updates and two service packs. Based on that experience, Windows 7 will improve with age.
Yes, there are downsides to the Windows 7 transition. For Windows XP users in particular, the upgrade process is tedious. Licensing is still a confusing mess, especially for small business owners. Drivers are still a potential source of headaches, as I’ve found in recent months.
But its improvements in productivity, security, and reliability make Windows 7 worth those short-term hassles. It is, without question, the most impressive software development effort Microsoft has ever undertaken. For anyone who has chosen Windows – out of preference or necessity – it is an impressive achievement and as close to an essential upgrade as I have ever seen.
Gangs know what encryption is. They are using it in force at the street level, let alone at the very top. RIM's BlackBerries are the ultimate in security for them. Everything is secured and impossible for police to monitor.
RIM's BlackBerry Enterprise Server (BES) is one of the most sophisticated platforms for email and PIN messages. This system used to be the domain of big corporations. No longer. One of the reasons many financial brokerage institutions ban the use of PIN messages is that they can't be tracked. In 2005 this was big news and was reported widely. Canadian Imperial Bank of Commerce (CIBC) and Royal Bank of Canada (RBC) banned their use. Organized crime picked up where the banks left off.
RCMP Insp. Gary Shinkaruk, head of biker gang investigations in B.C., said BlackBerries are “extremely common” among the criminals his unit investigates. “For a lot of groups, it's standard practice,” he said.
The RCMP's legendary motto may be heading for the delete bin; the Mounties may not always get their man after all…
The IT industry is exiting its worst year ever, with worldwide IT spending on pace to decline 5.2 percent, according to Gartner, Inc. Worldwide enterprise IT spending will struggle even more, dropping 6.9 percent. The IT industry will return to growth in 2010, with IT spending forecast to total $3.3 trillion, a 3.3 percent increase from 2009.
Gartner provided the latest outlook for the IT industry during Gartner Symposium/ITxpo, which is taking place here through October 22. While IT spending will increase next year, Gartner cautioned IT leaders not to be overly optimistic.
"While the IT industry will return to growth in 2010, the market will not recover to 2008 revenue levels before 2012," said Peter Sondergaard, senior vice president at Gartner and global head of Research. "2010 is about balancing the focus on cost, risk, and growth. For more than 50 percent of CIOs the IT budget will be 0 percent or less in growth terms. It will only slowly improve in 2011."
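Sondergaard's headline numbers are internally consistent: a 5.2 percent drop in 2009 followed by 3.3 percent growth in 2010 still leaves spending about 2 percent below the 2008 level, which is why recovery waits until 2012. A back-of-the-envelope check:

```python
# Index 2008 worldwide IT spending at 100, then apply Gartner's forecast
# 2009 decline (5.2%) and 2010 rebound (3.3%).
spend_2008 = 100.0
spend_2009 = spend_2008 * (1 - 0.052)   # -> 94.8
spend_2010 = spend_2009 * (1 + 0.033)   # -> ~97.9, still below the 2008 level
print(f"2010 spending vs. 2008 = 100: {spend_2010:.1f}")
```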
The computing hardware market has struggled more than other segments, with worldwide hardware spending forecast to total $317 billion in 2009, a 16.5 percent decline. In 2010, hardware spending will be flat. Worldwide telecom spending is on pace to decline 4 percent in 2009, with revenue of nearly $1.9 trillion; in 2010, telecom spending is forecast to grow 3.2 percent. Worldwide IT services spending is expected to total $781 billion in 2009 and is forecast to grow 4.5 percent in 2010. Worldwide software spending is forecast to decline 2.1 percent in 2009, and the segment is projected to grow 4.8 percent in 2010.
On a regional basis, emerging regions will resume strong growth. "By 2012, the accelerated IT spending and culturally different approach to IT in these economies will directly influence product features, service structures, and the overall IT industry. Silicon Valley will not be in the driver's seat anymore," Mr. Sondergaard said.
From a budget perspective, there are three important items that IT leaders must
consider in 2010:
1 A Shift from Capital Expenditure to Operational Expenditure in the
IT Budget — Concepts such as cloud services will accelerate this
shift. IT costs become scalable and elastic. CIOs need to model the
economic impact of IT on the overall financial performance of an
organization. For public companies, they must show how IT improves
earnings per share (EPS).
2 Impact of the Increased Age of IT Hardware — With delayed
purchases of servers, PCs and printers likely to continue into 2010,
organizations must start to assess the impact of increased equipment
failure rates, and if current financial write-off periods are still
appropriate. Approximately 1 million servers have had their
replacement delayed by a year. That is 3 percent of the global
installed base. In 2010, it will be at least 2 million. “If
replacement cycles do not change, almost 10 percent of the server
installed base will be beyond scheduled replacement by 2011,” Mr.
Sondergaard said. “That will impact enterprise risk. CFOs need to
understand this dynamic, and it’s the responsibility of the CIO to
convey this in a way the CFO understands.”
3 IT Must Learn to Build Compelling Business Cases — 2010 marks
the year in which IT needs to demonstrate true line of sight to
business objectives for every investment decision. IT leaders can no
longer look at IT as a percentage of revenue. CIOs must benchmark IT
according to business impact.
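The server-replacement figures above can be sanity-checked with a little back-of-the-envelope arithmetic. The installed-base size below is not stated directly; it is simply inferred from Gartner's own "1 million is 3 percent" statement:

```python
# Rough sanity check of the server-replacement figures quoted above.
# 1 million delayed servers are said to be 3% of the installed base.
delayed_2009 = 1_000_000
installed_base = delayed_2009 / 0.03           # ~33.3 million servers

# At least 2 million more replacements delayed in 2010.
delayed_2010 = 2_000_000
overdue_by_2011 = delayed_2009 + delayed_2010  # 3 million cumulative

share = overdue_by_2011 / installed_base
print(f"Installed base: {installed_base / 1e6:.1f}M servers")
print(f"Overdue by 2011: {share:.0%} of the installed base")
```

Three million overdue servers out of roughly 33 million works out to 9 percent, consistent with the "almost 10 percent" figure quoted.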
Mr. Sondergaard said three additional topics that were important in 2009 will
continue to dominate IT leaders' agendas in 2010. These three topics include:
* Business Intelligence - Users will continue to expand their investments in
this area, with the focus moving from "in here" to "out there".
* Virtualization - IT leaders should not just invest in the server and data
center environment, but in the entire infrastructure. In 2010, users will create
the cornerstone for the cloud infrastructure. They will enable the
infrastructure to move from owned to shared.
* Social Media - Organizations are starting to scale their efforts in this
space. The technologies are improving and organizations realize this is not only
about digital natives. It's about all client segments, including the one that
will be most significant over the next 10 years: the over-60 generations.
While those topics are key to IT agendas today, Mr. Sondergaard highlighted
three themes that will become important going forward. They include:
* Context-Aware Computing - This is the concept of leveraging information about
the end user to improve the quality of the interaction. Emerging
context-enriched services will use location, presence, social attributes, and
other environmental information to anticipate an end user's immediate needs,
offering more sophisticated, situation-aware and usable functions.
* Operational Technology (OT) - OT is devices, sensors, and software used to
control or monitor physical assets and processes in real-time to maintain system
integrity. The rapid growth of OT is increasing the need for a unified view of
information covering business process and control systems. OT will become a
mainstream focus for all organizations.
* Pattern-Based Strategy - This is a new model built around a framework to
proactively seek, model, and adapt to leading indicators, often termed "weak"
signals, that form patterns in the marketplace, and to exploit them for
competitive advantage. A Pattern-Based Strategy will allow an organization to
not only better understand what`s happening now in terms of demand, but also to
detect leading indicators of change, and to identify and quantify risks
emerging from new patterns rather than continuing to focus on lagging indicators
of performance.
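The "weak signal" idea can be made concrete with a toy detector: flag when the recent average of a demand metric runs meaningfully ahead of its long-run baseline, rather than waiting for a lagging performance report. This is an illustrative sketch only, not Gartner's methodology; the function and threshold are made up for the example:

```python
# Toy "weak signal" detector: flag when the recent average of a demand
# metric exceeds its long-run baseline by a chosen multiple.
# Illustrative sketch only -- not Gartner's Pattern-Based Strategy method.

def weak_signal(series, window=3, threshold=1.10):
    """Return True if the mean of the last `window` points exceeds
    the mean of all earlier points by a factor of `threshold`."""
    if len(series) <= window:
        return False
    recent = series[-window:]
    baseline = series[:-window]
    return (sum(recent) / window) > threshold * (sum(baseline) / len(baseline))

# Flat demand, then an early uptick a lagging indicator would miss.
demand = [100, 101, 99, 100, 102, 110, 115, 118]
print(weak_signal(demand))  # True: the recent uptick trips the detector
```

The point of the contrast: a lagging indicator (last quarter's revenue) would report the old baseline, while the detector reacts to the emerging pattern in the most recent points.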
About Gartner Symposium/ITxpo
Gartner Symposium/ITxpo is the industry's largest and most important annual
gathering of CIOs and senior IT executives. This event delivers independent and
objective content with the authority and weight of the world's leading IT
research and advisory organization, and provides access to the latest solutions
from key technology providers. Gartner's annual Symposium/ITxpo events are key
components of attendees' annual planning efforts. They rely on Gartner
Symposium/ITxpo to gain insight into how their organizations can use IT to
address business challenges and improve operational efficiency.
Upcoming dates and locations for Gartner Symposium/ITxpo include:
October 18-22, Orlando, Florida: www.gartner.com/us/symposium
November 2-5, Cannes, France: www.gartner.com/eu/symposium
November 11-13, Tokyo, Japan: www.gartner.com/jp/symposium
November 17-19, Sydney, Australia: www.gartner.com/au/symposium
About Gartner
Gartner, Inc. (NYSE: IT) is the world's leading information technology research
and advisory company. Gartner delivers the technology-related insight necessary
for its clients to make the right decisions, every day. From CIOs and senior IT
leaders in corporations and government agencies, to business leaders in
high-tech and telecom enterprises and professional services firms, to technology
investors, Gartner is the indispensable partner to 60,000 clients in 10,000
distinct organizations. Through the resources of Gartner Research, Gartner
Executive Programs, Gartner Consulting and Gartner Events, Gartner works with
every client to research, analyze and interpret the business of IT within the
context of their individual role. Founded in 1979, Gartner is headquartered in
Stamford, Connecticut, U.S.A., and has 4,000 associates, including 1,200
research analysts and consultants in 80 countries. For more information, visit
www.gartner.com.
Gartner
Christy Pettey, + 1 408 468 8312
christy.pettey@gartner.com
I can still remember the thrill I got when I first started stalking my high school buddies on Facebook back in 2004: “Man, this site is addictive, and I don’t see it going away any time soon.”
As with any growing engine, you have to make sure it doesn’t get too hot too fast, or it will bust. Facebook isn’t at that point yet, but I get nervous when they shell out $50 million to acquire a company that is striving to do the same thing, but do it better. I get even more nervous when that smaller, better site becomes a ghost town, as MG Siegler puts it on Techcrunch.
I’m talking about Friendfeed, a site that has a great user experience for communicating with people and sharing rich content. They do a better job than Facebook when it comes to presentation and interactivity; ask anyone who uses the two regularly. Friendfeed does “real-time” better (I know, I hate that buzz word). Comments come in as they happen… but the problem is, no one is listening.
Everyone is hanging out on Facebook. No one uses Friendfeed anymore.
Remember that Microsoft .NET Framework Assistant add-on that Microsoft sneaked into Firefox without explicit permission from end users?
Well, the code in that add-on has a serious code execution vulnerability that exposes Firefox users to the “browse and you’re owned” attacks that are typically used in drive-by malware downloads.
The flaw was addressed in the MS09-054 bulletin that covered “critical” holes in Microsoft’s Internet Explorer but, as Redmond’s Security Research & Defense team explains, the drive-by download risk extends beyond Microsoft’s browser.
A browse-and-get-owned attack vector exists. All that is needed is for a user to be lured to a malicious website. Triggering this vulnerability involves the use of a malicious XBAP (XAML Browser Application). Please note that while this attack vector matches one of the attack vectors for MS09-061, the underlying vulnerability is different. Here, the affected process is the Windows Presentation Foundation (WPF) hosting process, PresentationHost.exe.
While the vulnerability is in an IE component, there is an attack vector for Firefox users as well. The reason is that .NET Framework 3.5 SP1 installs a “Windows Presentation Foundation” plug-in in Firefox.
Now, Microsoft’s security folks are actually recommending that Firefox users uninstall the buggy add-on:
For Firefox users with .NET Framework 3.5 installed, you may use “Tools”-> “Add-ons” -> “Plugins”, select “Windows Presentation Foundation”, and click “Disable”.
This introduction of vulnerabilities in a competing browser is a colossal embarrassment for Microsoft. At the time of the surreptitious installs, there were prescient warnings from many in the community about the security implications of introducing new code into browsers without the knowledge — and consent — of end users.
[ SEE: Microsoft says Google Chrome Frame doubles IE attack surface ]
This episode also underscores some of the hypocrisy that has risen to the surface in the new browser wars. When Google announced it would introduce a plug-in that runs Google Chrome inside Microsoft’s Internet Explorer, Microsoft whipped out the security card and warned that Google’s move increased IE’s attack surface.
“Given the security issues with plug-ins in general and Google Chrome in particular, Google Chrome Frame running as a plug-in has doubled the attack area for malware and malicious scripts. This is not a risk we would recommend our friends and families take.”
Of course, when it’s Microsoft introducing the security risk to other browsers (Silverlight, anyone?), we should all just grin and take it.
About 10 years ago, health information technology experts were touting speech recognition software as one of the best things to come to health care. About five years ago, the talk was about why adoption of the technology never took off like it was expected.
Physicians' chances to earn incentives from the federal stimulus package have forced many to look at speech recognition as a way of making electronic medical record adoption more palatable to resistant physicians. And because improvements have made speech recognition more reliable and easier to use, some experts and physicians say speech recognition is finally ready to take off.
But will it? Experts say that at least some practices are willing to try it out again.
Megan Hastings, vice president of Health Directions, a Chicago-based consulting firm, said she has seen an increase in practices wanting speech recognition in recent months.
"When you get into a situation where the physician can't type, or [EMR adoption] is severely impacting their clinical work flow, we often see that institutions and practices will purchase [a speech recognition system] to help those physicians struggling to get on board," she said.
It's also helped that the technology has improved greatly from several years ago, Hastings said.
Larry Garber, MD, an internist and medical director for informatics at Fallon Clinic, which has more than 20 locations in Central Massachusetts, tried the technology in 2004. But he quickly discovered that it wasn't going to work. He stopped plans to roll it out to the entire practice.
"It wasn't ready for prime time," Dr. Garber pointed out. "Now it is. No question."
Dr. Garber's practice adopted the technology last year and found that the nagging problems such as speed and accuracy had been resolved. Using the speech recognition software not only has improved quality but also has saved the practice an average of $7,000 per physician per year in transcription fees. That never would have happened with the old technology, he said.
Technology more accurate
Keith Belton, senior director of product marketing for Burlington, Mass.-based Nuance Communications, which developed Dragon Medical, the most widely used speech recognition system in the medical industry, admitted the technology was lacking in previous versions. But he said it now boasts an accuracy rate of 99%.
When Robert Frank, MD, a family physician with Aurora Advanced Healthcare in Milwaukee, first used the technology in 2004, he said many physicians felt it actually slowed them down. They were forced to go back and edit all of their notes because of the lagging accuracy, Dr. Frank said.
Newer versions are not only more accurate but also more robust, their vendors say. Systems now can work in concert with EMRs, for example.
Dr. Garber's internal study of how speech recognition was affecting efficiency at his practice found that EMR adoption alone didn't improve the availability of information.
The majority of physicians still were using traditional dictation services, which had about a four-day turnaround time, not including the time it took to do edits, once the transcripts were returned. The addition of speech recognition reduced that time to an average of 46 minutes.
Physicians now dictate clinical notes, which are transcribed directly into the patient's file, he said. Once the notes are complete, physicians check for accuracy, then approve them, and they are automatically saved to the record. The system also allows physicians to enter text, such as lab results or medicine lists, using a voice command shortcut.
That might make physicians faster at getting clinical notes into the system, said Mike Uretz, executive director of EHR Group, a Seattle-based consulting company. But if they want to qualify for incentive money for adopting EMRs, voice recognition used in conjunction with an EMR could be an impediment.
"The problem is when you talk into it, the data is not discrete ... it's still like a Word document," he said.
To meet the stimulus package's "meaningful use" requirements on EMRs, physicians will have to do quality reporting, which requires discrete data, Uretz said. Data are "discrete" when they can be categorized into specific fields.
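The free-text vs. discrete distinction is easy to see in a small sketch. The field names below are hypothetical, not taken from any actual EMR product:

```python
# The same clinical fact captured two ways. Field names are hypothetical.

# Dictated free text: readable, but a reporting system cannot easily
# query it for, say, all patients with blood pressure above a cutoff.
narrative = "Patient's blood pressure today was 142 over 91; discussed diet."

# Discrete data: each value lives in a named field, so quality reports
# can filter and aggregate it directly.
observation = {
    "vital": "blood_pressure",
    "systolic_mmHg": 142,
    "diastolic_mmHg": 91,
    "note": "discussed diet",
}

# A quality report over discrete data is a simple filter:
patients = [observation]
hypertensive = [p for p in patients if p["systolic_mmHg"] >= 140]
print(len(hypertensive))  # 1
```

This is why dictation that lands in the record "like a Word document" can satisfy the chart but not the quality-reporting requirement.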
Belton said it's likely that future versions of voice recognition systems that work with EMRs will have this functionality.
Dolan is a business reporter. She can be reached at 312-464-5412 or by e-mail (pamela.dolan@ama-assn.org).
From staff reports
National College’s Danville campus is offering two free seminars on Thursday and Friday for residents interested in information technology careers, according to a news release.
“The information technology sector continues to see increasing demand,” Danville campus director Mark Evans said via the release. “Computer and networking technology touches every industry and the need for trained professionals to manage information systems isn’t going to disappear any time soon.”
As a Microsoft I.T. Academy, National College provides students with advanced instruction and access to the latest software, the release stated. The Danville Campus offers both an associate’s degree program in information systems engineering and a diploma in desktop support.
Tom Jackson, director of information technology programs for National College’s 25-campus system, will serve as the guest speaker.
Attend the seminars from 5:30 to 6:30 p.m. on Thursday and 10 to 11 a.m. on Friday. Contact the campus at (434) 793-6822 to reserve a spot.
Gearing up for the Windows 7 launch, Toshiba on Wednesday has announced its new notebook and netbook lineup, including new touchscreen models.
Touchscreen capability comes on the new 13.3-in. Satellite U505 (pictured, above), which weighs 5 lbs., retails for $1,049 and will arrive on Nov. 1. Similarly, the 14.4-in. Satellite M505 carries a price tag of $949 and a date of Oct. 22.
Both are loaded with the LifeSpace software package that includes Bulletin Board, an organizational tool, and ReelTime, a visual search aid.
Also announced were the following, with quick summaries and links:
* Satellite A500: 16-inch HD Edge-to-Edge display on select models, Intel Core 2 Duo or AMD Turion II Ultra processor, discrete graphics options, optional Blu-ray, starts $589.99.
* Satellite L500: 14″ to 17.3″ displays, AMD Turion II and Athlon II processors, up to 500GB HDD, starting from $504.99 to $579.99.
* Satellite P500: 18.4-in. HD TruBrite display, Intel Core 2 Duo or AMD Turion II processors, Blu-ray in some models, starts $799.99.
* Qosmio X505: 18.4-in. display, optional 64GB SSD/320GB HDD dual-drive configuration, $1,899.99.
* mini NB205: 10.1-in. netbook, nine-hour battery life, up to 250GB HDD, five colors (brown, white, blue, pink, black), starts $399.99.
Finally, Toshiba also announced sleep-and-resume functionality for the recently announced Satellite T100 ultrathin laptop.
The kind folks at Motorola allowed me to spend the last week walking around New York City with their new Cliq smartphone, and having spent that time getting to know the device a bit better, I’m comfortable discussing what I like about it and what I don’t.
First, a refresher: The Motorola Cliq smartphone is Motorola’s first modern entry into the hot smartphone space and will land on T-Mobile on Nov. 2 for $199 with a two-year contract. The phone is a touchscreen messaging phone, and has a slide-out QWERTY keyboard with D-Pad in addition to a 3.1-inch (320×480) display.
It runs on the Google Android platform, but has the company’s in-house layer of software and services, called Motoblur, integrated with it.
The phone is important in several ways. First, it is Motorola’s first major play in the smartphone space, a critical product for a company that has struggled for a hit since the runaway success of the Razr.
Second, it is the latest Android-based smartphone in a growing army of them from HTC, Samsung and others. While HTC’s G1 (Dream) and myTouch 3G (Magic) were the early birds in the Android game, the Cliq — along with the HTC Hero, the first Android phone on Sprint, and the Samsung Behold II — is among the first models to incorporate proprietary software layered on top of the vanilla Android installation.
Here’s a short video of the device, showing its dimensions, transitions and card-style widgets in action:
With that said, here are eight things I’ve come to like about the Motorola Cliq and seven things I don’t.
What I like:
1. Widgets. While Motoblur is part skin, part services and part software, the widget architecture is the most noticeably distinctive aspect of the Cliq. Whether you like it or not, our world revolves around communication, and any advances in simplifying this are welcome. Motoblur improves on the Android experience by offering widgets that can surface messages (e-mail, Facebook, Twitter, etc.) or headlines (RSS, etc.) The Cliq has a five-screen “desktop,” offering space for these units, which take up more room than traditional icons. They update on the fly, and offer a quick way to browse content without forcing you to dive all the way in.
2. Unifying identities. Palm was the first to unify identities with its webOS, but Motoblur keeps pace by doing the same thing with all of your services. While the Cliq doesn’t carry conversations across protocols in one spot like the Pre, it does combine things in sensible ways by allowing you to search for people based on knowledge known from other services. One simple example: when I receive a call from a contact, it displays their Facebook profile picture as well as their latest status update alongside their phone number and the “answer” button.
3. Simplified button schemes. As I mentioned previously, Motorola made the “answer” and “hang up” buttons virtual, leaving just three physical buttons (menu, home, back) on the main control area. Besides the fact that it’s a much nicer design experience, it also saves space.
4. T-Mobile. If you don’t have T-Mobile service in your area, this point is moot, but there’s something to be said for using an underdog carrier in a major metropolitan area. I rarely had issues with speed in terms of downloading web pages the way folks with iPhones on AT&T do here in New York.
5. Versatility. The Cliq offers a virtual keyboard as well as a competent physical one with wide keys. In a previous post, I said this was a power user’s dream. That turned out to be mostly true — sometimes I found myself typing short queries using the virtual keyboard, and sometimes it was easier to use the slide-out version, such as during extended instant messaging. Also: some early reviews questioned the use of a D-Pad, but after spending some time with the Cliq, I found that it was much easier to fix a typo or manipulate text with it than using your finger, like on the iPhone.
6. Development. Android is still in its application infancy compared to the iPhone, but considering the number of handsets hitting the market, it won’t be long before all the essential apps (news, sports, finance, major social media services) are covered. I was able to find Last.fm (owned by this site’s parent company, CBS), Facebook and geosociolocation app Foursquare rather easily, and the widgets natively handle a number of other services (Twitter, MS Exchange, MySpace, Google, Picasa, Photobucket, Yahoo!).
7. Build quality. While the smooth matte finish on some of the plastics felt a little, well, plasticky, the expensive gunmetal finish on the phone’s metal body exuded quality. The sliding mechanism was a little less resistive than the one on the G1, but quite smooth, and I found all of the outside buttons to be placed in logical places (the tiny indicators for the buttons beneath the slider were a nice touch).
8. Business readiness. The Cliq comes preloaded with Quickoffice, meaning I can view Word, Excel and PowerPoint files on the device. Combine that with the QWERTY keyboard and D-Pad, and you’ve got a device that’s a BlackBerry-killer in terms of usability. (Security/BES, another story.)
What I didn’t like:
1. Sluggishness by hardware. This isn’t inherently the Cliq’s fault, but it’s exacerbated by the widget-heavy Motoblur layer. In my experience with Android phones, I’ve found that they’ve all been just a bit hesitant in terms of how quickly they react to my touch. (I’ve found the iPhone experience to be better overall.) This doesn’t happen all the time; rather, it happens intermittently, which can be frustrating. The reason? Hardware. Anand Shimpi explained in detail yesterday why the 528MHz Qualcomm processor in all of the most recent Android phones is the weak link. The user can help the situation by turning off more services and widgets and things, but then why use a smartphone if you’ve disabled its intelligence?
2. Currency. The widgets were indeed nice, but sometimes they didn’t update frequently enough to be useful. Examples: Twitter, RSS. I found that my e-mail and other essentials updated instantaneously, but on occasion I found that the widgets didn’t surface recent blog posts. Several times, the widgets reflected content that was hours old — years in Twitter time, and an impossibility, given how many people I follow on Twitter alone — and, to my knowledge, there was no way to manually refresh the widget. Side note: when you boot the phone, all the apps try to update at once, and there’s no way of halting this train. It’s unnerving.
3. Upgradeability. That extra layer of services presents a problem for Motorola, which, in addition to providing cloud services and resources for the effort, is also the gatekeeper to progress on Android development. For example, even though Android 1.6 Donut rolled out to existing Android handsets on the market (G1, myTouch), the Cliq will hit shelves with 1.5 Cupcake. Why? Because the Motoblur architecture is hooked into the Android platform, so Motorola can’t update to 1.6 without also updating its Motoblur software. For sure, some Motoblur features may become standard Android features over time. But that puts the onus of development on Motorola — particularly the pressure to keep pace with both open source Android development and its own, and not branch off.
4. Choice. I mentioned versatility as something I liked about the Cliq, and that’s true. But people have preferences — personally, I prefer a lighter, slimmer, full touchscreen phone with no physical keyboard at all (like the HTC Hero, but that’s on Sprint). So there needs to be more choice among form factors for Motoblur. Motorola told me it was preparing another Android phone for launch before 2010, but one without Motoblur. It seems to me that it’s a no-brainer to provide a touchscreen-only Motoblur phone as a foil to the Cliq.
5. Screen size. It seemed a little silly to me that the one surface with the screen on it actually had the least amount of area compared to any other flat surface on the device. Why is this? The Cliq device itself is much smaller (width and height) than the established iPhone (screen: 3.5 in.). The Cliq’s smaller screen means I have less room for those big widgets on my home screen.
6. Media player. Motorola says it didn’t Motoblur-ify the default, underwhelming Google Android media player. That’s a shame, because if we’re really moving toward converged devices, I shouldn’t have to carry around my iPod touch, too, just to get a complete multimedia experience.
7. Android Market. My main beefs with the Android Market are that it’s very hard to surface apps and you can’t tell which apps are legitimate. For example, there’s a New York Times app in the market’s Top 10 apps, but it’s not developed by the New York Times Company. The Android Market is still a burgeoning movement, and it was a little ironic that some of the apps for the services promoted on the Cliq’s box weren’t preloaded (Facebook, Last.fm, etc.).
Final thoughts: Motoblur’s a great platform, but the hardware is visibly taxed by it. For T-Mobile users, this is the phone to get. Period. It’s much better than a G1 and a myTouch 3G. For others, it’s a harder sell, and depends on your local carrier situation.
The device is a toss-up with the HTC Hero on Sprint — if you prefer physical keyboards, this is your device. It all depends on your hardware preferences.
That’s not all there is to say about the Cliq, but that gets to the heart of the experience.
Motorola Cliq: deal or no deal?
The Linux desktop experience is now closer to the Windows environment than before, but the gap in mainstream adoption for the open source OS will not close anytime soon, says an industry analyst.
Laurent Lachal, U.K.-based senior analyst at IT advisory firm Ovum, said inconsistencies across Linux distributions still stand in the way of wider user uptake.
"For one, Linux has two main GUIs (graphical user interfaces), KDE and Gnome. Some see that as choice, but overall it confuses the market," Lachal told ZDNet Asia in a phone interview. He added that each GUI is further tweaked for different distributions, further compounding the disparity.
Different distributions also have different ways of allowing users to perform tasks, such as terminal commands.
Some distributions also try to mimic Windows as closely as possible in order to entice Windows users to migrate, but this has often resulted in an experience that is only "good enough" for "basic" enterprise tasks.
Lachal said: "Usability is not a problem with Linux, but the issue lies with application support."
John Brand, Hydrasight's research director, said in an e-mail interview such support issues have plagued Linux, and still do. "The majority of organizations still find application incompatibility and lifecycle management an issue for Linux-based desktops," he said.
Linux can be suitable for "light use" by some members of a company, but this mix-and-match approach where both Windows and Linux platforms are deployed is not typically considered cost-effective, Brand explained.
And while Linux desktop projects may rate well with users during pilot deployments, the "complexities of having a mixed environment generally dilutes any benefits Linux may otherwise provide", he added.
He highlighted Microsoft's integration with its other office products that increases the reliance on the Windows OS. The most significant example of this has been Microsoft SharePoint, he said.
"We see that [SharePoint] adoption has become widespread and often entrenched as a core part of the enterprise IT infrastructure," Brand said. Competing software such as open source document management product Alfresco, has not yet managed to appeal to users to a similar degree, he noted.
Furthermore, device support is still an issue, he said. "Without the commercial drivers for open source, market momentum is variable at best," he added.
However, Lachal disagreed. He said lack of driver support "is still an issue, but overall it has been solved".
Greg Kroah-Hartman, Novell programmer and Linux Driver Project lead, noted in an earlier interview with ZDNet Asia, that the "problem" of device makers resisting the Linux community is not an issue. He said the coders at the Linux Driver Project were getting requests to make Linux-compatible drivers for hardware "all the time", suggesting growing adoption of Linux OSes among enterprises.
The netbook example
The biggest gap Linux needs to close is the maturity of its channel, said Ovum's Lachal, adding that the platform lacks vendor support and market visibility.
Although Linux had a head start in the netbook game--with Asus supporting the open source platform--Microsoft eventually took the lead because "the market was not ready".
Lachal said: "Sales people were not trained and did not understand [Linux] because the sales channels were not experienced. Thus, they could not sell [Linux-based] netbooks properly and customers were unhappy."
Furthermore, Microsoft's decision to extend Windows XP's lifespan for the netbook market was sufficient to sway users back into the familiar Windows camp, indicating that consumers tend to prefer what they are most familiar with, he said.
"The netbook example displays the level of inertia that Linux has to fight," he noted, adding that because of this inertia, Linux will remain a "minority" OS for another five years.
"It will be used more extensively in the enterprise but will not dramatically challenge Microsoft or Apple in the consumer space," he said.
Cloud computing, a term used by many Washington information-technology companies these days, may seem simple enough to those of us who use Yahoo e-mail accounts, store pictures on Flickr and upload videos to YouTube.
The basic premise: store your information, photos, documents — pretty much anything — on some company’s remote servers — or in the “cloud” — instead of directly on your computer’s hard drive. That company stores and maintains it for you, and you can retrieve it from any computer as long as you have an Internet connection.
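That premise can be sketched as a toy key-value store: the provider keeps one authoritative copy, and any connected client can fetch it. This is a conceptual illustration only; `CloudStore`, `put`, and `get` are made-up names, not any vendor's API:

```python
# Toy model of the cloud-storage premise: the provider holds one
# authoritative copy, and any connected client can retrieve it.
# Conceptual sketch only -- not any actual vendor's API.

class CloudStore:
    """Stands in for a provider's remote storage service."""
    def __init__(self):
        self._objects = {}

    def put(self, user, key, data):
        self._objects[(user, key)] = data

    def get(self, user, key):
        return self._objects.get((user, key))

cloud = CloudStore()

# Upload from a work PC...
cloud.put("alice", "photos/vacation.jpg", b"\x89PNG...")

# ...and the same bytes are available from any other connected machine.
print(cloud.get("alice", "photos/vacation.jpg") is not None)  # True
```

The trade-off the rest of the article explores follows directly from this shape: the user's data survives a lost laptop, but its safety now depends entirely on the provider's servers.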
It definitely holds promise. Federal technologists, including Chief Information Officer Vivek Kundra in the White House, hope it will save money for agencies and be a more efficient way for employees to share information and work remotely. And if employees’ information is no longer stored on physical hard drives, they won’t lose data even if they lose their laptops.
The White House recently launched apps.gov, a site that lets chief information officers all over the government shop for pre-approved software programs that make use of “the cloud.” Officials can download software from the site, upload their organization’s information, and they’re off and running.
The selection of applications on the site is still relatively limited as the government vets new ones.
But what is still hazy is how that information is stored and kept safe from security and privacy breaches, how to ensure information can be deleted from remote servers without leaving traces of sensitive information and who is going to provide the services. Firms like Google, Amazon, Microsoft and Salesforce have been clamoring to develop cloud-computing products that are robust enough for government use. And firms that have traditionally provided IT services to the government — IBM, Booz Allen Hamilton and CSC, for example — are trying to defend their turf by building their own expertise in the area.
And the government hasn’t even figured out how to define “cloud computing.” The National Institute of Standards and Technology, which advises agencies on technology use, last week released its 15th version of a working definition. It’s two pages long and begins with a caveat: “Cloud computing is still an evolving paradigm. Its definitions, use cases, underlying technologies, issues, risks and benefits will be refined in a spirited debate by the public and private sectors.”
If you’ve been reading the Hillicon Valley blog, you’re aware that the private sector has already stumbled on this front. Last week, T-Mobile informed some of its customers that a Microsoft subsidiary called Danger lost e-mails, photos and contact lists after its servers crashed. If similar episodes continue to occur, the government will probably be less likely to entrust classified, sensitive information to third-party servers until security and reliability improvements are made.
California Congress members cheer on young homebuilders
In the Solar Decathlon going on right now on the National Mall, the team from Silicon Valley, Team California, is in the lead.
Rep. Mike Honda (D-Calif.) met with the team Wednesday morning at the Capitol to encourage its members in the remainder of the competition, which ends Friday morning.
“We understand, more or less, how important your project is,” he told the team, made up of students from Santa Clara University and California College of the Arts. “A lot of times I wish I had one of those homes. For a single guy living alone, that’s all I need for entertaining.”
The team’s project — Refract House — won first place in the architecture, hot water and communications categories. The house, which uses green materials and energy-saving technologies, will next be judged on its net power production, although the team from Germany is expected to have a leg up in that round. The U.S. Department of Energy puts on the annual competition.
Rep. Zoe Lofgren (D-Calif.), who attended Santa Clara University’s law school, stepped in to say a few words to the 20 or so students on the team.
“We know we cannot continue our dependence on foreign oil,” she said. “Climate change is real and what you’re doing is part of the solution.”
Ken Reidy, a staffer for Rep. Russ Carnahan (D-Mo.), is also an alumnus of Santa Clara University and swung by the short event to show his support. Carnahan co-chairs the High Performance Buildings Caucus and has taken an interest in sustainable, eco-friendly homes.
Something is really odd here.
As a reporter covering Facebook, I do get the occasional cranky complaints from members who, for one reason or another, are experiencing errors when they try to access their accounts. But it's never been anything like the past week, with a steady stream of e-mails continuing to come in from Facebook members who say they remain shut out of their accounts--despite assurance from Facebook that profiles have not been deleted and that the company is working on the problem.
"This is now seven days and counting," an e-mail sent on Saturday morning read. "It's beyond ridiculous and extremely frustrating."
"The experience completely reversed the Facebook opinion and experience for me," one reader complained. "I see many people bitch and complain, many more beg and a few threaten. To me, the route to take is fairly obvious. Mark Zuckerberg on his own page invites democratic input from Facebook users in one of his most recent videos. Given that statement especially, I find the way their user base is being treated with respect to their disabled account policy hypocritical at best."
"My account has now been held hostage for a week," another reader wrote. "Some of my friends think that I have deleted (my profile) or even blocked them...None of my friends or family can see my profile or even find it in search. It's as if I simply deleted my account or blocked all of them from seeing it without even a word."
Some users have started threads on Get Satisfaction and Yahoo Answers. A few others have pointed me to blogs and YouTube channels devoted to the subject.
The inaccessible accounts appear to be limited to a very small subset of Facebook's over 300 million active users, which means that it's not a large-scale issue for the health of the site. And Facebook is supported by neither subscription money nor taxpayer dollars (though it wouldn't have advertising revenue without its users), so there's an argument to be made that users shouldn't be complaining about something they don't pay for. But that's an argument that many of the people who have come to rely on Facebook as a channel of communication simply don't buy.
Whether the string of complaints is warranted or not, Facebook hasn't disclosed exactly what's caused the "extended maintenance issue," and that's what I find puzzling.
If Windows 7 has a killer feature, it’s search. As I demonstrate in this week’s screencast, you can find search boxes throughout Windows 7—on the Start menu, in Control Panel, and in Windows Explorer. The indexed search is fast and accurate, in my experience, and the indexing process itself is barely noticeable in terms of performance. The best change, though, is the addition of the Search Builder, which replaces the clunky search forms from earlier versions and allows you to filter a results set by date, type, size, or an attribute that’s appropriate to a particular type of data such as music or photos.
Every time I write about search, at least a half-dozen commenters show up in the Talkback section to proclaim that it’s unnecessary if you know how to organize your files into subfolders. But they miss the point completely. A well-managed filing system and a fast search index work together beautifully. As an author, for example, how should I keep my files organized? Should I have every document related to a single project in its own subfolder? Or should I keep contracts in one folder, proposals and outlines in another, drafts in yet another, and finished chapters elsewhere? And even if I’ve done a perfect job of naming and organizing those files, how do I find the contract that had the clause about foreign publication rights that I need to discuss with my agent in five minutes? A good search tool can track that file down in seconds. Without it, I’d have to find every contract in every folder and open each one to see what’s inside.
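The contract-hunting scenario above is exactly what an indexed search solves: scan the files once, then answer queries instantly. As a rough illustration only (this is not how Windows 7's indexer actually works), a toy inverted index in Python shows why lookup is fast once the index is built — each query reduces to a set intersection:

```python
import os
import re
from collections import defaultdict

def build_index(root):
    """Walk a folder tree once and map each word to the files containing it."""
    index = defaultdict(set)
    for dirpath, _, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            try:
                with open(path, encoding="utf-8", errors="ignore") as f:
                    for word in re.findall(r"[a-z]+", f.read().lower()):
                        index[word].add(path)
            except OSError:
                continue  # skip unreadable files
    return index

def search(index, query):
    """Return files containing every word in the query (AND search)."""
    words = re.findall(r"[a-z]+", query.lower())
    if not words:
        return set()
    result = index[words[0]].copy()
    for w in words[1:]:
        result &= index[w]
    return result
```

With an index like this, finding the one contract mentioning "foreign publication" takes a set lookup rather than opening every contract in every folder.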
This is the third of four Windows 7 demos I’ve done in this series. Look for the final screencast in the series next week at this time.
Medical Website Design Company Aurora Information Technology Discusses the Importance of Personal Branding in the Healthcare Industry
GARRISON, NY--(Marketwire - October 13, 2009) - Aurora Information Technology, Inc., a medical website design and medical marketing company based in Garrison, NY, believes in the concept of personal branding, especially when it comes to healthcare. In this age of healthcare consumerism, patients are increasingly taking control of their choices in doctors and treatments. There is a huge demand for healthcare providers, but the supply is equally large. Patients can afford to be selective, because it is no longer a case of a big fish in a little pond. Healthcare has become a big pond with an abundance of big fish.
In a recent BusinessWeek article ("Authentic personal branding starts with the real you" by Marshall Goldsmith, September 29, 2009), Dr. Hubert Rampersad, an authority on authentic personal branding, stated that personal branding should be done in an "organic, authentic, and holistic way," to make yourself "strong, distinctive, relevant, meaningful, and memorable."
So what can doctors, healthcare providers and hospitals do to brand themselves? Aurora provides these simple tips to doctors and healthcare providers to incorporate into their website and Internet marketing to promote themselves as personal brands.
Be the expert
Devote a section of your website to your successes and press mentions. Get the word out about how long you have been practicing in your particular field and your experience with certain procedures. Numbers speak volumes to patients. On your website, promote the awards, certifications or distinctions you've earned.
Trust me
Experience means everything to patients, so make sure your website clearly lists your training, board certification, experience, professional memberships in respected associations, safety records and hospital affiliations.
Manage yourself
What you don't know can definitely hurt you. With social media platforms, you can manage what patients are saying about you, and be able to respond accordingly. This can be done with a well-written, informative blog, Twitter, Facebook, LinkedIn and more. Link these directly from your website.
Optimization is optimal
A high search engine optimization (SEO) ranking is extremely important and helps build trust. Since patients start with word of mouth or the Internet, a top ranking means they will find you and believe in you. Your website or blog must contain the appropriate keywords, reviewed and updated regularly for freshness, credibility and visibility.
Get out there
Find events and venues where you can market yourself. We are talking medical conventions or panel discussions, or even keynote speaker opportunities. Media appearances are hard to come by, but if you have the right representation, you can garner a prime spot to promote yourself. Incorporate video feeds on your website from these events and appearances, or upload them to YouTube.com for added exposure.
Those who know, teach
Imparting your medical knowledge can help educate future doctors and providers. And your own learning continues as well, because teaching increases your skill set and helps keep you current with what's new in the medical field.
About Aurora Information Technology, Inc.
Aurora Information Technology is a New York-based medical website design, content management, public relations, and healthcare marketing firm. A majority of Aurora IT's healthcare clients have experienced patient caseload growth of 20% or more, with some as high as 50%, after the creation and maintenance of an Aurora website. For more information about Aurora's medical logo, website design, and healthcare marketing services, call 1-914-591-7236, or visit them online at www.aurora-it.us.
Francis Marion University’s Information Technology Team was recently awarded the 2009 Innovation in Technology Award by the South Carolina Information Technology Directors Association (SCITDA).
Howard Brown, director of user services; Matthew Cantrell, director of network operations and services; Teresa McDuffie, network specialist; Robin Moore, director of campus applications and data services; and Kevin Torgersen, network administrator were presented the award during SCITDA’s Annual Conference in Columbia.
The Team award is designed to recognize excellence in collaborative and innovative technology-based projects. From e-mail to server enhancements to partnering with Google to provide students with FMU-branded Google Applications, the IT team has worked to establish a technology environment that challenges innovation, growth and service to its computing community while laying a foundation that can support emerging technologies and encourage pursuit of the same, said John Dixon, FMU’s chief information officer.
· Brown began his career at FMU in 1998, immediately after earning a bachelor’s degree from FMU.
· Cantrell joined the FMU staff in 2007. He earned both his undergraduate and graduate degrees from Clemson University. He completed his undergraduate work in architectural design in 1999 and his graduate work in computer science in 2002.
· McDuffie joined the FMU staff in 1998. She is a 1988 graduate of South Carolina State University, where she earned a bachelor’s degree in electrical engineering technology.
· Moore began working at FMU in 1987 and is a 1981 graduate of Francis Marion University.
· Torgersen came to FMU in 2007. He earned an associate’s degree in computer science from Spartanburg Methodist College and a bachelor’s degree in information management and systems from the University of South Carolina Upstate in 2005.
Francis Marion University, founded in 1970, is one of South Carolina’s 13 state-supported universities. As one of the state’s six comprehensive institutions, FMU prides itself on providing a strong liberal arts education.
The university enrolls nearly 4,200 students and offers a broad range of undergraduate degrees and a select number of graduate programs serving the needs of communities, businesses and industries of the Pee Dee. Francis Marion is the only state university serving the Pee Dee, and many of its students are the first in their families to attend college.
SCITDA, originally formed in 1978 as the South Carolina Association of Data Processing Directors (SCADPD), serves as a forum for the exchange of information pertinent to the management of State IT facilities while also providing a consolidation of experience, knowledge and interest in improvement of IT administration and management. Since its inception, the Association has grown from a handful of people to an organization that includes most State agencies, colleges and universities.
It wasn't long ago that universities were considered the poor cousins of big corporations when it came to information technology. Students learned how to program, got jobs at big companies and breathed a sigh of relief at just how quickly and efficiently technology was modernized and upgraded.
Fast forward a few years. Data center managers are now struggling to clean up layers of old servers and applications, and become more responsive to their business units. Universities, meanwhile, have rekindled their experimentation with technology, using it in ways that would likely make their corporate counterparts shudder.
Forbes caught up with Gerry McCartney, CIO at Purdue University, to talk about the changes under way in academia.
Forbes: What does the CIO job involve at a university?
Gerry McCartney: In theory, it's anything that's a production service. We support technology in the classroom, run administrative systems--administrative support and payroll--and we have all the teaching systems. We also support research computing.
Do universities make use of cloud computing like many companies?
Absolutely. We've been making very aggressive use of cycle sweeping, which is an earlier form of cloud computing, and grid computing. We have a virtual grid of 28,000 CPUs (central processing units), which draws waste cycles from five other institutions as well as Purdue. A repackaged version of that is what we think of as commercial cloud computing. With that you're not using specific machines. You're using a service that makes those cycles available to you.
How do you allocate that compute power?
If you think about payroll, you may run that every week or two, but you don't need to run it all the time. You can predict your demand and model that. What researchers want is as much capacity as possible. They'll consume whatever is available, so our research machines run at 95% capacity all the time. There is such demand that we start processing research jobs even as we're building new machines. But with research computing there's less demand to do it today. Payroll is a time-critical service. Research is not. So if I can't offer cycles on Friday, I might be able to offer them on Sunday, and for most jobs that's acceptable.
That's way above the highest utilization inside corporations.
If they have research departments, it's probably that high. But there's a lot of technology that's just sitting around waiting inside companies. What we do is stack up the demand in a queue and have them manage that queue. We tell researchers to buy the compute nodes, and we pay for everything else--the inter-networking fabric, storage and backup--but when they're not using their nodes, we want to be able to use them. If you bought 16 nodes on a big machine, you get to use them anytime. But you also can grab 32 nodes on a neighbor's machine. And when you're not using your 16 nodes, they're available to others.
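The node-sharing arrangement McCartney describes — you own your nodes and get them anytime, but idle nodes serve everyone else's queued jobs — can be sketched as a toy scheduler. This is an illustrative model only; the class, policy details and names below are assumptions, not Purdue's actual system:

```python
from collections import deque

class SharedCluster:
    """Toy model of owner-first node sharing: owners are served from their
    own idle nodes first, guests take whatever is idle, and unmet requests
    wait in a FIFO queue drained as nodes free up."""

    def __init__(self):
        self.nodes = {}    # node_id -> owner
        self.running = {}  # node_id -> current user, or None if idle
        self.queue = deque()  # waiting (user, n_nodes) requests

    def add_nodes(self, owner, node_ids):
        for n in node_ids:
            self.nodes[n] = owner
            self.running[n] = None

    def request(self, user, n):
        """Grant n nodes (own idle nodes first), or queue the request."""
        idle_own = [i for i, o in self.nodes.items()
                    if o == user and self.running[i] is None]
        idle_any = [i for i in self.nodes
                    if self.running[i] is None and i not in idle_own]
        grant = (idle_own + idle_any)[:n]
        if len(grant) < n:
            self.queue.append((user, n))
            return []
        for i in grant:
            self.running[i] = user
        return grant

    def release(self, node_ids):
        """Free nodes, then serve queued requests in FIFO order."""
        for i in node_ids:
            self.running[i] = None
        while self.queue:
            user, n = self.queue[0]
            idle = [i for i, r in self.running.items() if r is None]
            if len(idle) < n:
                break  # head request can't fit yet; keep FIFO order
            self.queue.popleft()
            for i in idle[:n]:
                self.running[i] = user
```

The key property matches the interview: a researcher who bought 16 nodes can always reclaim them, while a neighbor's idle nodes are fair game for queued work, which is how utilization stays near capacity.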
Does this work on a global basis for companies, as well?
It can. The problem is if you're moving around large blocks of data. That can cause latency problems. If you have to move terabytes of data to Hong Kong, it can take hours and hours to get it there. Or worse, you send a couple gigabytes, and when you're done you have terabytes of output. That's hard to get back. The compute capacity is still somewhat ahead of the networking capacity. You still want storage near your compute environment.
Are there other technologies in use in universities that might apply in the corporate world?
At the retail end, we don't understand how to use social-networking technologies like Facebook, MySpace or Wikipedia in a productive way. They're still toys, and corporations have danced around the edges. But a lot of their employees are using these things, whether they like it or not. We've been experimenting with ways to take advantage of these new technologies and engage students in a whole new way of learning. The students are actually driving us in that direction.
It's a different baseline, right?
Yes. Try explaining to a student today what a phone booth is. They don't get it. We're going to see the same in education soon. The old one-to-many model, which is the traditional educational or broadcasting model of a talking head with no ability to interact, is being transformed. The best analogy is television news. You've still got the talking head in the middle of the screen, but you've got all this other action on the screen. You've got scroll bars on the side, you've got a ticker on the bottom. People can watch three or four different things simultaneously. The talking head is delivering information at one speed, and the banners and scrolling information at the bottom are being delivered at a different speed. Your brain can actually absorb that quite easily once you get used to it.
How does that apply in education?
I can be listening to a professor and at the same time have the equivalent of a Twitter line open to the teaching assistant where I can ask him a question about what the professor is talking about. The teaching assistant can answer the question in real time. The teaching assistants are being used to supplement education, and the faculty is watching these background conversations to see what's going on. If a whole group of students is asking a question about what is an internal rate of return, for example, the professor can stop and address that and then move on. We also have an algorithmic filter that removes all the noise on Twitter like, "Me, too" or, "I agree with that."
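That back-channel noise filter could be as simple as matching low-content phrases. The pattern list below is hypothetical — the article doesn't describe Purdue's actual rules — and just illustrates the idea:

```python
import re

# Hypothetical phrase list; the real filter's rules aren't published.
NOISE_PATTERNS = [
    r"me,? too!?",
    r"\+1",
    r"(i )?agree( with that)?!?",
    r"(yes|yeah|ok|okay|lol)!*",
]

def is_noise(message):
    """True if the whole message is just a low-content acknowledgment."""
    text = message.strip().lower()
    return any(re.fullmatch(p, text) for p in NOISE_PATTERNS)

def filter_backchannel(messages):
    """Keep only substantive messages for the teaching assistant to answer."""
    return [m for m in messages if not is_noise(m)]
```

Anything that survives the filter — actual questions like "What is an internal rate of return?" — reaches the teaching assistant and the professor's dashboard.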
Are grades going up because of this?
We're not sure yet, but the old idea of, "Look left, look right and one of you won't be here next year," is an incredibly wasteful process at the institutional level and at the individual level. No one is thanking us if their kid lasts two years and then drops out. This allows us to say, "If we admit you and you're willing to work, you're going to graduate with a degree in a timely way." We're not going to try to make it hard for you. We're going to keep you focused and on task. We also use technology that allows us to detect in the first two weeks, based on the way you interact with online course material, whether you're at risk or not. We run intervention. The professor can say, "Come visit me after class, go visit the teaching assistant, or do this assignment over again." Students love this stuff.
When did you begin using this technology?
Just about two years ago. We have 8,000 students using it this semester. So we can't tell the ongoing impact yet. But we're watching it very closely.
India Department of Post Selects Accenture to Design and Develop New Information Technology Architecture and System
The Department of Post (DoP) has awarded a 45-month information technology (IT) modernization contract to Accenture (NYSE: ACN) to design a new enterprise IT architecture and migrate the DoP to a more efficient, reliable and user-friendly IT system.
Accenture also will advise DoP on the development of a wide-area network environment that helps connect all post offices, on which various online services can run, and will study the feasibility of implementing an enterprise solution for the department's core banking and advanced financial services.
"The technology enablers will help DoP transform itself by increasing operational performance and achieving efficiencies through 'last mile' connectivity," said Krishna G.V. Giri, who leads Accenture's Management Consulting practice within its Health & Public Service operating group in the Asia Pacific region. "Armed with greater speed, efficiency and flexibility at DoP, the government will be much better positioned to share various social schemes, such as the Mahatma National Rural Employment Guarantee Scheme, with even the most remote citizens."
Accenture's work began in September with a business process re-engineering (BPR) exercise across key departments and core operations, such as mail operations, banking and advanced financial systems. Following the BPR exercise, Accenture will help select and monitor a vendor to enable the DoP to consolidate its technology infrastructure and applications.
The project is designed to help the DoP drive greater revenue and regain market share in different services and products, including bill payment, e-posts, life insurance, money transfer and banking.
According to Giri, DoP expects that the technology upgrade also will benefit citizens via speedier and more efficient banking and insurance services, track-and-trace abilities, and retail services. In addition, DoP will be able to compete effectively against local and international courier companies and increase revenue in the mail and logistics business.
About Accenture
Accenture is a global management consulting, technology services and outsourcing company. Combining unparalleled experience, comprehensive capabilities across all industries and business functions, and extensive research on the world's most successful companies, Accenture collaborates with clients to help them become high-performance businesses and governments. With approximately 177,000 people serving clients in more than 120 countries, the company generated net revenues of US$21.58 billion for the fiscal year ended Aug. 31, 2009. Its home page is www.accenture.com.
Accenture
Puja Gupta, +91 98 1888 3851
puja.x.gupta@accenture.com
or
Lisa Meyer, +1 703-947-3846
lisa.m.meyer@accenture.com
NEW YORK - Accenture PLC, a consulting and outsourcing firm, said Monday that it received a 45-month information technology modernization contract from India's Department of Post.
Financial terms were not disclosed.
Under the contract, Accenture will design a new enterprise IT architecture and move the Department of Post to a more efficient and reliable information technology system.
Accenture also will advise the Department of Post on the development of a network to help connect all post offices and study the feasibility of putting in place a system for the department's core banking and financial services.
Accenture shares rose 1 cent to $39.08 in morning trading.
To give voters a closer watch on the election process, the state government launched a new service by integrating the election database with the recently started SMS-gateway service.
Voters can access information about their names on the voter list and their polling booth by sending their voter ID card number to 56300 from their mobile phones, using the SMS service. “In reply, they will be sent the desired information about name and number of the constituency, the polling station number, voter serial number and the polling station address,” said Principal Secretary, Information Technology, B K Aggarwal.
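The lookup behind that SMS reply is straightforward. Here is a minimal sketch, with a made-up voter record and a reply format modelled on the description above; the real gateway integration and election database are omitted:

```python
# Hypothetical record; field names and the sample ID are illustrative only.
VOTER_DB = {
    "HP/01/001/0123456": {
        "constituency": "Rohru (64)",
        "polling_station_no": 12,
        "serial_no": 345,
        "station_address": "Govt. Primary School, Rohru",
    },
}

def handle_sms(voter_id):
    """Build the SMS reply text for a voter-ID lookup."""
    rec = VOTER_DB.get(voter_id.strip().upper())
    if rec is None:
        return "Voter ID not found. Please check the number and try again."
    return (f"Constituency: {rec['constituency']}; "
            f"Polling station: {rec['polling_station_no']}, "
            f"{rec['station_address']}; "
            f"Serial no: {rec['serial_no']}")
```

The gateway would simply pass the incoming message body to a handler like this and send the returned string back to the voter's phone.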
The codes to access the information will also be launched shortly so that voters of Rohru and Jawali constituencies can use the service. The SMS service will be put to test for the first time during the Rohru and Jawali byelections to be held on November 7.
The new service will also enable the public to get instant information updates on their mobiles through SMS about voter turnout at booth level on the polling day, the male-female voter ratio and overall voting percentage.
He said a format for giving more information, such as hourly polling percentage on the voting day, was also being worked out. “We have the infrastructure ready. It is up to the election observers and other officials to decide at what frequency they want to release this information. The information will be updated in the state election department’s database on the website so that the voters can access it through SMS any time,” said Aggarwal.
“BSNL users will be charged Rs 2 per message and others at the rate of Rs 3. The service provider will charge the Government for the replies sent,” said Aggarwal.
He said the service will also have utility during the counting of votes, where voters could get round-wise updates.
College students interested in computers and information technology are encouraged to apply by Oct. 22 for a new scholarship program at Chipola College.
Chipola is the recipient of a $50,000 grant to promote and fund "WIRED for Technology" scholarships for students enrolled in several computer-related programs.
Thirteen scholarships were awarded during the fall 2009 semester. Applications for the spring 2010 semester will be accepted through October 22.
Scholarships are available for Associate in Arts (AA) degrees in Computer Science and Information Technology. AA students typically take classes for two years and then transfer to a university to complete the junior and senior years of a bachelor’s degree.
Associate in Science (AS) degrees in Computer Engineering Technology, Computer Information Technology, and Networking Services Technology are also included among the eligible programs. AS degrees are designed to prepare students for careers after taking one to two years of coursework.
Scholarships are also available for the Workforce Certificate program in Computer Systems Technology, which can be completed in as little as one year.
The new scholarships may fund all or part of the costs of tuition, fees, textbooks and software for students in technology or computer-related majors or programs of study. The scholarships may be awarded in addition to other scholarships or grants; however, the selection panel will also consider other funds available to students through the Foundation or Financial Aid Office. The scholarships will be available only for the 2009-10 school year.
The WIRED for Technology project is being funded by the U.S. Department of Labor through a consortium of panhandle colleges in partnership with the University of West Florida and Florida's Great Northwest.
For information about the scholarships at Chipola, contact Gail Hartzog or Pat Barfield at 850-718-2342 or email hartzogg@chipola.edu, or Nancy Burns at 850-526-2761 or email burnsn@chipola.edu.
The Minister of Information and Communication Technology (ICT), Aggrey Awori, has halted the process of procuring a firm to manage the National Data Transmission Backbone Infrastructure and E-Government Infrastructure (NBI/EGI) project.
According to Mr Awori, the process was halted because of the need to involve the National Information Technology Authority - Uganda (NITA-U).
“As you are aware, the National Information Technology Authority- Uganda (NITA-U) has been operationalised. The Board of Directors has been appointed and has already started carrying out their duties. The appointment of the Executive Director is being handled by the Board and the Minister. One of the key functions of NITA-U is to manage the government Information Technology (IT) Infrastructure including the NBI/EGI,” Mr Awori wrote to the ministry permanent secretary September 3.
He added “I am aware that the ministry has started the process of procuring a firm to manage the NBI/EGI. However, since NITA-U will be directly responsible for managing the NBI/EGI, it is essential that it participates in the process of procuring the said firm. Secondly, we need to rectify the damages that were caused on the optic fibre in Phase I in order to make NBI/EGI fully functional and in view of the above, I am asking you to halt the process of procuring a firm to manage the NBI/EGI on behalf of government until the two have been sorted out.”
However, Sunday Monitor has learnt that Mr Awori on July 31 wrote to the manager of Huawei Technologies confirming that his ministry had accepted Comtel Integrators Africa Limited to be the Microsoft partner in the NBI/EGI project.
“Further to our previous discussion and correspondence about Microsoft software for NBI/EGI, I wish to confirm that my ministry has accepted the nomination of Comtel Integrators Africa Limited to be the Microsoft partners in the NBI/EGI project, after the due diligence. You are authorised to start working with them expeditiously so that this project can be operational,” Awori wrote. But in choosing Comtel there are suspicions of conflict of interest because a senior manager at Comtel sits on the NITA-U board. Sources further allege that the ministry awarded this tender to Comtel in violation of government procurement guidelines that demand that such transactions be advertised.
Last week, this paper reported that there was a fight for senior jobs at NITA-U, with Mr Ambrose Ruyooka, a commissioner in the ministry, being dropped from the board. But Mr Awori last week dismissed the rumours that he sacked Mr Ruyooka, saying: “There is no evidence to show that I sacked him apart from a letter I wrote to him discontinuing him from National Information Technology Board. The truth is that Mr Ruyooka was on two boards…”
Following our story, the ministry appointed Mr Andrew Lutwama as an interim chief executive officer of the authority. Mr Edward Baliddawa, former chairman of the ICT committee in Parliament but now an ordinary member of the same committee expressed dismay at what is happening in Mr Awori’s domain.
“What is contained in the reports is disturbing and of great concern to all of us in the ICT fraternity, but more so to all of those colleagues who did unreservedly contribute to the process of the successful enactment of the NITA-U Act 2008,” Mr Baliddawa wrote in an email posted on I-network, a social networking forum for ICT specialists, which this paper saw on Tuesday. He further noted that “after considering all the contributions, the House passed a law establishing NITA-U and gave specific guidelines as to the operationalisation of NITA. For example, the law is very specific on who should sit on the Board of NITA and how the Executive Director shall be chosen. The law specifies that among the 7 Board Members, the Ministry of ICT shall be represented by the Commissioner for IT.
“When I read in the papers that Mr Ruyooka Ambrose had been appointed on the Board, my understanding was that he had been seconded on the Board on the basis that he was an Acting Commissioner for ICT in the Ministry. Membership on the board is on the portfolio he was holding and not as a person in the name of Ruyooka. I find it strange too, that the minister was never informed that Mr Ruyooka was on another Board, although in the law this is not a basis for rejection.”