Friday, 19 December 2008

IT leaders share green-tech predictions for 2009

With the down economy and shift in the political climate, experts foresee green IT gaining momentum in '09

December 18, 2008

It's that time of year again to dust off the old crystal ball and put forth some predictions as to what 2009 holds for the world of green technology. Rather than speculate alone, however, I once again decided to tap experts at a host of organizations for their views on how green IT will evolve in the year to come.

Suffice it to say that no one has dismissed the green-tech movement as a mere passing fad. Both economic and political conditions (e.g. President-Elect Obama's vision of a cleaner energy economy) will continue to drive vendors to develop greener wares and organizations across the board to embrace greener practices -- be it in the name of cutting costs, meeting environmental regulations, or simply "doing the right thing."


Green-tech predictions for 2009 follow:

Robert Aldrich, director of datacenter solutions, Cisco

1. Green in the U.S. market related to IT will be replaced by more concise, definable concepts in the mainstream media, such as energy equivalents.

2. The first generation of IP-based energy management applications will be released from many major IT vendors.

3. IT's consumption profiles will start to be measured under new criteria with teleworking and cloud computing as the next "killer apps" (today's killer app: virtualization).

In summary, I would say that 2009 will see the emergence of a number of sleeper "greenies" who have been doing their homework diligently over the last two years. I firmly believe that many in IT want to do the right thing (in this case, the "right thing" meaning the altruistic thing) but simply lack the time and monetary incentive to do so. The issue is not so much the IT professional per se but the system by which our roles are incentivized.

I think 2009 will be about what I would call Green IT 1.0, and that is "What does Green IT mean to me? What can I do individually, professionally, above and beyond recycling at home, to feel like I'm part of the solution?" In a word, internalization and the realization that a "green" lifestyle is a choice and involves a series of educated trade-offs.

Conversely, a sound focus in a down market is to trim operating expenditures through incremental improvements to infrastructure and operations. With commercial energy market volatility, popular opinion, and geopolitical considerations at hand, now is the time to establish a sound energy strategy.

Subodh Bapat, vice president and distinguished engineer, Sun Microsystems

1. Energy efficiency to reduce watts per compute workload will continue to be a priority in 2009, given high ROI and overall energy limitations.

2. Datacenter blueprints will continue to evolve, with aggressive virtualization potentially yielding two to ten times the savings of facility efficiency measures.

3. Power management will be used to throttle down servers (and other IT equipment) when not in use, something not done effectively today.

4. High-temperature datacenters, some without any mechanical cooling systems, will be discussed more in 2009 (but very few will be implemented).

5. Collaboration will continue to drive progress forward through new standards (The Green Grid), best practices, and open tools as we scramble to meet energy and climate goals. Sun will continue to innovate on eco in hardware (servers and network), software (including efficient coding), services (datacenter design), and partnerships.

Lewis Curtis, infrastructure architect and advisor, Microsoft

1. 2009 and 2010 will see IT organizations start investigating environmental regulation strategies and building environmental-impact skill sets. Even if IT leaders don't care about environmental stewardship, they care about government regulations that affect IT operations. With the new presidential administration committing the United States to an environmental cap-and-trade model, the European Union promoting a datacenter code of conduct, and various government bodies promoting more oversight and environmental and energy ceilings, IT leaders will need to quickly become cognizant of environmental regulations and work to form productive, environmentally sustainable strategies for their organizations.

2. In 2009, the economic downturn will greatly impact green IT investments. There is no doubt that organizations are reducing IT investments in light of the economic downturn. Many have argued that the reduced price of oil and economic pressures will kill the green movement. The death of the green movement in organizations and society has been greatly exaggerated. However, there will be some changes in green IT investment activity.

Environmental sustainability projects that positively impact the bottom line in the short run will be moved to the front of the line. Examples include virtualization and consolidation projects.

Environmental sustainability projects that increase costs of organizations or do not impact immediate regulatory needs will be delayed. For example, some recycling efforts (paper, e-waste projects) will probably not expand as much as originally anticipated in 2009.

3. 2009 will be the year of the green developer. Besides all of the "green is good for IT" articles, there has been a good deal of writing about building green physical datacenters as well as adopting virtualization. However, when analyzing different professionals in the IT market, developers were usually the most passionate about environmental impact. Yet, developers have the least amount of guidance on environmentally sustainable development best practices.

What are best practices to reduce energy and computational resource consumption for application design?

Sloppy code is wasteful. Not only is it slow, error-prone, and often not extensible; it usually wastes energy and consumes unnecessary computational resources. This has a significantly negative impact on the environment. However, most architects are given the "virtualize the problem away" answer for environmental sustainability.
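Curtis's point about wasteful code can be made concrete with a small, generic example (not from the article): two functionally identical Python routines, one of which burns far more CPU cycles -- and therefore energy -- than the other.

```python
# Two routines that compute the same answer; the data is invented for
# illustration. The slow one rescans a list for every lookup, the fast
# one builds a hash set once -- same result, far fewer CPU cycles.
def count_common_slow(a, b):
    # O(n*m): scans all of b for each element of a
    return sum(1 for x in a if x in b)

def count_common_fast(a, b):
    # O(n+m): one pass to build a set, one pass to probe it
    b_set = set(b)
    return sum(1 for x in a if x in b_set)

a = list(range(2000))
b = list(range(1000, 3000))
assert count_common_slow(a, b) == count_common_fast(a, b) == 1000
```

On large inputs the difference between these two is orders of magnitude of machine time, which translates directly into wasted watts at datacenter scale.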

4. In 2009, competition will increase for the green cloud. Who is the greenest cloud provider for your applications and solutions? Which cloud providers will report environmental metrics and provide concrete green operational-level agreements that enterprises can use for verification in their own environmental reporting? Currently, we don't know how cloud providers will compete on green services. However, by the end of 2009, I predict we'll start finding out some answers.

(Read more of Curtis's 2009 green-tech predictions.)

Dr. Albert Esser, vice president of datacenter infrastructure, Dell

In 2009, energy efficiency in the datacenter will continue to be a big focus for IT departments. Not only will improvements in overall energy consumption help reduce power and cooling costs, but applying green practices in the datacenter can also unlock hidden assets -- space and compute power -- and extend the lives of their datacenters well past 2009. Here are five things that IT managers should consider in '09:

1. Industry standards: By not locking themselves into proprietary solutions, datacenter managers can stay up-to-date on more efficient designs. Leveraging industry standards allows businesses to upgrade existing systems and swap out old ones for more efficient models with ease.

2. Productivity: By looking at the big picture in terms of productivity, managers will have a better sense of what levers they can play with beyond one specific area, allowing for better management of power consumption.

3. Virtualization: Though virtualization remains one of the hottest topics, the amount of server utilization has decreased over the years. This needs to change. Businesses can significantly boost productivity and energy efficiency by increasing the number of servers being virtualized.

4. Smart cooling: Datacenter cooling is one of the biggest environmental impacts, and it can be reduced by moving cooling closer to the rack, understanding where hot spots are, and utilizing air economizers that use outside air to keep servers cool.

5. IT productivity: While improving energy efficiency through power and cooling methods offers significant returns, businesses can also optimize datacenter performance by focusing on IT productivity. The greening of the datacenter can be accomplished by ensuring servers are optimally utilized in order to reduce the amount of unnecessary power being consumed.

Steve Sams, vice president of site and facilities services, IBM Global Technology Services

This coming year, economic and budgetary concerns will impact the datacenter. With less money available, it's important that organizations get the most bang for the buck with their IT facilities and resources. By considering all options available in the software, hardware, and the virtualization realm, IT managers can run the most cost-effective, energy efficient datacenter possible.

Three things an organization must look at during these troubled economic times are: extending the life of its datacenter (numerous opportunities exist throughout the datacenter to do more with less), rationalizing end-to-end datacenter infrastructure, and utilizing modular, scalable approaches when building a new datacenter. IBM is shifting from a custom to a standard datacenter design business to apply these cost-cutting tools. In 2009, we will continue to see innovative implementations that will have larger impacts on organizations' bottom-line results.

Ted Samson, senior analyst, InfoWorld

1. PC vendors will continue to compete to purge their new wares of toxic chemicals, such as polyvinyl chloride and brominated flame retardants. Moreover, they will release systems that meet or exceed the forthcoming Energy Star 5.0 standard before it officially goes into effect.

2. In the name of cutting energy costs, more IT shops will take the calculated risk of powering off at least some servers when they're not in use.

3. Adoption of PC power management software -- one of the lowest-hanging green-tech fruits -- will increase in 2009, saving companies as much as $75 per PC per year.

4. Vendors will roll out more products drawing on sensors that will measure such attributes as power consumption, temperature, humidity, and utilization. The purpose is to give datacenter operators a real-time picture of how efficiently their facility is operating at any point in time and to help locate hot spots and other areas of inefficiency.

5. More new datacenters will be built to comply with LEED (Leadership in Energy and Environmental Design) standards in an effort to boost energy efficiency and earn green bragging rights.

6. More companies will start to scrutinize the inefficiency of their supply chains and, using smart tech, will find ways to streamline operations and cut expenses, including fuel and packaging costs.
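The per-PC savings figure in point 3 can be sanity-checked with a back-of-the-envelope sketch. Every number below is an assumption (typical desktop wattages, hours a sleep policy reclaims, a nominal electricity rate), not data from the article.

```python
# Rough estimate of annual per-PC savings from power management.
# All figures are illustrative assumptions.
idle_watts_awake = 104     # desktop + monitor left on but idle
idle_watts_asleep = 4      # same machine in sleep mode
hours_reclaimed = 6800     # idle hours per year a sleep policy reclaims
rate_per_kwh = 0.11        # assumed electricity price, US$ per kWh

kwh_saved = (idle_watts_awake - idle_watts_asleep) * hours_reclaimed / 1000
annual_savings = kwh_saved * rate_per_kwh   # lands near $75 per PC per year
```

Under these assumptions the estimate lands close to the $75 figure; machines that are already shut down overnight, or higher local rates, move the number substantially in either direction.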

Will Swope, vice president and general manager, Corporate Sustainability Group, Intel

Economic conditions today are complex enough that companies can no longer just talk the talk; they need to walk the walk in order to save resources in 2009 and beyond. The financial turmoil will certainly put additional pressure on these decisions, but, when done correctly, sustainability creates savings rather than adding expenses.

There will be real pressure on IT to reduce operating costs, much of which will be achieved by implementing projects such as more efficient datacenters. We also expect to see more computing resources used to improve efficiency across all aspects of a company's operations. Computers, and computing, are fundamental to increasing efficiency in nearly every area that consumes energy.

The aggregate electronics industry will face an eventual growth hurdle if we can't come to an agreement on an industry-wide approach to dealing with e-waste. E-waste is one of the unintended consequences of the amazing innovation in our industry. The issue grows more complex as new device capabilities and categories emerge. Beyond removing toxins from consumer electronics materials, we need to quickly come up with a solution that mitigates dumping overseas and encourages reuse and recycling.

Industry leaders and government officials will determine which stimulus plans have far-reaching, positive impacts on reducing carbon footprints. We believe this objective is imperative to keep moving forward, but we also need to define long-term, systemic changes to many of the "assumed rights" regarding how we, as individuals, consume resources today. Reform in this regard along with the stimulus packages will in effect help enact real change for a better environment.

Roger Tipley, director, The Green Grid

The incoming presidential administration is expected to bring a renewed commitment to addressing energy management, leading the U.S. government to increased collaborative efforts with industry organizations that are dedicated to advancing energy efficiency and cost savings in datacenters throughout the nation. Building upon existing energy policies, The Green Grid expects government agencies, such as the U.S. Department of Energy and Environmental Protection Agency, will work even more collaboratively with industry organizations to develop more effective ways to improve energy efficiency in datacenters. This sustained, shared effort between government and industry designed to improve U.S. datacenter efficiency in 2009 will also have global implications in the years to come.

Larry Vertal, senior strategist, AMD

This coming year, U.S.-based corporate IT managers will start getting the first indications of another budget to deal with sooner rather than later: their carbon budget.

Corporate management will start to see carbon accountability cascade down into their planning in anticipation of the impact of federal and state legislation and regulations.

California organizations, along with those in other western states, are beginning to define what the AB32-mandated carbon cap-and-trade program means for their businesses. These companies would acquire annual allowances to emit a certain amount of CO2 and other GHG (greenhouse gas) emissions based on specified criteria. They would then have three options: 1) emit the amount of GHG emissions allowed by their permit or allowances, 2) reduce their own GHG emissions and sell excess allowances to other emitters, or 3) emit more GHG emissions by purchasing unused permits or allowances from another emitter.
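The three options amount to a simple decision on an emitter's annual position. A toy sketch follows; the function and numbers are invented for illustration and do not reflect actual AB 32 mechanics.

```python
# Hypothetical sketch of the three compliance options under a
# cap-and-trade scheme. Everything here is illustrative.
def compliance_position(emissions_tons, allowance_tons):
    surplus = allowance_tons - emissions_tons
    if surplus == 0:
        return "emit exactly the permitted amount"            # option 1
    if surplus > 0:
        return "sell %d excess allowances" % surplus          # option 2
    return "buy %d allowances from other emitters" % -surplus # option 3

assert compliance_position(900, 1000) == "sell 100 excess allowances"
```

In practice the decision is a trading question -- whether abatement is cheaper than the market price of allowances -- but the surplus/deficit arithmetic is the starting point.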

President-elect Barack Obama reaffirmed his commitment to adopting a cap-and-trade carbon program as part of his national climate change policy shortly after the election. He went on to say the United States must reduce carbon dioxide emissions 80 percent by 2050 -- which is in line with proposals by the U.N.'s Intergovernmental Panel on Climate Change.

The good news for IT and facilities management is that the major factor in their own carbon budgets will be driven by electricity consumption. Over the last few years, many IT and datacenter operations have focused on getting an integrated view of, and optimizing, their energy consumption, which will become the foundation for managing carbon budgets. The challenging part will be the extent to which such carbon accounting expands to include what is often called embedded energy, and hence the embedded carbon footprint of purchased equipment.

Stay tuned, IT: it could be a wild ride.

Doug Washburn, analyst, Infrastructure and Operations, Forrester Research

1. Expect the PC environment to steal much of the green IT spotlight away from the datacenter. Why? While the datacenter is often a first target on organizations' green IT hit list, recent data reveals that the distributed PC environment is likely to be consuming more than the datacenter. With that in mind, expect IT ops professionals to aggressively pursue PC power management best practices and invest in software to assist (e.g. 1E's NightWatchman, BigFix Power Management, Verdiem's Surveyor). Beyond the reduction in CO2 emissions, the financial savings can add up: General Electric and Dell boast savings of $2.5 million and $1.8 million per year, respectively.

2. Expect the traditional definition of green IT to be refined. Today's green IT primarily focuses on the "greening" of IT itself -- such as sourcing Energy Star PCs or virtualizing servers. While this traditional view will become pervasive, the positive environmental and financial benefits of IT as an enabler of the "Green Enterprise" will be much more profound than IT just greening its datacenters or PCs. With that in mind, tomorrow's green IT will be defined much more broadly, positioning technology as an enabler across the business. Early examples include Nike's "Considered Index" desktop application, which empowers designers to make more eco-friendly decisions when designing shoes, and UPS's "package flow" software, which eliminates left-hand turns from delivery routes and saved $8.4 million in fuel costs and 32,000 metric tons of CO2 emissions in 2007.

What do you predict will happen in the realm of green tech in 2009?

Posted by Ted Samson on December 18, 2008 03:00 AM

Cable Industry Keeping Quiet Before Digital TV Switch

The group's trade association said it will cease moving analog channels to digital channels for the next three months and let consumers concentrate on the Feb. 17 deadline.

Hoping to avoid confusion and criticism during the U.S. switch to digital television next year, the cable industry this week announced efforts to keep out of consumers' and the Federal Communications Commission's hair.

The group's official trade association on Thursday told Congress that its members will cease moving analog channels to digital channels during a "quiet period" between Dec. 31 and March 1 to help allay consumer confusion in the coming switchover to digital TV that's scheduled to take place next year. The switchover, mandated by the Digital Television Transition and Public Safety Act of 2005 and moderated by the FCC, requires full-power TV stations to switch to digital broadcasts on Feb. 17.

"Most channel migration from analog to digital broadcast basic or expanded basic" will stop during that period, the National Cable & Telecommunications Association said in a letter to House and Senate leaders overseeing the move. The NCTA move will help consumers concentrate on preparing for the switch.

While the two events -- the switchover and cable companies' quiet period -- aren't related, the cable companies' restraint should ease some of the confusion the switchover is already causing among consumers. Consumer groups, which have long complained about high cable prices, have also complained that cable companies were taking advantage of the confusion to jack up prices.

"We recognize that the overlap between cable's digital migration and the broadcasters' DTV transition scheduled to occur on February 17, 2009, inescapably adds a layer of complexity and the potential for consumer confusion," wrote Kyle McSlarrow, president and CEO of the NCTA, in the letter to congressional leaders this week.

"And," McSlarrow continued, "on the assumption that not all issues with the DTV transition will be resolved as of February 17, 2009, for at least three months thereafter we will, upon request, offer free equipment to analog able households for at least a year, and will ensure that such households can receive channels moved from analog to digital broadcast basic or expanded basic tiers without incurring additional service charges."

To help consumers with the transition, the government is issuing coupons, which have a value of $40 toward the purchase of a converter box. The Commerce Department's National Telecommunications and Information Administration this week urged consumers who rely on antenna-delivered analog broadcast TV to immediately apply for converter box coupons or risk the possibility of losing TV signals when television stations switch to digital transmission.

Apple Buys $4.8 Million Stake In Mobile Chip Maker

Apple is a licensee of Imagination's technology, which could play a bigger role in Apple products in the future as a result of the investment.

Apple has bought a stake in a company that makes semiconductors for mobile phones, portable media players and other consumer electronics, increasing Apple's investment in chip makers with technology that could fit in the company's iPhone and iPod product lines.

The investment, amounting to 8.2 million shares of Imagination Technologies Group, is roughly a 3.6% stake in the United Kingdom-based company, Imagination said in a statement released Thursday. Apple paid 58 cents a share, making the deal worth about $4.8 million.


Apple is a licensee of Imagination's technology, which could play a bigger role in Apple products in the future as a result of the investment. Apple was not immediately available for comment and rarely if ever discusses future product releases.

Imagination makes semiconductors for a variety of multimedia and communication applications, including digital radios, personal media players, car navigation systems, mobile Internet devices, ultra-mobile PCs, digital televisions and set-top boxes. Apple products that could use such technology include the iPhone 3G, iPod Touch and Apple TV.

It's not the first time Apple has put its money into chips. In April, the company bought PA Semi, a maker of low-power PowerPC processors. PA Semi designs the products, but manufacturing is outsourced.

Industry veteran Dan Dobberpuhl founded PA Semi, the 150-person company, in 2003. Dobberpuhl is the acclaimed lead designer of the DEC Alpha series of microprocessors, the StrongARM microprocessor, and the first multicore systems-on-chip, with the SiByte 1250.

Tuesday, 16 December 2008

When to Upgrade Your Computer Hardware

Computer consultants build entire careers around advising businesses when to upgrade their hardware. Should you go with the latest and greatest, or stick with the tried and true? As with all business decisions, it comes down to a question of cost vs. benefit. But quantifying the costs and benefits of hardware can be difficult. Here are some factors to consider when you are agonizing over whether to upgrade.

The hidden costs of upgrading. The price tag of your new system isn't the only cost: there is also the time, energy, and money required to migrate your information to your new equipment.

If you're thinking of upgrading just so you can have the latest gear loaded with all the bells and whistles, stop. Unless you have a solid business case for upgrading, your money will be better spent elsewhere.

Stopgap measures. If it's bells and whistles you want, maybe you can add them yourself. Adding RAM (Random Access Memory, the working memory your computer uses to perform its tasks) is a great way to speed up your system, and it's really simple, even for neophytes. Most RAM retailers, such as Crucial and TigerDirect, have online configuration calculators to tell you exactly which RAM your system needs. Once you have the right RAM, it's simply a matter of opening your computer's case and snapping the module in place.

You can also add additional devices, such as CD drives and burners and additional hard drives, but these are a bit more complicated than the memory upgrade described above. If you can't perform the upgrade yourself and need to hire a professional, weigh the costs carefully. Once you factor in the cost of the parts and labor, you may be better off buying a whole new system.

When to upgrade. The rule of thumb should be this: Upgrade when the cost of not upgrading exceeds the cost of upgrading. New hardware should help you work faster and more efficiently. Or maybe you need to upgrade your hardware to run new software applications that will improve productivity. If that's the case, upgrading is your best bet. Similar situations include a broken PC, one that crashes regularly, or otherwise keeps you from doing the work you need to do. Clearly, in each of these cases, it will cost you more to put off the upgrade than to go ahead with it.
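The rule of thumb above reduces to a simple comparison. A minimal sketch, with invented example figures:

```python
# Minimal sketch of the upgrade rule of thumb; all dollar figures
# are invented for illustration.
def should_upgrade(upgrade_cost, annual_cost_of_not_upgrading):
    """Upgrade when the cost of *not* upgrading (downtime, lost
    productivity, repairs) exceeds the cost of upgrading."""
    return annual_cost_of_not_upgrading > upgrade_cost

# An aging PC costing 2 hours/week of lost productivity at $30/hour
# over a 50-week year ($3,000) vs. a $1,200 replacement:
assert should_upgrade(1200, 2 * 30 * 50) is True
```

The hard part, of course, is estimating the cost of not upgrading honestly; the comparison itself is trivial.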

If you've crunched the numbers and find you really do need to upgrade, don't rush out to buy the coolest, fastest, priciest computer on the lot. The best way to put off the inevitable obsolescence of your next computer is to make sure it meets all your business needs.

Take a look, not only at your current computing needs, but also at what your future requirements might be. Will you need a full-featured database program in the future? Will you run memory-hogging graphics programs or other special applications? Doing a little research at this stage may just save you a lot of money down the road.

Buying Your Computer Equipment from an Office Supply Store

You can buy computer equipment from a number of places, including mail order catalogs, Web sites, local computer retailers, electronics "superstores," and office supply stores. Each source offers its own advantages and disadvantages. When you order directly from a manufacturer such as Dell or Gateway, you get competitive prices as well as the ability to specify exactly how the vendor should configure your computers. When you buy from a local retailer, on the other hand, you might get better service and support from a company with which you have a personal relationship.

Where do office supply stores fall on this spectrum? In many cases, you'll find a reasonable selection of computers at a good price. Many office supply stores have their own computer service and support departments, along with special services such as extended warranties and training classes. Most stores offer other types of office equipment, such as cash registers and copiers, giving you the luxury of one-stop shopping.

On the downside, many office supply stores treat computer sales as just one part of a much larger business that includes everything from PCs to paper clips. You may find that a store sells most of its systems in fixed configurations, which means that you'll pay more if you want to buy customized equipment — or have to pay for features you don't really want. In addition, some office supply stores can't offer the same level of service and support that you'll get from other sources.

When you decide where to buy your computers, remember that in today's intensely competitive market computer prices and configurations change almost weekly. Don't buy a computer anywhere until you've had a chance to shop around.

Enhancing Video for the Visually Impaired

Researchers are using algorithms that can better the picture quality for people with macular degeneration.
Thursday, December 11, 2008
By Brittany Sauser

Eli Peli, a researcher at the Schepens Eye Research Institute in Boston, is developing software that can enhance the quality of a TV image for people with visual impairments such as macular degeneration--a disease that makes images on the screen seem blurred and distorted.

Peli's algorithms increase the contrast of a picture over the spatial frequencies that are easier for a visually impaired person to see. In his lab, a remote control can be used to adjust the contrast on a 32-inch television screen connected to a PC, creating a specially enhanced picture.
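As a rough, illustrative analogue (this is not Peli's actual algorithm), frequency-selective enhancement can be sketched in one dimension: split a signal into a smooth base (low frequencies) and a detail residual (high frequencies), then amplify only the detail.

```python
def enhance_contrast(signal, window=5, gain=2.0):
    """Boost the detail band relative to the smooth base -- a toy
    analogue of frequency-selective contrast enhancement."""
    half = window // 2
    out = []
    for i, v in enumerate(signal):
        lo = max(0, i - half)
        hi = min(len(signal), i + half + 1)
        base = sum(signal[lo:hi]) / (hi - lo)  # local average = low frequencies
        detail = v - base                      # residual = high frequencies
        out.append(base + gain * detail)       # amplify only the detail
    return out

flat = [0.5] * 8
edge = [0.0] * 4 + [1.0] * 4
assert enhance_contrast(flat) == flat      # no detail -> unchanged
assert max(enhance_contrast(edge)) > 1.0   # edges are exaggerated
```

A real system would work in two dimensions, choose the boosted band to match the viewer's measured sensitivity, and clip the result to the display's range.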

"It's simple," Peli says, showing me CNN, the movie Shrek and a basketball game all in split-screen mode. In each clip he points out the difference in resolution, even for a person with normal eyesight. In the image on the right, details like grass, flowers and a person's facial features are much clearer than in the one on the left. Peli, who is also a professor of ophthalmology at Harvard Medical School, expects a grant from Analog Devices in the new year. This company has been testing his software and Peli says it is eager to build it into its hardware. He explains his work and demonstrates the system in the video below.

Video by Brittany Sauser

NASA's Future May Depend on Collaboration

An independent report addresses tough policy questions faced by the new administration.
Tuesday, December 16, 2008
By Brittany Sauser
An astronaut is anchored on the Space Shuttle Endeavour's robotic arm and is preparing to be elevated to the top of the Hubble Space Telescope during a servicing mission in 1993. Credit: NASA

In a report called The Future of Human Spaceflight, MIT's Space, Policy, and Society research group has produced some clear advice for the next president regarding manned space exploration. The report addresses such pressing issues as the retirement of the space shuttle, use of the International Space Station (ISS), and strategies for reaching the moon and Mars.

A key message is the discrepancy between NASA's current funding ($17bn per year) and the ambitions outlined in President Bush's vision for space exploration from 2004. "Trying to do too much with too little is exactly what caused the last two shuttle accidents," says lead author David Mindell, a professor of engineering systems and director of the program in Science, Technology and Society at MIT. But a lack of funding is hardly a new problem for NASA. So perhaps the most significant aspect of the report is its call for greater international collaboration, most notably with China.

The report states that the U.S. needs to reaffirm its international leadership while remaining committed to international partnerships. Specifically, the MIT team says we need to begin engagement with China as this could yield "enormous" benefits for both sides. Cooperation, the report says, "could encourage the Chinese to open their space program and help end speculation about their intentions in space," adding that doing so could help avoid a potentially dangerous space arms race.

The comprehensive study comes at an ideal time for president-elect Barack Obama. Once he takes office in January he'll have just 100 days to determine the fate of the US space program, which is facing its biggest crossroads since the end of the Apollo era in the 1970s. (To complicate matters, the Orlando Sentinel is reporting that NASA administrator Mike Griffin is refusing to cooperate with Obama's transition, although Griffin has denied the accusation.)

The MIT report was written by engineers, policy analysts and even a former astronaut. It starts by defining primary objectives (those that can only be accomplished by having human beings in space and are worthy of the risks and costs) and secondary objectives (benefits that accrue from human presence but do not themselves justify the cost and risk). It also includes some specific recommendations for the new administration.

To start, it says the U.S. should continue flying the space shuttle until the ISS is finished, even if that slips somewhat past 2010. Retiring the shuttle after this date will mean relying on international partners, particularly the Russians, for transportation to the ISS, but the report says we need to trust their commitment to the project.

Secondly, a "major question facing the new administration" is how to utilize the $100 billion space station. The report suggests that operations should be extended to 2020 and should support the primary objectives of exploration: "research in the physical sciences, life sciences, development of technologies to support moon missions and long duration Mars flights, and as a laboratory for space technology development." Already NASA is testing a water processing system, work-out equipment, and living quarters that will turn the ISS into a six crew vessel instead of a three crew one by May 2009.

For the moon and Mars, the report calls for a strategy that first establishes the size and duration of any U.S. lunar presence and balances this with reaching other destinations such as Mars. Overall, it argues that the policy should be more, not less, ambitious but it also makes a strong call for employing space robotics.

The report will be published in greater depth and detail by the American Academy of Arts and Sciences in early 2009. Let's hope the Obama administration reads through the report carefully. Reportedly, the transition team has already "enthusiastically received" it.

Semantic Sense for the Desktop

A project brings Semantic Web technology to personal documents.
By Erica Naone
Tuesday, December 16, 2008

People naturally group information by topic and remember relationships between important things, like a person and the company where she works. But enabling computers to grasp these same concepts has been the subject of long-standing research. Recently, this has focused on the Semantic Web, but a European endeavor called the Nepomuk Project will soon see the effort take new steps onto the PC in the form of a "semantic desktop."

Those working on the project, coordinated by the German Research Center for Artificial Intelligence (DFKI), have been toiling for three years to create software that can spot meaningful connections between the files on a computer. Nepomuk's software is available for several computer platforms and now comes as a standard component of the K Desktop Environment (KDE), a popular graphical interface for the Linux operating system.

The idea of a semantic desktop is not new. The Open Source Applications Foundation and SRI, two nonprofit organizations, have both worked on similar projects. But previous efforts have suffered from the difficulty of generating good semantic information: for semantic software to be useful, semantic information needs to be generated and attached to files and documents. But without useful applications in the first place, it is hard to persuade users to generate and tag this data themselves.

Nepomuk is distinguished by a more practical vision, says Ansgar Bernardi, deputy head of knowledge management research at DFKI. The software adds a lot of semantic information automatically and encourages users to add more by making annotated data more useful. It also provides an easy way to share tagged information with others.

The software generates semantic information by using "crawlers" to go through a computer and annotate as many files as possible. These crawlers look through a user's address book, for example, and search for files related to the people found there. Nepomuk can then connect a file sent by a particular person with one related to the company that person works for, making Nepomuk a particularly useful way to search a computer, Bernardi says.
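The crawler's output can be pictured as subject-predicate-object statements linking people, companies, and files. The sketch below is purely illustrative: Nepomuk actually stores RDF through its own frameworks, and the class and property names here are invented for the example, not Nepomuk's real API.

```python
# Minimal triple store, as a stand-in for the RDF repository a
# semantic-desktop crawler would populate (names are hypothetical).

class TripleStore:
    def __init__(self):
        self.triples = set()

    def add(self, subject, predicate, obj):
        """Record one statement: (subject, predicate, object)."""
        self.triples.add((subject, predicate, obj))

    def objects(self, subject, predicate):
        """All objects linked to `subject` by `predicate`."""
        return {o for s, p, o in self.triples
                if s == subject and p == predicate}

store = TripleStore()

# What a crawl of an address book and mail folder might record:
store.add("alice@example.com", "worksFor", "Acme Corp")
store.add("report.pdf", "sentBy", "alice@example.com")
store.add("acme-notes.txt", "about", "Acme Corp")
```

Once statements like these exist, a file sent by Alice is implicitly connected, through her, to everything else known about Acme Corp.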

While most operating systems let users search their computers by keyword alone, Nepomuk can uncover more useful information by focusing on the connections between data; it can locate relevant files even if they don't mention the keyword used in the search. A peer-to-peer file-sharing architecture built into the system also makes it easy to share files and their associated semantic data between users.
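That connection-following search can be sketched as a short walk over a relationship graph: a file matches a query term it never mentions as long as a chain of recorded links leads to it. This is a hypothetical illustration of the idea, not Nepomuk's actual query interface.

```python
from collections import defaultdict, deque

edges = defaultdict(set)  # undirected relationship graph

def link(a, b):
    edges[a].add(b)
    edges[b].add(a)

# Relationships a crawler might have recorded (invented examples):
link("Acme Corp", "alice@example.com")   # Alice works for Acme
link("alice@example.com", "report.pdf")  # Alice sent report.pdf

def related_files(term, max_hops=2):
    """Breadth-first search: files reachable from `term` in <= max_hops links."""
    seen = {term}
    found = []
    queue = deque([(term, 0)])
    while queue:
        node, depth = queue.popleft()
        if node.endswith((".pdf", ".txt")):
            found.append(node)
        if depth < max_hops:
            for nxt in edges[node] - seen:
                seen.add(nxt)
                queue.append((nxt, depth + 1))
    return found

# "report.pdf" is returned for the query "Acme Corp" even though
# nothing about the file itself contains that string.
print(related_files("Acme Corp"))
```

A plain keyword index would miss this match; the graph traversal finds it in two hops via the person who links the company to the file.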

"This might be the semantic desktop that actually survives," says Nova Spivack, CEO and founder of Radar Networks, the company behind Twine, a semantic bookmarking and social-networking service. "There's a lot of potential to build on what they've done."

Spivack notes that other efforts to bring semantic technology to the desktop haven't succeeded in reaching end users. "Nepomuk is designed for real people and developers," he says. For this reason, Spivack sees the inclusion of Nepomuk in KDE as particularly important, since KDE software is widely distributed and can easily be modified by software developers.

Although funding for the official Nepomuk project ends this month, Bernardi expects it to continue as an open-source software effort. A spinoff company is also in the works, he says, and a newly founded legal body called the Open Semantic Collaboration Architecture Foundation will help coordinate continuing work on the technology created by Nepomuk.

Nepomuk's software is available on several platforms besides KDE. Users can download the basic software for free for Windows, Macintosh, and Linux. A lighter installation also lets users apply Nepomuk more narrowly, for example only to Web pages viewed through Firefox.
