- 'Butterfly Net' to Collect Space Junk
- DARPA is sponsoring a project to build a butterfly-net-like machine to capture space junk—the massive amount of material left over from dead satellites, spent rocket stages and other defunct manmade objects.
A team of scientists is rethinking the concept of the butterfly net on a much larger scale. They've devised a netlike machine to catch all of the dangerous trash that's swirling around our planet.
Space junk includes all of the now-defunct manmade materials that orbit our planet. It can include anything from trashed rocket stages and obsolete satellites to fragments left over from explosions.
A huge amount of manmade debris currently orbits Earth (source: NASA).
While space junk might sound harmless, keep in mind it's whirling around at thousands of miles an hour. At that speed, even a tiny speck of dust can cause significant damage to spacecraft—an especially troublesome risk for solar panels and protective shielding, which guards a craft from space's extreme temperatures and radiation.
A recent NASA study addressed just how hazardous space junk can be. "The greatest risk to space missions comes from non-trackable debris," says Nicholas Johnson, NASA chief scientist for orbital debris. Such declarations have led to increased research on solutions for preventing collisions with space junk, and collecting the rubbish.
Star, a Mount Pleasant, S.C.-based technology company, is working on a machine called the ElectroDynamic Debris Eliminator (EDDE). The project, which is funded by the U.S. Defense Advanced Research Projects Agency (DARPA), aims to collect space junk in an efficient and complete manner.
The EDDE vehicle is equipped with 200 nets that can collect garbage currently in low-Earth orbit. Jerome Pearson, president of Star, claims that, over seven years, just 12 of these vehicles could collect all 2,465 identified objects weighing over 2 kilograms that are floating around the planet.
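Pearson's figures imply a steady collection pace; a quick back-of-the-envelope check (our own arithmetic, not Star's):

```python
# Star's claim: 12 EDDE vehicles, 7 years, all 2,465 tracked objects
# heavier than 2 kilograms in low-Earth orbit.
vehicles = 12
objects = 2465
years = 7

per_vehicle_total = objects / vehicles           # objects each vehicle handles
per_vehicle_per_year = per_vehicle_total / years

print(f"{per_vehicle_total:.0f} objects per vehicle")          # ~205
print(f"{per_vehicle_per_year:.1f} objects per vehicle-year")  # ~29.3
```

So each vehicle would need to net roughly one piece of debris every two weeks, sustained for seven years.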
Even tiny bits of debris can cause substantial damage to spacecraft. Here, a large pit has been made in a space shuttle's window (source: NASA).
Once a piece of debris has been collected, the EDDE has several disposal options. For example, the vehicle could send an object toward Earth, where it would land harmlessly in unpopulated regions of the South Pacific. It could also drag an object down into a lower orbit, where most pieces of debris would eventually burn up in the atmosphere.
The most exciting disposal option, however, is the possibility of space recycling. The EDDE could salvage materials from debris and manufacture new, working parts. "So you'd be mining aluminum in orbit," Pearson states. According to Star, just four of the machines could collect enough material to build a structure as big as the Smithsonian National Air and Space Museum in Washington, D.C.—that's nearly 200,000 square feet.
Despite these exciting possibilities, the project faces several challenges. First, having many EDDEs zooming around Earth would require more regulation for safety purposes. "We may need space traffic control," Pearson posits. The U.S. Federal Aviation Administration has already begun research on how it might standardize space flights. For example, vehicles like the EDDE might someday be required to file flight plans, just like airplanes do.
Another noteworthy concern is that the vehicle could be used for military functions, such as disabling other countries' satellites. This possibility already has Chinese officials worried. To combat these concerns, Star hopes to move its project from the Department of Defense to NASA. Pearson even hopes that the machine might eventually be governed by the United Nations, which could regulate international space clean-up efforts.
Star has already begun testing its vehicle, which should commence test flights in 2013. The company plans to start removing space junk by 2017.
Star has released a simulation showing how 12 EDDEs could effectively clean the space around Earth.
Tuesday, August 31, 2010
- In creating a new biofuel, several Scottish scientists are turning to their country's favorite beverage: whisky. The new fuel, which uses whisky byproducts, can be used in normal car engines alone or with a mix of gasoline.
A team of scientists at Edinburgh Napier University in Scotland has invented a new type of powerful biofuel made from an unlikely substance: whisky.
Setting out to find new sources of biobutanol—a next-generation fuel that packs roughly 30 percent more energy than ethanol—the scientists looked to whisky. The potential market for a whisky-based fuel is huge: Worldwide, the popular liquor is a $5 billion industry.
The new biofuel uses the two main byproducts of whisky production, pot ale and draff. Pot ale is the fermented liquid left over in the stills. Usually thrown out, the pot ale is sometimes used to feed livestock. Draff, also used as animal feed, is the grain residue. Each year, the malt whisky industry produces massive quantities of these byproducts: 1,600 million liters of pot ale and nearly 200,000 tons of draff.
The lead researcher on the project is Professor Martin Tangney, director of the Biofuel Research Centre at Edinburgh Napier University. "The EU has declared that biofuels should account for 10% of total fuel sales by 2020," he said in a statement. "We're committed to finding new, innovative renewable energy sources. While some energy companies are growing crops specifically to generate biofuel, we are investigating excess materials such as whisky byproducts to develop them. This is a more environmentally sustainable option and potentially offers new revenue on the back of one of Scotland's biggest industries. We've worked with some of the country's leading whisky producers to develop the process."
An especially exciting aspect of the whisky biofuel is that it can be used in a car's current engine by itself or mixed with other fuels.
Martin Tangney is the lead researcher behind the Edinburgh Napier University project to make biofuel from whisky byproducts (source: Reuters).
Just mixing a small percentage of the biofuel into normal gasoline could have a positive environmental impact.
"Five or 10 percent [of biofuel] means less oil, which would make a big, big difference," Tangney said.
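A rough sense of why butanol blends are attractive: butanol carries more energy per liter than ethanol, so a blend sacrifices less range. A back-of-the-envelope sketch using ballpark energy densities (our numbers, not the researchers'):

```python
# Approximate volumetric energy densities in MJ/L -- ballpark textbook
# figures, not values from the article.
gasoline = 32.0
ethanol = 21.2
butanol = 29.2

# Energy content of a 10% blend relative to pure gasoline:
e10 = 0.9 * gasoline + 0.1 * ethanol
b10 = 0.9 * gasoline + 0.1 * butanol
print(f"E10 keeps {e10 / gasoline:.1%} of gasoline's energy")  # 96.6%
print(f"B10 keeps {b10 / gasoline:.1%} of gasoline's energy")  # 99.1%
```

The exact densities vary by source; the point is only the relative ordering, which is why a butanol blend can run in an unmodified engine with little penalty.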
Butanol fermentation, the method by which the biofuel is created, was pioneered early in the 20th century by Chaim Weizmann, the chemist who later became the first president of Israel. The technology was originally developed in an effort to produce rubber synthetically.
In a statement, Jim Mather, the Scottish Minister for Enterprise, Energy, and Tourism, sounded enthusiastic about the project. "I support the development and use of sustainable biofuels," he said. "This innovative use of waste products demonstrates a new sustainable option for the biofuel industry, while also supporting the economic and environmental objectives of the Scottish Government's new Zero Waste Plan. In these challenging economic times we need to play to our strengths and take advantage of the low carbon opportunities of the future. It's exactly this type of innovation that will help sustain economic recovery and deliver future sustainable economic growth."
Whisky isn't the only alcohol that's undergoing a green makeover. Recently at Smarter Tech, we explored how beer companies are working to use less water in making their products.
- Scientists Create MRSA-Killing Nanotech Coating
- A team of researchers at the Rensselaer Polytechnic Institute has developed a material that safely kills MRSA bacteria on contact. The material could be used to coat surgical equipment, hospital walls and other high-risk surfaces.
Methicillin-resistant Staphylococcus aureus (MRSA) is a dangerous infection caused by an antibiotic-resistant strain of staph bacteria. Sometimes called a "super bug," MRSA can be life-threatening and is especially common in hospitalized patients who have undergone surgical procedures. As occurrences of the deadly infection are on the rise in hospitals, doctors and scientists are desperate for preventative measures. In an exciting breakthrough, a team of researchers at the Rensselaer Polytechnic Institute has developed a nanoscale coating that safely and effectively kills MRSA bacteria upon contact.
The material, which is based on an enzyme found in nature, showed remarkable results in trial studies, in which latex paint was laced with it and painted onto a surface: 100 percent of MRSA bacteria were killed within 20 minutes of contact.
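Disinfection results like "100 percent killed within 20 minutes" are often summarized with first-order kill kinetics. A toy calculation of what such a kill rate would imply (assumed numbers for illustration, not data from the Rensselaer study):

```python
import math

# First-order kill model: N(t) = N0 * exp(-k * t).
# Suppose a 6-log reduction -- down to the detection limit -- in 20 minutes.
N0 = 1e6          # starting colony-forming units (assumed)
t_kill = 20.0     # minutes to reach the detection limit
k = math.log(N0 / 1.0) / t_kill   # rate constant for a 6-log drop

half_life = math.log(2) / k
print(f"k = {k:.3f} per minute; population halves every {half_life:.1f} min")
```

Under those assumptions the surface would halve the bacterial population roughly every minute, which conveys how aggressive a 20-minute total kill is.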
MRSA is especially dangerous because it is resistant to antibiotics. Currently, no standard treatment path exists (source: CDC).
"We're building on nature," said Jonathan S. Dordick, the Howard P. Isermann Professor of Chemical and Biological Engineering, and director of Rensselaer's Center for Biotechnology & Interdisciplinary Studies, according to the university's statement. "Here we have a system where the surface contains an enzyme that is safe to handle, doesn't appear to lead to resistance, doesn't leach into the environment, and doesn't clog up with cell debris. The MRSA bacteria come in contact with the surface, and they're killed."
In the material, carbon nanotubes are combined with lysostaphin, a naturally occurring enzyme that non-MRSA strains of staph use to defend against the bacteria. The nanotube-enzyme mixture can be combined with many different types of surface coatings, such as paint.
"We asked ourselves: Were there examples in nature where enzymes can be exploited that have activity against bacteria?" Dordick said.
The nanoscale material pictured here is a mixture of carbon nanotubes and lysostaphin, a naturally occurring enzyme that fights MRSA (source: Rensselaer/Ravindra C. Pangule and Shyam Sundhar Bale).
The answer to this search was lysostaphin, which is produced by non-pathogenic strains of staph that are harmless to humans.
"It's very effective. If you put a tiny amount of lysostaphin in a solution with Staphylococcus aureus, you'll see the bacteria die almost immediately," said Ravi Kane, a professor in the Department of Chemical and Biological Engineering at Rensselaer.
"Lysostaphin is exceptionally selective," Dordick said. "It doesn't work against other bacteria, and it is not toxic to human cells."
Combining the enzyme with carbon nanotubes increases its ability to reach the harmful bacteria. "The more the lysostaphin is able to move around, the more it is able to function," Dordick explained.
The coating, which can be painted onto various hospital surfaces, kills 100 percent of MRSA bacteria on contact (source: Rensselaer/Ravindra C. Pangule).
Most antimicrobial agents lack effectiveness or are dangerous. Some leach into the environment, posing harmful side effects, while others clog up and lose their effectiveness over time. The anti-MRSA coating does neither.
"We spent quite a bit of time demonstrating that the enzyme did not come out of the paint during the antibacterial experiments. Indeed, it was surprising that the enzyme worked as well as it did while remaining embedded near the surface of the paint," Dordick said.
Although MRSA is resistant to man-made antibiotics, it is unlikely to become resistant to lysostaphin. "Lysostaphin has evolved over hundreds of millions of years to be very difficult for Staphylococcus aureus to resist," Kane said. "It's an interesting mechanism that these enzymes use that we take advantage of."
The researchers envision all high-risk hospital surfaces coated with the material, which can be washed repeatedly and has a dry storage shelf life of six months. The coating could save lives and reduce hospital stays for thousands of patients.
August 31, 2010
FAITH IS A COMBINATION OF THOUGHTS AND ACTIONS.
When you apply your faith in yourself, your faith in your fellow man, and your faith in God, the result is a positive course of action that when persistently followed will almost always lead to success. When you believe in your ideas and in your abilities, and you trust in the Infinite Intelligence of the universe, you know that your thoughts and deeds will ultimately lead to a successful conclusion. You cannot fail.
This positive message is brought to you by the Napoleon Hill Foundation. Visit us at http://www.naphill.org. We encourage you to forward this to friends and family.
Monday, August 30, 2010
Ring! It's Gmail's new voice feature calling
Taking a stab at Skype, Google enables Gmail users to make phone calls over the Net
Google announced Wednesday it is offering the ability to make phone calls over the Internet via its popular Gmail service.
Unlike Google's nearly two-year-old Gmail voice and video chat, which gives users an audio and visual experience online, the new calling feature allows users to dial phone numbers. With this move, Google is competing with Skype, which has long dominated this area.
"Starting today, you can call any phone right from Gmail," wrote Robin Schriebman, a Google software engineer, in a blog post. "We've been testing this feature internally and have found it to be useful in a lot of situations, ranging from making a quick call to a restaurant, to placing a call when you're in an area with bad reception."
Schriebman explained that making a phone call through Gmail works just like a normal phone. Users can click "Call phone" at the top of their chat list and enter a number or a contact name. She added that calls to anyone in the U.S. and Canada will be free "at least for the rest of the year." She said "very low rates" have been set up for calls to other countries.
So, does Google have the muscle to make Gmail a Skype killer?
Skype, a 7-year-old company, is used by individuals and companies to make video and voice calls over the Internet. According to Skype, its users made 6.4 billion minutes of calls in the first half of 2010.
While Google may be starting out behind in this competition, it has the benefit of its large Gmail user base.
"Skype could get hurt by this," said Dan Olds, an analyst with The Gabriel Consulting Group. "Skype has been offering the ability to call land lines and cell phones for years now. But having it integrated into Google's Gmail and, assumedly, their other offerings down the road, is quite an extension for Google."
Olds added that Google, always on the lookout for new streams of revenue, is looking to expand its reach over its customers and to move into complementary markets that will draw more revenue.
"Adding voice calls to their existing product set enhances the user experience and keeps people using Google apps longer and more frequently," Olds said. "It also keeps people from using another service like Skype, and it certainly may prompt some defections from Skype. Google definitely has the scale and reach to put a big dent in Skype if Google can deliver on the service side."
The voice calling feature is expected to be rolled out to U.S.-based users over the next few days, according to Google. Users will need to install Google's voice and video plug-in and watch for the "Call Phones" button to appear on their chat list.
Thursday, August 26, 2010
- Android Apps to Lend a Hand on the Battlefield
- Android apps may be great for workplace productivity and fun and games. But a tech-product firm is now developing Droid solutions that are expected to help military troops in the heat of conflict.
For the general public, top Android apps include those that allow you to manage work docs, grab stock quotes and integrate your Android device with iTunes.
But for U.S. servicemen and women serving in global hot spots, Android mobile apps may soon provide a greater purpose—such as allowing a soldier to track down squad members after a battle to make sure they’re safe.
At least that’s what PDT, a Lake Zurich, Ill.-based global product development firm, is working on. The firm views Android mobile apps as the next great tool for military customers, as the platform’s expanding hardware and software ecosystem allows for customizable options well suited to Armed Forces operations.
“The U.S. military’s embrace of Android has given us the tools necessary to make applications useful for soldiers,” says Jim Curtin, Manager for PDT’s Defense Systems Program, according to a PDT press release. “These tools will allow for intuitive action in stressful situations, and will be inventive compared to current and previous approaches.”
Here are concepts now being hatched in the PDT lab:
ROVER Viewers: An app that would stream video feeds from unmanned aerial vehicles (UAVs) and fighter aircraft to dismounted soldiers. It would also support recording, playback, and other functions.
Situational Awareness Applications: These apps would let soldiers track and monitor the condition of other squad members in the thick of battle.
Squad TOE Management: This would be a logistics app that would monitor how much food/water/ammo is available for a squad in the field, and what the current consumption rate is.
There are key factors driving the need for Android application development for the military community, Curtin says. First, Android’s flexible Linux structure allows developers to program powerful applications that can support a variety of interfaces—anything from disaster-relief optimization to supplier integration and wartime navigation. Android is also designed for control, which is always important when it comes to battlefield operations.
“The absence of a closed-vendor application distribution system—along with the ability to apply modifications to program source code at will—has made Android the most transparent platform available,” Curtin says. “Additionally, potential government cost savings realized from the transition could see reinvestment in ground support where our troops need it most.”
Also, the increasing scope and complexity of military software over the last decade has made it almost impossible to port existing methods to mobile environments. Android apps, however, have proven more adaptable to military needs.
“Whereas before, most soldiers were stuck with a multitude of devices—each for a separate function—now they can combine them all into a single unit that fits neatly in their uniforms,” states Curtin. “They can rely on a tool that manages humanitarian and disaster relief missions, intelligence and surveillance concerns, translation, and planning/logistics coordination.”
Tuesday, August 24, 2010
5 indispensable IT skills of the future
In the year 2020, technical expertise will no longer be the sole province of the IT department. Employees throughout the organization will understand how to use technology to do their jobs.
Yet futurists and IT experts say that the most sought-after IT-related skills will be those that involve the ability to mine overwhelming amounts of data, protect systems from security threats, manage the risks of growing complexity in new systems, and communicate how technology can increase productivity.
1. Analyzing Data
By 2020, the amount of data generated each year will reach 35 zettabytes, or about 35 million petabytes, according to market researcher IDC. That's enough data to fill a stack of DVDs reaching from the Earth to the moon and back, according to John Gantz, chief research officer at IDC.
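IDC's unit conversion checks out; zettabytes and petabytes differ by a factor of one million:

```python
ZB = 10 ** 21   # bytes in a zettabyte
PB = 10 ** 15   # bytes in a petabyte

yearly_data = 35 * ZB
print(f"{yearly_data // PB:,} petabytes")   # 35,000,000 -- i.e. 35 million
```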
Demand will be high for IT workers with the ability to not only analyze dizzying amounts of data, but also work with business units to define what data is needed and where to get it.
These hybrid business-technology employees will have IT expertise and an understanding of business processes and operations. "They are people who understand what information people need" and how that information translates into profitability, says David Foote, president and CEO of IT workforce research firm Foote Partners LLC. "You'll have many more people understanding the whole data 'supply chain,' from information to money," he says.
2. Understanding Risk
Risk management skills will remain in high demand through 2020, says futurist David Pearce Snyder, especially at a time when business wrestles with growing IT complexity. Think of IT problems on the scale of BP's efforts to stop the Gulf of Mexico oil spill, or Toyota's work to correct sudden acceleration in some of its cars, Snyder says.
"When you're in the time of rapid innovation," which is happening now and will continue into 2020, he contends, "you run into the law of unintended consequences -- when you try something brand-new in a complex world, you can be certain that it's going to produce unexpected consequences." Businesses will seek out IT workers with risk management skills to predict and react to these challenges.
3. Mastering Robotics
Robots will have taken over more jobs by 2020, according to Joseph Coates, a consulting futurist in Washington. IT workers specializing in robotics will see job opportunities in all markets, he adds.
"You can think of [robots] as humanlike devices, but you have to widen that to talk about anything that is automated," Coates says. Robotics jobs will involve research, maintenance and repair. Specialists will explore uses for the technology in vertical markets. For example, some roboticists might specialize in health care, developing equipment for use in rehabilitation facilities, while others might create devices for the handicapped or learning tools for children.
4. Securing Information
Since we're spending more and more time online, verifying users' identities and protecting privacy will be big challenges by 2020, because fewer interactions will be face-to-face, more personal information may be available online, and new technologies could make it easier to impersonate people, according to a report by PricewaterhouseCoopers. Teleworkers will also represent a larger portion of the workforce, opening up a slew of corporate security risks.
"We're in a dangerous place," because many employees are tech-savvy, yet they "don't understand the first thing about data security," Foote explains. "That will change in 2020," when companies will cast an even wider net over data security -- including the data center, Internet connectivity and remote access, he predicts.
5. Running the Network
Network systems and data communications management will remain a top priority in 2020, but as companies steer away from adding to the payroll, they will turn to consultants to tell them how to be more productive and efficient, says Snyder, who follows predictions from the U.S. Bureau of Labor Statistics.
"You have already cut as many people as you can, so now you can only increase productivity," he says. "Someone has to come in here and tell me how to better use the technology that I have."
- How to Discover a Neutron Star with Your Home Computer
- Three users in Iowa and Germany have discovered a new radio pulsar by using a distributed computing program called Einstein@Home, which analyzes astronomical data while a computer is idle.
When the Colvins, a couple from Iowa, received several e-mails about how their computer had made a major scientific discovery, they deleted them.
"It turns out that they thought the e-mails were spam," Bruce Allen, director of Germany's Max Planck Institute for Gravitational Physics, told MSNBC. But when the Colvins were sent a registered letter by FedEx, they realized the e-mails were, in fact, authentic. On June 11, the home computer of the husband and wife, who are both IT professionals, discovered a radio pulsar 17,000 light years away.
The Colvins had installed a program called Einstein@Home onto their personal computer. Einstein@Home uses home computers to process data from two major astronomical sensors, the Laser Interferometer Gravitational Wave Observatory (LIGO) in the United States and GEO 600 in Germany. These two detectors search the sky for pulsars and gravitational waves, which Einstein predicted would indicate the presence of exploding stars, black holes, and other violent events.
The Einstein@Home screensaver processes data from major astronomical detectors (source: Einstein@Home).
Because a vast amount of data is collected by the detectors, LIGO created Einstein@Home to help process the information. Computer owners can download the program, which receives data from a central server, onto their home machines. When computers are idling, the software processes data from LIGO, returns it to the server, and receives more to analyze. Einstein@Home, which is currently being used by nearly 300,000 computer owners worldwide, only downloads two megabytes of astronomical data at a time, so it does not affect a machine’s performance.
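The work cycle described above — fetch a small slice, crunch it while the machine is idle, report back — can be sketched as a minimal client loop. All names here are hypothetical; the real Einstein@Home client is built on the far more elaborate BOINC framework:

```python
import time
from dataclasses import dataclass

@dataclass
class WorkUnit:
    data: bytes  # one small (~2 MB) slice of detector output

class MockServer:
    """Stand-in for the project's central server (hypothetical API)."""
    def __init__(self, chunks):
        self.chunks = list(chunks)
        self.results = []

    def next_chunk(self):
        # Hand out the next slice of data, or None when no work remains.
        return WorkUnit(self.chunks.pop(0)) if self.chunks else None

    def report(self, result):
        self.results.append(result)

def analyze(unit):
    # Placeholder for the real search over the data slice.
    return sum(unit.data) % 256

def volunteer_loop(server, is_idle=lambda: True):
    while True:
        if not is_idle():
            time.sleep(60)            # the owner is using the machine; wait
            continue
        unit = server.next_chunk()    # download the next work unit
        if unit is None:
            break                     # no work left on the server
        server.report(analyze(unit))  # upload the result, then loop

demo = MockServer([b"ab", b"cd", b"ef"])
volunteer_loop(demo)
print(len(demo.results), "work units processed")  # 3 work units processed
```

Because each work unit is tiny and the loop only runs when the machine is idle, the owner never notices the analysis happening.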
On Aug. 12, the journal Science announced the discovery of the pulsar by Einstein@Home. This marks the first finding the program has made, but it will not likely be the last.
"It was a bit like winning the lottery," Helen Colvin told Nature. "The odds aren't in your favor."
A pulsar is a neutron star that spins rapidly and regularly emits a beam of electromagnetic radiation. When the first pulsar was discovered in 1967, some scientists thought it was evidence of extraterrestrial life, since its radiation was so unnaturally regular. It was even named LGM-1, standing for “little green men.” Soon, however, scientists worked out the true nature of pulsars, which have since played a key role in testing general relativity, demonstrating gravitational radiation, and revealing the existence of other planetary systems.
Pulsars regularly emit beams of electromagnetic radiation (source: NASA).
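That uncanny regularity is exactly what search pipelines exploit: a periodic signal far too weak to spot in the raw detector output shows up as a sharp spike in the data's Fourier spectrum. A toy illustration with made-up numbers (nothing like a real pulsar search, which must also correct for effects such as Doppler drift):

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 1000.0                       # sample rate in Hz
t = np.arange(0, 10, 1 / fs)      # 10 seconds of data
f_pulsar = 7.0                    # made-up pulse frequency

# A periodic signal buried in noise five times its amplitude...
data = 0.2 * np.sin(2 * np.pi * f_pulsar * t) + rng.normal(0, 1, t.size)

# ...stands out unmistakably in the frequency domain.
spectrum = np.abs(np.fft.rfft(data))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
peak = freqs[1:][np.argmax(spectrum[1:])]   # ignore the DC bin
print(f"strongest periodicity near {peak:.1f} Hz")
```

Distributing slices of such spectra across thousands of idle home computers is, in essence, what Einstein@Home does.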
The pulsar discovered by Einstein@Home, now named PSR J2007+2722, was first observed three years ago by the Arecibo Observatory in Puerto Rico. Three days after the Colvins’ computer discovered the star, the finding was confirmed by the computer of Daniel Gebhardt in Mainz, Germany.
"This is a thrilling moment for Einstein@Home and our volunteers," Allen said in a news release. "It proves that public participation can discover new things in our universe. I hope it inspires more people to join us to help find other secrets hidden in the data."
The discovery brings hope to other distributed computing efforts. SETI@Home, for example, uses home computers to search for extraterrestrial life.
By installing Einstein@Home on your personal computer, you too could be the discoverer of an unusual astronomical body. Find out more at the Einstein@Home website.
Sprint offering free femtocells
Sprint offering femtocells to customers in poor-coverage areas
It's an age-old truth: If you want people to adopt a new technology, give it to them for free.
While femtocells still haven't caught on like many manufacturers had hoped, Sprint is now seemingly opening the door to far wider adoption by offering users living in areas with poor coverage free femtocells.
According to a report in Fierce Wireless, Sprint will be reviewing coverage quality in customer locations on a case-by-case basis to evaluate whether they will qualify for a free Airvana 3G EV-DO Rev. A femtocell. Users who receive free femtocells from Sprint will have to return them to the company if they cancel their service, Sprint spokesman Mark Elliott told Fierce Wireless.
Femtocells are essentially small cellular access points that route nearby wireless voice and data traffic through preexisting broadband connections. In this way, femtocells can provide VoIP for wireless handsets that can both improve call quality and save money by letting users make calls without using up their cell minutes. Femtocells also have benefits for carriers, as they let wireless companies offload traffic from their own networks and onto wired IP networks.
So far, however, carriers haven't been all that successful in selling femtocells to their customers. A recent report from research firm Infonetics Research found that vendors sold approximately 17,000 femtocells in 2009, well below their expectations. Infonetics analyst Richard Webb said at the time that carriers have tried marketing femtocells to their customers by focusing on their ability to improve call quality within homes. The problem with this, Webb claimed, is that most people have strong call quality in their homes already and don't see the need to spend more than $100 on equipment to improve it.
Woojune Kim, Airvana's vice president of technology, told Network World this past year that he expected it was only a matter of time before carriers figured out the proper pricing models that would lead to widespread adoption. Apparently, the best pricing model for many customers is "free," as femtocells provide enough of a boost to carriers that it makes sense to give them away to customers who need them. Studies from carriers such as Japan's NTT DoCoMo have shown that 70% of mobile traffic comes from users located in buildings, so if carriers can offload the majority of their indoor mobile traffic through femtocells, it will leave them considerably more bandwidth to serve customers consuming voice and data services outdoors.
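The DoCoMo figure makes the offload arithmetic easy to sketch (the 80 percent capture rate below is our own assumption, purely illustrative):

```python
# NTT DoCoMo's finding: ~70% of mobile traffic originates indoors.
# If femtocells carried most of that over wired broadband, the macro
# network would keep only the remainder.
indoor_share = 0.70
offload_rate = 0.80   # assumed fraction of indoor traffic femtocells absorb

remaining = 1.0 - indoor_share * offload_rate
print(f"macro network carries {remaining:.0%} of its former load")  # 44%
```

Even a partial capture of indoor traffic more than halves the load on the carrier's towers, which is why giving the hardware away can still pay off.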
Monday, August 23, 2010
Technology's Biggest Myths
We put myths to the test to find the truth behind tech's tallest tales.
As it turns out, Windows Vista really wasn't all that slow; and no, your PC probably won't fry if you open it up without wearing a wrist strap. Thanks in large part to the Internet, the tech world is teeming with lies, half-truths, and misinformation. We've dug up some of the Web's most notorious nuggets of conventional wisdom to see which hold up to scrutiny and which are merely urban legends.
Of course, there's often a grain of truth in even the most fanciful myth. That's why we provide a handy-dandy set of numbered warning signs to indicate how accurate each of these myths is, with 1 being True and 4 being Outrageous--a complete fabrication. After all, they say numbers never lie.
The Claim: Vista Is Slower Than Windows 7
When Windows Vista came out, it soon acquired a reputation for being slow and a resource hog. Once Windows 7 arrived, people were quick to tout it as the speedy, slim operating system that Vista should have been.
We conducted performance tests on a handful of laptops and desktops using both 32-bit and 64-bit versions of Vista and Windows 7, shortly after the latter OS was released. While results varied across configurations, a few trends stood out. Windows 7 raised WorldBench 6 scores from 1.25 percent to almost 10 percent (but most often in the vicinity of 2 to 3 percent); it also resulted in much faster disk operations (in Windows 7 our Nero disc-burning software tests ran twice as fast on an IdeaPad laptop, and 2.5 times as fast on a Gateway laptop), and in slightly longer battery life (the IdeaPad lasted only an extra minute; the Gateway got an extra 15 minutes).
While Windows 7 did seem to speed things up somewhat, a few tests actually showed some slowdown. Applications launched more slowly across the board (the most dramatic change was a 2.7-second Photoshop CS4 launch in Vista turning into a 9.6-second launch in Windows 7), and the Gateway laptop saw a slight increase in startup time (39.6 seconds in Vista; 43.6 seconds in Windows 7).
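Expressed as ratios, the slowdowns quoted above look like this (simple arithmetic on the article's own numbers):

```python
launch_vista, launch_win7 = 2.7, 9.6   # Photoshop CS4 launch time, seconds
boot_vista, boot_win7 = 39.6, 43.6     # Gateway laptop startup time, seconds

print(f"Photoshop launch: {launch_win7 / launch_vista:.1f}x slower")  # 3.6x
print(f"Startup: {boot_win7 / boot_vista - 1:.0%} longer")            # 10%
```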
As it turns out, the "snappy" feeling Windows 7 engenders has to do with Registry tweaks and minor changes to the window manager that make the OS feel more responsive, even though it isn't that different.
The verdict: Windows 7 is faster, but not by as much as you may think.
(Warning: 2, Mostly True)
The Claim: All Smartphones Suffer Signal Loss From a 'Grip of Death'
When early iPhone 4 adopters discovered that touching a certain spot on the exposed antenna could cause the phone to lose signal strength, reduce data speeds, and even drop calls, Apple insisted that all smartphones suffered from a similar defect.
We tested that claim with five different smartphones. We looked at RF signal strength, data speed rates, and call quality in areas with weak and strong signals.
While every phone we tested was affected by a "grip of death," none went so far as to drop calls, as the iPhone 4 did. Bottom line: If you don't have an iPhone 4, you don't need to worry too much about this antenna issue.
(Warning: 2, Mostly True)
The Claim: The Desktop PC Is Dying
Sure, laptops are cheaper and more powerful than ever, and can meet all your basic computing needs. But saying that the desktop is on its deathbed is like saying that, since all most people need is a Geo Metro, the pickup truck is obsolete. Power users who need desktop-caliber performance in a laptop must pay a significant premium, and if they want a Blu-ray drive, a better GPU, or a 3D display, they must buy a new model. Also, people who like to tinker with their PCs have fewer options with laptops than they do with desktops.
Meanwhile, the desktop PC market is evolving to meet users' demands. People who want a larger display but don't like the looks of a tower can buy an all-in-one system. Others want a computer that fits nicely next to their 50-inch HDTV--a home theater PC. And students, who typically benefit most from a laptop, can buy both a solid all-in-one PC for gaming and movies (ahem--"multimedia projects") and a cheap, lightweight netbook for taking notes in class for the same price as a single moderately powerful laptop (which would be more expensive to replace if it were broken, lost, or stolen).
(Warning: 3, Dubious)
The Claim: High-Priced HDMI Cables Make Your HDTV Look Better
When you plunk down $1200 (or more) for a new HDTV and $300 for a Blu-ray player, it can be easy for a salesperson to guilt you into tacking a $150 HDMI cable onto your purchase--after all, your brand-new gear needs a good cable to get the image quality you're paying for, right? If you're lucky, you'll have the alternative of buying the "cheap" store-brand cable, at a cost of only $30 and a disapproving look from the cashier. Well, feel free to take that $150 and spend it on popcorn for the movies you'll be watching--your HDTV won't care which HDMI cable you use.
High-quality cables have been a staple of the audio/video business for decades, and for good reason: An analog audio or video signal is susceptible to interference and disruption as it travels from one device to another, so the image data leaving your DVD player isn't 100 percent identical to the image that shows up on your TV--parts of the signal can get lost along the way.
However, digital audio/video standards like DisplayPort, DVI, and HDMI don't have this problem because the data being transmitted over the cable isn't as sensitive as an analog signal; it consists entirely of ones and zeros, and a tremendous drop in signal voltage has to occur before a one starts to look like a zero at the receiving end. When this does happen, you'll usually see some kind of white static "sparklies" on your TV, as the set attempts to fill in the blanks itself, but this typically happens only over very long HDMI runs (8 meters and up). For shorter cables, the cable quality shouldn't matter.
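To make that threshold idea concrete, here's a toy simulation of a receiver that reads any voltage above 0.5 V as a one. This is purely illustrative--real HDMI (TMDS) signaling is more involved--but it shows why a digital bit stream survives substantial attenuation untouched, with errors appearing only after a dramatic drop in signal strength.

```python
import random

# Toy model of a digital video link: bits are sent as 1.0 V / 0.0 V pulses,
# and the receiver reads anything above a fixed 0.5 V threshold as a one.
# Illustrative only; real HDMI (TMDS) signaling is more involved.
random.seed(0)
bits = [random.randint(0, 1) for _ in range(1000)]

def receive(bits, attenuation, noise=0.05):
    """Attenuate each pulse, add a little random noise, then threshold at 0.5 V."""
    out = []
    for b in bits:
        voltage = b * attenuation + random.uniform(-noise, noise)
        out.append(1 if voltage > 0.5 else 0)
    return out

for attenuation in (1.0, 0.8, 0.6, 0.45):
    errors = sum(sent != got for sent, got in zip(bits, receive(bits, attenuation)))
    print(f"signal at {attenuation:.0%} strength: {errors} bit errors")
# At 60 percent of the original voltage the stream is still error-free; only
# below roughly half strength do ones start reading as zeros (the "sparklies"
# the article describes).
```

An analog signal would degrade visibly at every step of that attenuation sweep; the digital link is perfect right up until it isn't.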
That explanation rarely succeeds in silencing the home-theater enthusiasts (and home-theater salespeople) who swear that they see a difference between the good stuff and the cheap stuff, so we decided to check them out ourselves to see whether cost made a difference. We tested two pricey HDMI cables--the Monster HD1000 ($150) and the AudioQuest Forest ($60)--against a couple of bargain-basement cables from Blue Jeans Cable (the 5001A-G, $5) and Monoprice (the 28AWG, $3.04).
After testing different kinds of high-def video clips (including clips of football broadcasts and selections from The Dark Knight on Blu-ray), we ended up with all four cables in a dead heat: Blue Jeans Cable, Monoprice, and Monster all saw an average rating of 3.5 out of 5, with AudioQuest trailing ever so slightly at 3.4--close enough to practically be a rounding error. So save your money and stick to the cheaper cables unless you need the cables to cover a long distance.
(Warning: 4, Outrageous)
The Claim: LCDs Are Better Than Plasma Screens for HDTV Sets
Don't believe the hype: Your local HDTV salespeople may be trying to upsell you on a spiffy new LCD, but there are plenty of reasons to pick a plasma instead. Plasmas still handle darker scenes better, have a wider range of viewing angles, and are generally cheaper than LCDs (especially at larger sizes). Panasonic and Samsung continue to manufacture plenty of plasma sets (including a line of home 3D TVs and a gigantic, superexpensive 152-inch 3D display). You can read more about plasma vs. LCD displays.
LCDs are catching up in a few respects, however. LCD sets with LED backlighting and higher refresh rates can compensate for some of the traditional problems of LCDs, and they suck up significantly less power than plasma sets do, so the higher price may be offset over time in your electricity bill.
Despite the remaining advantages of plasma, it's worth noting that some manufacturers are dropping out of the plasma display market (Pioneer, most notably, and Vizio), and California plans to ban power-hungry TVs--so the writing is undeniably on the wall: Plasma isn't dead yet, but it may be finished in a few years.
(We have more on HDTV myths here.)
(Warning: 3, Dubious)
The Claim: More Bars on Your Cell Phone Means Better Service
The signal bars on your cell phone display indicate the strength of your cellular signal to the nearest tower. But if you're connected to a tower that lots of other people are connected to, you could have a strong signal and still have poor service, since everyone's calls are competing for scarce network resources. Once your information arrives at the cellular tower from your phone, it has to travel through your service provider's backhaul network (which connects the tower to the Internet). And if your provider's network isn't up to snuff, you could have a flawless connection to an empty cell tower, and yet still encounter poor speeds and dropped calls.
When we tested 3G service in 2009, we found that signal bars were poor indicators of service quality in 12 of the 13 cities in which we tested. In San Francisco, for one, signal bars correlated with service quality in only 13 percent of test results. Additionally, if you use an iPhone, you might just be seeing inaccurate readings. Apple recently announced (in connection with the iPhone 4 antenna issue) that the formula it had been using in all iPhones to display signal strength was "totally wrong" and often reported the signal as two bars higher than it should have. Oops.
(Warning: 3, Dubious)
The Claim: Over Time, Inkjet Printers Are Much More Expensive Than Laser Printers
To figure out how much a printer's consumables will cost you over time, you take the price of the ink or toner cartridge and divide by the estimated page yield per cartridge, for your cost per page. Traditionally, laser printers have had a higher initial purchase price, which was balanced by their lower cost per page versus inkjet printers.
However, as inkjet printer manufacturers began to release more efficient models (ones with separate ink tanks for each color, or higher-yield cartridge options), the cost-per-page gap has closed dramatically. Businesses needing cheap, fast printers, for example, could do well with either the Epson B-510DN inkjet (1.3 cents per black text page, 14.7 pages per minute, $600 retail price), or one of the more economical laser printer models, such as the Oki C610dtn (1.1 cents per black text page, 19.1 pages per minute, $700 retail price). Home users and students have fewer options--paying less for the printer means paying more for the ink. To its credit, the Canon Pixma iP4700 (2.7 cents per black text page, 7.4 pages per minute, $100 retail) has reasonably priced inks.
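The cost-per-page arithmetic is easy to run yourself. As a sketch using the prices and per-page costs quoted above (the page volumes are hypothetical), total cost of ownership is the purchase price plus cost per page times pages printed, and the break-even point is the laser's price premium divided by its per-page savings:

```python
def total_cost(printer_price, cents_per_page, pages):
    """Purchase price plus consumables cost for a given page count, in dollars."""
    return printer_price + (cents_per_page / 100) * pages

# Figures quoted above: Epson B-510DN inkjet vs. Oki C610dtn laser.
inkjet = lambda pages: total_cost(600, 1.3, pages)
laser = lambda pages: total_cost(700, 1.1, pages)

# Break-even: the laser's $100 premium divided by its 0.2-cent-per-page savings.
break_even = (700 - 600) / ((1.3 - 1.1) / 100)
print(f"break-even at roughly {break_even:,.0f} pages")

# Below that volume the inkjet wins; above it, the laser does.
print(inkjet(10_000), laser(10_000))    # inkjet cheaper at 10,000 pages
print(inkjet(100_000), laser(100_000))  # laser cheaper at 100,000 pages
```

For this particular pair, the laser doesn't pay for itself until around 50,000 pages--a useful reminder that "lasers are cheaper in the long run" depends entirely on how long your run is.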
Keep in mind that the inkjet printers you see going cheap with big mail-in rebates or included with laptop purchases generally aren't the type that can hang with a laser printer in speed and costs. Instead, you'll end up paying more in the long run via expensive, low-yield ink cartridges--to the point where it can even be cheaper to buy a new printer than to refill the ink in your old one.
(Warning: 4, Outrageous)
The Claim: People With More Monitor Space Are More Productive
Begging your boss for an extra display at work? You might sell her on the idea if you tell her that you'd be 30 to 50 percent more productive than you are on your single 18-inch display. At least, that's what a 2008 study from the University of Utah (commissioned by NEC, mind you) found for text and spreadsheet tasks.
NEC, naturally, was quick to trumpet the results as a way to move more of its widescreen displays. However, the study also found a point of diminishing returns. Productivity gains fall in a bell-curve distribution once you hit a certain amount of screen space. For a single-monitor setup, over 26 inches is too much, while dual-display gains top out at 22 inches.
In addition, the pattern of the results implies that while a second monitor can make you a wunderkind at work, you shouldn't even think about adding a third. Interestingly, users' reported preference did not predict their performance; that is, the setup they liked wasn't necessarily the one they worked best with.
So think about what you'd be using that second display for. The University of Utah study took place in a controlled environment, where the subjects did nothing but the text and spreadsheet tasks they were assigned. If that sounds like your office, you'll probably do great with a second monitor.
If you're planning on using that second display for e-mail, Twitter, or other Internet-related distractions, however, you're probably going to end up being less productive overall. (I certainly am.)
(Warning: 2, Mostly True)
The Claim: Refilled Ink Cartridges Will Ruin Your Printer
Taking your printer's ink cartridge to a refill service can save you a few bucks. But because cartridges aren't designed to be reused, refilling has risks: Nozzles could clog, or the ink tank could spring a leak. A good rule of thumb is to monitor a refilled cartridge closely so you can catch problems early. The cartridge or printhead might be a goner, but you're unlikely to cause permanent damage to the printer itself unless the cartridge leaks and you don't clean it up.
Note that refills done by a third party typically come with a guarantee that covers the cartridge (which may cost anywhere from $10 to $20)--but not necessarily the printer. The Cartridge World ink refill chain, for example, guarantees to repair a faulty cartridge or credit the cost against a new cartridge, but if your printer bites the bullet, the company can only "provide advice or a qualified service technician to address any issues."
Refill companies also like to remind you that it is illegal for your printer manufacturer to void the warranty on your printer for using third-party cartridges. True enough, but warranty agreements we've seen suggest that if a refill cartridge breaks your printer, you shouldn't expect a free fix. For example, the HP warranty agreement explicitly states:
"For HP printer products, the use of a non-HP ink cartridge or a refilled ink cartridge does not affect either the warranty to the customer or any HP support contract with the customer. However, if printer failure or damage is attributable to the use of a non-HP or refilled ink cartridge, HP will charge its standard time and materials charges to service the printer for the particular failure or damage."
If you're worried about leaks, pull the cartridge out of the printer occasionally to see if any excess ink is pooling near where the cartridge rests in the printer.
(Read more on people who do their own refilling.)
(Warning: 2, Mostly True)
The Claim: Internet Explorer Is Less Secure Than Other Browsers
Everyone "knows" that Chrome, Firefox, and Safari are all way more secure than Internet Explorer. But what's the real story?
To find out, I first looked up Symantec's twice-yearly Internet Security Threat Report, which yielded the total numbers of reported vulnerabilities for 2009: Firefox had the most at 169, followed by 94 for Safari, 45 for IE, and 41 for Google Chrome. For more-recent data, I turned to the United States Computer Emergency Readiness Team (US-CERT), which hosts the National Vulnerability Database, a searchable index of reported computer vulnerabilities. A search of data for a recent three-month period yielded 51 such vulnerabilities for Safari (including both mobile and desktop versions), 40 for Chrome, 20 for Firefox, and 17 for IE.
Such counts alone aren't the best way to measure a browser's security, however. A browser with 100 security flaws that are patched a day after being discovered is safer than a browser with only one exploit that hasn't been patched for months.
According to Symantec's report, the average window of vulnerability (the time between when the flaw is reported and when it's patched) in 2009 was less than a day for IE and Firefox, 2 days for Google Chrome, and a whopping 13 days for Safari. Clearly, Internet Explorer is doing fairly well. Nevertheless, you should still consider a few important factors before deciding to jump ship back to IE.
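One rough way to combine the two measures is to multiply each browser's 2009 flaw count by its average window of vulnerability, yielding a "flaw-days of exposure" figure. (This is our own back-of-the-envelope metric, not something Symantec publishes, and it treats "less than a day" as one day.)

```python
# 2009 reported vulnerabilities and average days until patched, both quoted
# above from Symantec's report ("less than a day" rounded up to 1).
# "Exposure" here is our own rough illustrative metric, not Symantec's.
vulns = {"Firefox": 169, "Safari": 94, "IE": 45, "Chrome": 41}
patch_window_days = {"Firefox": 1, "Safari": 13, "IE": 1, "Chrome": 2}

exposure = {b: vulns[b] * patch_window_days[b] for b in vulns}
for browser, days in sorted(exposure.items(), key=lambda kv: kv[1]):
    print(f"{browser}: ~{days} flaw-days")
# IE lands lowest (45) and Safari highest (1222), echoing the point that
# patch speed matters more than raw flaw counts.
```

By this crude yardstick, Safari's slow patching dwarfs Firefox's larger flaw count--which is exactly why counting vulnerabilities alone is misleading.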
Stay updated. The second most common Web-based attack in 2009 exploited an IE security flaw patched way back in 2004 (the 2009 attack targeted unpatched PCs). The latest version of IE 8 may be pretty safe, but ditch any earlier version you have.
Your browser is only as secure as your plug-ins. Symantec found that Microsoft's ActiveX plug-in (enabled by default in IE) was the least secure with 134 vulnerabilities, followed by Java SE with 84, Adobe Reader with 49, Apple QuickTime with 27, and Adobe Flash Player with 23. The moral: Be careful at sites that use browser plug-ins.
It's tough to be on top. IE still has the biggest piece of the browser pie, meaning that cybercriminals are more likely to target IE than other browsers.
(Warning: 4, Outrageous)
The Claim: You're Safe If You Visit Only G-Rated Sites
If your PC has ever had a virus, you probably know about the raised-eyebrow, mock-judgmental looks you get when you tell that to other people. After all, if you had been a good little PC user and stayed in the G-rated Web, you would have been safe, right?
Not so, says Avast Software, maker of Avast, a popular antivirus suite. "For every infected adult domain we identify, there are 99 others with perfectly legitimate content that are also infected," its chief technology officer, Ondrej Vlcek, reports. In the United Kingdom, for example, users are far more likely to encounter infected domains with "London" in the name than with "sex."
So visiting a porn site doesn't necessarily mean you're opening yourself up to infection. Which makes sense--porn-site operators depend on subscriptions and repeat visitors to do business, and infecting your customers with spyware isn't the best way to keep them coming back.
If you find yourself on a generic-looking Website with popular search keywords in the title, or a site that's rearranging your browser window, you're likely to end up stuck with some malware--whether it's about porn or about hotels in London.
(Warning: 4, Outrageous)
The Claim: You Should Regularly Defragment Your Hard Drive
Your hard drive has to decide where to write your files on the drive platter, and as you fill up the drive, your files will be scattered more and more widely across the platter. This means that the drive's read/write heads take longer to find the whole file, since they take more time skipping around the platter to find the different parts of the fragmented file. However, this state of affairs isn't an issue these days, for several reasons:
Hard drives are bigger. When your hard drive capacity was measured in megabytes, fragmentation was a big deal. Not only did the drive's read/write heads have to move all over the platter, but the space freed up by deleting old files was also scattered, and new files could be dispersed across the small gaps between larger files.
People now generally have more hard drive space and use a smaller overall percentage of their drive, so the read/write heads don't have to move as much.
More RAM and optimized OSs help. Newer iterations of Windows have done a lot to reduce the impact that a fragmented hard drive can have on a PC's performance. According to the engineers who worked on Windows 7's updated Disk Defragmenter tool, Windows' file system allocation strategies, its caching and prefetching algorithms, and today's relative abundance of RAM (which permits the PC to cache actively used data rather than having to write repeatedly to the drive) all minimize fragmentation delays.
Solid-state drives don't need to be defragmented. SSDs don't have a drive platter or read/write heads that need to go searching around the drive. In fact, defragmenting is generally not recommended for SSDs because the extra write operations wear down the drive's flash memory cells, shortening the drive's overall lifespan.
You don't need to go out of your way to defrag. In Windows Vista and Windows 7, the system automatically handles defragging. By default, defragging happens at 1:00 a.m. every Wednesday, but if your PC isn't on or is in use, the process will occur in the background the next time the machine is idle. It will stop and start automatically, too, so don't worry about interrupting it.
We didn't notice a difference. When we last tested disk defragmentation, we took a heavily used, never-defragmented system from the PCWorld Labs, ran speed tests before and after defragging, and found no significant difference.
(Warning: 4, Outrageous)
You Probably Know This, But...
...overclocking your PC's processor won't make your computer blow up. Overclocking can generate excess heat, which may cause erratic PC performance and, over time, burn out certain components. But even in the worst-case scenario, your system will shut down before it blows up. Newer Intel and AMD processors automatically overclock and underclock themselves, depending on how busy your PC is, to keep things cool.
...your cell phone isn't going to cause an airplane to crash, though the Federal Aviation Administration still has a ban on using cell phones during flight to avoid interfering with the plane's navigation and communication systems. In fact, the Federal Communications Commission instituted its own ban in 2007 for a different reason: When we're on the ground, our cell phones automatically locate the closest cell tower, but when we're 30,000 feet in the air, we're roughly the same distance from several different towers at once, meaning that multiple towers might sense our call and reserve that cellular channel for us--which could prevent other people from using the tower and interfere with existing calls.
...you don't have to worry about magnets annihilating your hard drive. Magnets were dangerous for 3.5-inch floppy disks, but modern hard drives aren't affected by anything short of a high-end degaussing device. Don't worry about your flash memory cards and solid-state drives, either--there's nothing magnetic about flash memory, so such devices won't be affected.
...you shouldn't run your laptop battery to zero. Users occasionally had to drain the nickel-metal hydride (NiMH) batteries in older laptops because the batteries would incorrectly "remember" how much charge they could hold if the user wasn't charging a battery to its full capacity. The lithium ion batteries in modern laptops, however, can actually lose maximum charge capacity if they are completely drained, because a full drain increases the battery's chemical resistance to recharging, which shortens its lifespan. The only time that you should consider running your lithium ion battery to zero is if your PC's battery life ratings have gone completely haywire. Draining the battery can sometimes fix this problem.
Don't Be Fooled Again
All fired up about demystifying tech-related myths? A few other sites can help.
Snopes.com is good for tracking annoying chain letters and the occasional Facebook-related scare. If friends and family pester you about such things, sending them a few links to Snopes might help.
HowStuffWorks.com has a special "tech myths" section that deals specifically with some of the more popular misconceptions in the tech world.
PCWorld Forums is also worth visiting. It's one of the best venues where hard-core PC users congregate to swap stories and advice. Ask away and get plenty of answers.
Trojan blamed for Spanish air crash
Critical safety system not working, says report
A plane crash that killed 154 people in 2008 might have been partly connected to the infection of an important ground safety system by malware, a Spanish newspaper has claimed.
The Spanair plane took off from Madrid to fly to the Canary Islands on 20 August 2008, but failed to clear the runway. Of the 172 passengers and aircrew on board, only 18 survived.
The precise cause of the crash remains contentious but was believed by investigators to relate to the MD-82 not having its flaps set to the correct position prior to takeoff.
Investigators believe that the pilots twice failed to spot that the flaps were set in the incorrect position for takeoff; a ground system used by the airline should then have caught the error and sounded the alarm.
According to the newspaper El Pais, on the day of the crash this system was not functioning, owing to an infection by unidentified computer Trojans.
If the analysis is confirmed, it will be the first known example of malware being directly connected to fatalities. Equally, it could be pointed out that if a critical safety-check system is inoperable, human intervention should be required to perform that function.
Reported malware infection of critical systems is still officially a rare event. In 2008, the International Space Station was hit by a computer worm, brought on board by one of the Russian crew, that infected laptops on the orbiting mission.
Thursday, August 19, 2010
Five billionth device about to plug into Internet
Sometime this month, the 5 billionth device will plug into the Internet. And in 10 years, that number will grow by more than a factor of four, according to IMS Research, which tracks the installed base of equipment that can access the Internet.
On the surface, this second tidal wave of growth will be driven by cell phones and new classes of consumer electronics, according to an IMS statement. But an even bigger driver will be largely invisible: machine-to-machine communications in various kinds of smart grids for energy management, surveillance and public safety, traffic and parking control, and sensor networks.
Earlier this year, Cisco forecast equally steep growth rates in personal devices and overall Internet traffic. [See "Global IP traffic to increase fivefold by 2013, Cisco predicts"]
Today, there are over 1 billion computers that regularly connect to the Internet. That class of devices, including PCs and laptops and their associated networking gear, continues to grow.
But cellular devices, such as Internet-connected smartphones, have outstripped that total and are growing at a much faster rate. Then add in tablets, eBook readers, Internet TVs, cameras, digital picture frames, and a host of other networked consumer electronics devices, and the IMS forecast of 22 billion Internet devices by 2020 doesn’t seem farfetched.
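As a quick sanity check on that forecast (our arithmetic, not a figure from IMS), going from roughly 5 billion devices to 22 billion over a decade implies a compound annual growth rate of about 16 percent:

```python
# Implied compound annual growth rate (CAGR) for the IMS numbers quoted
# above: ~5 billion devices today growing to ~22 billion in 10 years.
# Our own back-of-the-envelope calculation, not an IMS figure.
start, end, years = 5e9, 22e9, 10
cagr = (end / start) ** (1 / years) - 1
print(f"implied growth: {cagr:.1%} per year")  # roughly 16% per year

# Compounding forward reproduces the forecast, as expected.
print(f"{start * (1 + cagr) ** years / 1e9:.1f} billion devices")
```

A steady 16 percent a year is aggressive but hardly unprecedented for a technology in its adoption phase.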
The research firm projects that in 10 years, there will be 6 billion cell phones, most of them with Internet connectivity. An estimated 2.5 billion televisions today will largely be replaced by TV sets that are Internet capable, either directly or through a set-top box. More and more of the world’s one billion automobiles will be replaced by newer models with integrated Internet access.
Yet, the greatest growth potential is in machine-to-machine, according to IMS President Ian Weightman. Research firm Gartner named machine-to-machine communications one of the top 10 mobile technologies to watch in 2010. And almost exactly one year ago, Qualcomm and Verizon created a joint-venture company specifically to support machine-to-machine wireless services.
"This has the potential to go way beyond industrial applications to encompass [such applications as] increasingly sophisticated smart grids, networked security cameras and sensors, connected home appliances and HVAC equipment, and ITS infrastructure for traffic and parking management," Weightman said in a statement.
- Robot to Uncover Pyramid of Giza's Secrets
- Scientists are preparing to send a robot into a mysterious shaft in the Great Pyramid of Giza. The robot could uncover artifacts, mummies and clues to the pyramid's baffling construction.
Built over 4,500 years ago, the Great Pyramid of Giza is the oldest wonder of the ancient world. Constructed under the reign of the Pharaoh Khufu, the pyramid remains one of the most mysterious and celebrated structures in history. One debated aspect of the pyramid is its construction. Some scholars say that slaves were used, while others argue for paid workers; some say that the huge stones were carried from quarries, while others insist that ramps were used. Some people even theorize that extraterrestrials helped build the pyramid.
The Great Pyramid of Giza was the tallest manmade structure for over 3,000 years (source: Nina Aldin Thune).
In 1872, the Great Pyramid of Giza became even more mysterious when Waynman Dixon, a British engineer, discovered two shafts in one of the pyramid's sections, called the Queen's Chamber. While similar passageways had been discovered in the King's Chamber, the exits of those shafts had been found on the exterior of the pyramid, and scholars assumed they were egresses for the soul. The Queen's Chamber shafts, on the other hand, had no apparent exits. Their purpose and contents were unclear.
Over a century later, in 1992, scientists set out to explore the shafts using a small robot. The mystery deepened when the robot encountered a limestone door with two copper handles. No doors had been found in the King's Chamber shafts, so the function of this door was unknown to scholars. It was not until 2002 that another robot was sent into the shaft to drill through this first door. The robot's camera revealed a remarkable discovery: another door. Unlike the first door, the second door has many cracks and, according to Zahi Hawass, a well-known Egyptologist, it seems to be screening or covering something. In a blog post, Hawass proclaimed, "The mystery of the doors is one of the most exciting puzzles in Egyptology today."
That puzzle may soon be solved. Sponsored by Leeds University, a team of researchers, engineers and archaeologists has created a high-tech robot to explore the shafts in the Queen's Chamber. The robot has been nicknamed Djedi, after the magician who helped Pharaoh Khufu create the pyramid's plans. Djedi is equipped with the tools to explore the shafts while causing minimal damage.
The Djedi team discusses their findings with Egyptologist Zahi Hawass (source: Meghan Strong).
The robot has a small coring drill that will be able to bore a small hole into the second door. Djedi is outfitted with two means of exploring beyond this hole. First, it has an endoscope-like "snake camera" that is capable of fitting through small spaces and seeing around corners. Djedi also has a miniature robot, called a "beetle," that can navigate holes as small as 20 mm to further explore confined spaces. Djedi has several observational tools, including a tiny ultrasound device to determine the thickness and condition of stone, a precision compass, and an inclinometer to record the orientation of the shafts.
The Djedi robot has been specially designed to navigate the narrow passages of the shaft and drill through the mysterious limestone door (source: Sandro Vannini).
What could be hiding behind the door? While artifacts of Egyptian life and religion are likely, Hawass has proposed a more sensational find. "Could it be possible," he writes, "that these doors are evidence that Khufu's burial chamber might still be hidden somewhere inside his pyramid?" Since Khufu's mummy has never been found, its discovery could be one of the most thrilling archaeological finds in modern history.
August 19, 2010
THE ONLY SAFE WAY TO BOAST IS BY CONSTRUCTIVE ACTIONS.
It has been said that it’s not boasting if you can really do it. This may be true, but a far more persuasive argument is made when you do it first and talk about it later. Besides, good things that are said about you always carry more weight when they are said by someone other than yourself. When you find yourself tempted to wax eloquent about your achievements, force yourself to pause for a moment, take a deep breath, and ask someone else about their achievements.
This positive message is brought to you by the Napoleon Hill Foundation. Visit us at http://www.naphill.org. We encourage you to forward this to friends and family. They can sign up for this free service at our web site.
Electronic device helps Roth achieve another milestone: he goes bowling (08/19/10)
2009-10 PBA Spare Shots #21
Since Roth, 59, suffered a massive stroke in late May 2009 that left the left side of his body paralyzed, he has refused to give up the fight to regain his life. His first public appearance following the stroke, at the GEICO Mark Roth Plastic Ball Championship in late March in West Babylon, N.Y., motivated him to continue his rehabilitation.
He followed that appearance with a trip to Columbus, Ohio, in April where he spent a week with his former PBA Senior Tour competitors.
Last week, with the assistance of a recently developed device called a "WalkAide," which provides electrical stimulation to counteract a common stroke-related condition called "drop foot," the 34-time PBA Tour champion made another milestone leap forward.
With the WalkAide, he was able to lift his left foot almost normally and walk "without tripping over his toes," Denise Roth said. And with the ability to stand and walk on his own, Roth decided to test the device on a bowling lane in Fulton, N.Y., where he quickly worked his way up from a 6-pound ball to a 12-pounder, Denise said.
"Mark had use of the device for a seven-day trial and it was amazing," Denise said. "He could walk faster and farther than any time since his stroke. He actually bowled with confidence. He was getting around 100 percent better, which helped him get some badly-needed exercise.
"He had to turn the WalkAide back in after the trial period, so now we have to wait to see what the insurance company says (about getting it back)," she added. "It's a wonderful device. It actually took some of his hip pain away, too."
Monday, August 16, 2010
NetSuite, Zoho Joint Venture Develops Online Autism Tracker
By Brian T. Horowitz
Autism experiences led the wives of the NetSuite and Zoho CEOs to create MedicalMine and its ChARM cloud application, which helps parents manage the condition. A version for physicians is also in the works.
While the joint venture between the two companies is bringing them into the health care IT field, it’s also a personal situation for the leaders of MedicalMine, which is based in Palo Alto, Calif.
Autism affects the son of MedicalMine CEO Pramila Srinivasan and her husband, Zoho CEO Sridhar Vembu. In addition, the daughter of Elizabeth Horn, MedicalMine's executive vice president of marketing and business development, and her husband, NetSuite CEO Zach Nelson, has also been diagnosed with the condition.
The executives came together to create ChARM (Children's Autism Recovery Map), a Web application to track the care of children suffering from autism and other chronic diseases.
"We wanted to help make parents aware of ChARM and the help it can provide to caregivers of children with autism," NetSuite's Nelson said in a statement.
"The goal is to understand the real problems faced by parents and physicians in everyday life," Srinivasan told eWEEK. By creating ChARM, MedicalMine aims to document the daily challenges of autism patients all in one place, Srinivasan said.
ChARM runs on Zoho's SaaS (software as a service) application platform. The software will also use some of the same services as Zoho products, such as Web e-mail and messaging. NetSuite, meanwhile, provides marketing support for MedicalMine.
In October 2010, MedicalMine will release an electronic medical record, or EMR, version called ChARM Physician, which will allow doctors to input their medical records into the system and share them with patients. Physicians will also be able to store a child's handwriting as well as photographs of rashes and bruises. In addition, they can upload video of seizures and log the occurrences. The product will also support third-party prescription handling.
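The record-keeping the article describes can be pictured as a simple encounter log with typed attachments. The sketch below is purely illustrative, assuming a minimal schema of my own invention; the class and field names are not MedicalMine's actual data model.

```python
# Hypothetical sketch of the kind of encounter record an EMR like ChARM
# Physician might store: free-text notes plus typed attachments such as
# handwriting samples, rash photos or seizure videos. All names here are
# illustrative assumptions, not MedicalMine's real schema or API.
from dataclasses import dataclass, field
from datetime import datetime


@dataclass
class Attachment:
    kind: str        # e.g. "handwriting_sample", "rash_photo", "seizure_video"
    filename: str


@dataclass
class EncounterRecord:
    patient_id: str
    recorded_at: datetime
    notes: str
    attachments: list = field(default_factory=list)

    def add_attachment(self, kind: str, filename: str) -> None:
        """Log one piece of supporting media for this encounter."""
        self.attachments.append(Attachment(kind, filename))


# A physician logs one visit and attaches a photo of a rash.
record = EncounterRecord("patient-001", datetime(2010, 10, 1), "Routine follow-up")
record.add_attachment("rash_photo", "rash_2010-10-01.jpg")
print(len(record.attachments))  # 1
```

The point of a structure like this is that each observation is timestamped and tied to a patient, which is what later makes the aggregate data useful to the researchers described below.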
For 2011, MedicalMine plans a third product called ChARM Research. In advance of the release, doctors are defining what types of data mining they're interested in, Srinivasan said. Medical researchers will be able to understand the workflow of a doctor's office, including the physical exam, details of the encounter and objective evaluation, she said.
"Researchers will have the ability to connect with the patients or their caregivers and get a comprehensive picture of what the child is going through on a daily basis," said Srinivasan. "We have all this information coming through the ChARMTracker portal, where the children are eating, where they're going to school. All of this will give the full picture to the researcher and clinicians."
Although ChARM began as a way to monitor autism, it's evolved into a way to track other types of conditions as well, such as diabetes and chronic fatigue.
"The goal is to capture more than just autism," she said. The product will also meet the needs of radiologists, immunologists and endocrinologists. "It will probably lose its origin a little bit," said Srinivasan.
"Technology is a crucial piece to help us solve the puzzle of autism," Vembu wrote in a statement. "Without a systematic collection of data about these children, from genomics to nutrition to environmental triggers, we will not find the answers to the unanswered questions about causation and prevention."
ChARMTracker was introduced at the Autism One Conference in Chicago on May 24, 2009.
"Research for autism is going to be explosive in magnitude in the future," Srinivasan said. "It's good for high tech to be deployed in this process to accelerate the growth of research and make it possible for children to have a good standard of care and treatment outcomes."