Tuesday, September 30, 2008

Future of BDPA


Seems like every time you turn on the news, everything is in turmoil: the economy, Wall Street, the mortgage collapse, more job layoffs, and a general doom and gloom everywhere. In Michigan, this is nothing new, since we have been in a recession for many months; now the rest of America is feeling the pain. With all this said, BDPA Detroit has not been immune to the effects of the world. We are at our lowest point of participation in the 8 years I have been involved with the organization.
I, like many IT workers, was hit by the layoff bug and was out of work for over 14 months, and during that time period I had major heart surgery to boot. I too had to shift my priorities, but I kept a level of involvement in BDPA. Oddly, the position I currently hold was made possible in part through networking with BDPA members in Detroit.
So BDPA does work, if you work the system. BDPA cannot guarantee you a job when you are laid off, but it does improve the odds of finding employment. For BDPA to work, though, members must get involved and create a true network. We need to realize that the definition of a network is a system for sharing resources. BDPA's resources are the members, who share knowledge, time, and effort to improve BDPA and the community.
I bring up this topic because the last TAC seminar, on September 25, had only 4 participants, 3 BDPA members and 1 from NSBE, on a topic that is in demand in the IT field: virtualization. BDPA is supposed to be an organization of "Thought Leaders" and a force for change in the IT community. But what does it take to draw people out? With this being possibly the worst economic point in American history, you would think BDPA would be at its height of membership, with professionals getting together and helping each other and the community, but the opposite is true. I hope this is not a "Black people" thang of not trusting each other in times of need because we are afraid of being used, abused, and taken advantage of. It is time to break that 20th century mindset and move into the 21st century.
We have all the resources necessary to make a difference in the lives of our families and possibly the world. Just a quick scan of the internet shows how other groups are making a difference, but not BDPA. Here are some solutions to this problem:
  1. Make a concerted effort to exploit technology to improve communication and promotion of BDPA events. (We are planning a Windows Live Meeting session on using LinkedIn and other social networks.)
  2. Encourage members to experiment with the Web 2.0 technologies out there. How many sites like Twitter, Facebook, LinkedIn, and Pownce have you played with on the web? Share your experiences with other BDPA members and rate their usability.
  3. Encourage members to produce content on the web. I have noticed that the majority of content on the web was created by young white males. We need to put our own material on the web through blogs, videos and audio podcasts. We need to fill the Google search engine with a variety of content produced by us, for all to learn from and enjoy.
  4. Clarify the 21st century mission of BDPA. I still do not know BDPA's elevator pitch for recruiting members. We need to become a relevant force in the IT community and our local community. I would like to see BDPA reach a point where companies beg to donate money to our cause.
  5. Finally, have some fun. Let the true geek, or whatever, come out. We are professionals, but if there is no passion, why do it?
Time for us to grow or die. It is up to you which direction BDPA takes. I hope the majority chooses growth and CHANGE to make our little time in this universe something meaningful and lasting.

Cliff Samuels Jr
BDPA TAC Director

Sunday, September 28, 2008

Blog Action Day next month


Blog Action Day 2008 Poverty from Blog Action Day on Vimeo.

Thursday, September 25, 2008

Digital piano adds Linux

From LinuxDevices.com

Yamaha has added embedded Linux to the electronic "player piano" add-on it offers with some models. With version 3.0 of the Mark IV firmware, MontaVista Linux controls the 333MHz AMD Geode-based piano, enabling new interface possibilities, acoustic recording, and interaction with Web-based services.

The Disklavier is an add-on available for a variety of Yamaha piano models, including uprights and grands. A small controller box mounted underneath the keyboard runs a proprietary RTOS (real-time operating system) on a custom LSI (large-scale integration) processor. The RTOS gathers data from sensors claimed to be able to "continuously trace the hammer position from the time a key is pressed until it's released." By recording hammer and damper positions, the device is able to capture a live musical performance in a special musical data language not altogether unlike the perforated rolls that once powered player pianos, except with obviously much greater dynamic resolution.
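
To make the idea concrete, a captured performance can be pictured as a time-ordered list of note events: which key, when, how hard, and for how long. The sketch below is purely illustrative; the field names and value ranges are assumptions for this post, not Yamaha's actual data format.

```python
from dataclasses import dataclass

@dataclass
class NoteEvent:
    """One captured key strike; fields and ranges are illustrative only."""
    time_ms: int          # milliseconds from the start of the performance
    key: int              # piano key number, 1-88
    hammer_velocity: int  # sensed hammer speed, here mapped to 0-1023
    duration_ms: int      # time until the damper falls back on the string

def encode(events):
    """Serialize a performance as compact comma-separated lines, oldest first."""
    return "\n".join(
        f"{e.time_ms},{e.key},{e.hammer_velocity},{e.duration_ms}"
        for e in sorted(events, key=lambda e: e.time_ms)
    )

performance = [
    NoteEvent(500, 44, 580, 430),
    NoteEvent(0, 40, 612, 450),
]
print(encode(performance))  # events come out in time order
```

A real capture format would also record pedal and damper positions at far finer resolution, but the shape of the data is the same idea as a player-piano roll.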

Besides sensing hammer and damper positions, the RTOS is also able to control them, and thus can reproduce performances from the data files it captures. The system has an optical drive, and users can purchase classic performances by the greats, as well as recording and playing back CDs of their own performances.

With the Mark IV Disklavier, released in 2004, Yamaha added a second computer running the open source Linux operating system. The Linux system enables the addition of web-based and client/server remote control interfaces, freeing users from having to walk over to the piano to manually load and start playback from optical disks. It also adds acoustic recording, editing, and playback features, as well as integration with several online music service offerings.

The Mark IV Disklavier's Linux system is based on an AMD Geode clocked at 333MHz. It has 250MB of RAM, and an 80GB hard drive. I/O includes Ethernet, WiFi, dual USB ports, serial ports, audio I/O, video in, and stereo speakers. There's also a PCI expansion interface.


Disklavier Mark IV (mid-range DC3M4 model)

Thanks to the Linux system, Disklavier Mark IV users can interact with the system using three interfaces:
  • Disklavier Media Center (DMC) -- includes touchpanel display positioned to the upper left of the keyboard, serial-connected function buttons and LEDs, and a USB-linked CD/floppy drive

  • Pocket Remote Controller (PRC) -- WiFi-connected handheld controller based on a Sharp Zaurus running Linux and a Java UI, with 320 x 240 display and built-in QWERTY keyboard

  • Tablet Remote Controller (TRC) -- WiFi-connected tablet based on a Hitachi Tablet PC, running Linux OS and offering Flash playback
Higher level network and sequencer functions of the Disklavier are now controlled by a customized version of MontaVista Linux 3.1, using a Linux 2.4.20 kernel, said Yamaha's Taro Kawabata, Software Group, Piano Division, in an email interview. Additional open source software components include Samba, as well as a PostgreSQL database engine, he added.

"This was the first time we introduced Linux to our products," said Yamaha's Kawabata, adding, "We plan to use Linux in future systems."

Linux software adds recording and Internet radio downloads

The latest version 3.0 release of the Disklavier Mark IV firmware, available as an online update, adds several interesting alternative music recording and playback capabilities. With the new firmware, users can capture multichannel audio recordings and add effects such as room, stage, hall, and reverb. Song lyrics can be displayed on a connected TV monitor or on the TRC tablet controller. Users can transfer recordings in MP3 format to networked PCs or Macs, or burn an album directly onto a CD.

Other new features include:
  • Download MIDI files via WiFi and play them back on the piano's stereo speakers
  • Download MP3 music from a Yamaha site
  • Connect to Yamaha's DisklavierRadio Internet radio service
  • Playback of DisklavierRadio, and "immediate" purchase and download of live content

Audiophiles may care to know that the Disklavier's piano voice and performance tone are controlled by Yamaha's new Advanced Wave Memory 2 (AWM2) technology, which is said to incorporate its Articulation Element Modeling (AEM) tone generator. AWM2 offers 64-note AWM2 digital stereo sampling (90MB wave memory, 16 bit linear), as well as 6-note AEM polyphony, says Yamaha. The AWM2 technology is also said to provide for 32-note ensemble tone, which offers 16 ensemble parts, XG and GM voice modules modes, 676 "normal" voices, and 21 drum voice kits.

Stated Bill Brandom, Yamaha's Disklavier marketing manager, "The instrument is effectively blurring the line between home theater and live entertainment."

Availability

The version 3.0 upgrade is available now and comes standard with all new Mark IV Disklaviers sold, says Yamaha. Current Disklavier Mark IV owners can upgrade in exchange for filling out a survey, or purchase it for $90.

Saturday, September 20, 2008

ConnecTech Meeting Tackles Disruptive Technologies

ConnecTech Meeting Tackles Disruptive Technologies
by Matt Roush

The disruptive technologies of tomorrow will change the way we live and work every bit as much as the Internet has over the past 15 years.

Thursday night, nearly 200 people packed the lobby of the Fox Theatre in Detroit to hear from Gartner analyst Jackie Fenn and other local and national tech figures talk cloud computing, security, three-dimensional printing and more. The event was sponsored by ConnecTech, Automation Alley's organization for technology professionals.

Fenn kicked off the event with a 50-minute presentation on several disruptive technologies.

First, cloud computing: Fenn said it's an evolution of a trend that has been ongoing for many years, the desire not to own IT assets but to get IT services as needed from an outside source.

Also disruptive: social platforms and virtual worlds. Fenn said virtual worlds like Second Life and Club Penguin and social networking sites like MySpace and Facebook represent "the most successful collaboration tools we've ever seen," because they've replaced e-mail among the young. She said they'd also influence future user interfaces for other applications. There are also other advances in the user interface for the first time in years, chiefly touch-screen tabletop computing.

She also said that other industries may become disintermediated from consumers, the way Craigslist has hurt local classified advertising. One possibility -- peer-to-peer lending sites, which may do the same to banks.

Fenn said we're headed for a "Real World Web" with the ability to get the information a user wants wherever the user needs it, which depends on devices that know identification, location, their owners, history, safety and environment.

There's also a trend toward "augmented reality," information overlaid on actual pictures -- for example, a cell phone that overlays directions or other information on a picture you're taking with the phone in real time. "You could point it at a movie theater and the phone would tell you what movies are playing there," Fenn said.

Users of all these technologies will also leave more and more data trails that could be used to track them, Fenn said.

Fenn also mentioned three-dimensional printing -- devices that build three-dimensional objects one thin layer at a time using powder and glue shot out by the same technology that powers a two-dimensional printer. It's been used in industrial prototyping for a decade, but is now coming down in price -- from $200,000 to $10,000. Eventually, Fenn said, "we'll all have these things in our basements" and our kids' kids will be trading three-dimensional representations of their online avatars.
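
The layer-at-a-time process Fenn describes can be sketched in a few lines: slice a solid into horizontal layers and deposit each one in turn. This is only a toy illustration; the `inside` function and the build-volume size are made-up stand-ins for real printer geometry.

```python
def slice_layers(inside, size, layer_height=1):
    """Slice a solid into horizontal layers, the way a powder-and-glue
    printer deposits them. 'inside(x, y, z)' is any function that says
    whether a grid point belongs to the solid."""
    layers = []
    for z in range(0, size, layer_height):
        layers.append([
            (x, y)
            for x in range(size)
            for y in range(size)
            if inside(x, y, z)
        ])
    return layers

def sphere(x, y, z, center=4.5, radius=4):
    """A small sphere centered in a 10-unit build volume."""
    return (x - center) ** 2 + (y - center) ** 2 + (z - center) ** 2 <= radius ** 2

layers = slice_layers(sphere, 10)
# The widest layers fall near the middle of the sphere, as you'd expect.
print([len(layer) for layer in layers])
```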

The devices also have lots of implications for retailing -- why go to a store when you can build that replacement part yourself from CAD-CAM data?

Finally, Fenn mentioned human augmentation -- technologies that don't just bring an injured person back to normal human function, but which, like those artificial legs that look like leaf springs, put function over form and improve a person beyond normal human function.

Fenn wrapped up by discussing how to pick new technologies, backing a process in her new book, "Mastering The Hype Cycle," called STREET -- for scope, track, rank, evaluate, evangelize and transfer.

The meeting also featured three other brief presentations. Uma Subramanian of IBM talked about Big Blue's view of cloud computing, saying that it consists of elastic scaling, rapid provisioning, advanced virtualization and elastic pricing -- and requires capacity, security and proper licensing.

Eric Eder, president of Royal Oak-based Intelligent Connections, outlined a couple of interesting concepts, including "tightly coupled processes," a series of events that cannot be uncoupled once they begin, and which may lead to serious adverse events if the couplings are not known, and "semantic data," data that gets transferred to various locations to create cloud applications. As for security -- his firm's key business -- he said it's a matter of careful content management, ID management and 24-7 monitoring.

Finally, Bob Moesta of Troy-based Edutronix Inc. showed off his company's 3-D printers.

Wednesday, September 17, 2008

Orion Firm Makes Media Software Free To Students

Orion Firm Makes Media Software Free To Students

Orion Township-based Scate Technologies Inc. said Tuesday that its Scate Ignite 4 Standard Edition software, normally priced at $199, will be provided free of charge to every student in the state of Michigan.

Scate Ignite makes it easy for learners of all levels to "stitch" media content, such as screenshots, movies, PowerPoint slides, images, text and audio, together into a seamless presentation.

Ignite also publishes media creations in multiple formats that can be shared with others on a Web site, in an e-mail, on a learning management system, on a CD-ROM or in print.

Content created with Ignite can also be uploaded to free media sharing services and social networking sites.

The “Ignite 4 MI Students” initiative is aimed at supporting the Michigan Department of Education Technology Standards for Students (METS-S) where students can conduct Web-based research to write a short essay and easily create a multimedia presentation from images, PowerPoint, audio, video and more.

“I found Ignite 4 to be an easy to learn, intuitive application that not only meets, but often exceeds the capabilities of similar, more expensive applications," said Ron Faulds of the office of educational technology and data coordination at the Michigan Department of Education. "Michigan students now have an opportunity to add this incredibly powerful software to their ‘technology toolbox’ at no cost."

Said Scate president and CEO Stephen Sadler: “We want to do everything we can to provide our students with easy to use software tools so they can share and showcase their talents, skills, and knowledge to a global market. It's time we all do our part to stimulate the Michigan economy, and the best place to start is by helping our children to secure their future."

School technology directors seeking more information on how to register their school, college or university should contact David Rafferty at (248) 371-0315, ext. 121 or e-mail sales@scateignite.com.

Untangle turns ordinary PC into security box

BDPA TAC currently uses the Linux version of Untangle in a church computer lab we set up, to filter web traffic. We will also look into incorporating this new version in some future project.

Untangle turns ordinary PC into security box

By John E. Dunn

Open source vendor Untangle has come up with a new version of its gateway that can turn any Windows XP PC into a fully-featured security appliance.

Basically a scaled down version of the company's Linux-based Untangle server, the new 'Re-Router' system puts itself between the network's main router and other PCs, filtering all traffic for a range of security threats.

While all routing functions - including fixed IP addresses and NAT (Network Address Translation) - remain the domain of the main gateway, the Re-Router takes over a range of security functions such as content filtering, anti-virus and anti-spam. If the Re-Router goes down, traffic reverts to running through the original gateway without user intervention until the Untangle box reappears.

The anti-virus uses Kaspersky and the open source ClamAV, with the other security features - intrusion prevention, web filtering, firewalling and VPN - being carried out using open source-based apps tweaked by Untangle. The anti-spam element is a mixture of Bayesian filtering, real-time block lists, and the open-source Razor.
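
As a rough illustration of the Bayesian piece of that anti-spam mix, here is a minimal word-frequency filter: train on labeled messages, then score new ones by combining per-word likelihoods. This is a sketch of the general technique only, not Untangle's actual implementation; real filters add smarter tokenization, tuning, and the block lists mentioned above.

```python
import math
from collections import Counter

class BayesFilter:
    """Minimal word-frequency Bayesian spam filter."""

    def __init__(self):
        self.spam, self.ham = Counter(), Counter()

    def train(self, text, is_spam):
        # Count word occurrences separately for spam and ham.
        (self.spam if is_spam else self.ham).update(text.lower().split())

    def spam_probability(self, text):
        log_odds = 0.0
        for word in set(text.lower().split()):
            # Laplace smoothing so unseen words don't zero out the score.
            p_spam = (self.spam[word] + 1) / (sum(self.spam.values()) + 2)
            p_ham = (self.ham[word] + 1) / (sum(self.ham.values()) + 2)
            log_odds += math.log(p_spam / p_ham)
        return 1 / (1 + math.exp(-log_odds))  # logistic: log-odds -> probability

f = BayesFilter()
f.train("cheap pills buy now", is_spam=True)
f.train("meeting agenda for thursday", is_spam=False)
print(f.spam_probability("buy cheap pills"))   # well above 0.5
print(f.spam_probability("thursday meeting"))  # well below 0.5
```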

Untangle even claims that the PC used to run all this can continue to be used as normal.

As might be guessed from the concept, the free-to-use software is only suitable for use on small networks, up to around 25 users in size, which would otherwise have to employ several appliances in addition to the main router in order to get the same level of security. The company sees the product as appealing to small companies unable to afford hardware investment and possibly lacking the in-house expertise to configure and manage them.

"For small businesses running Windows, our new Re-Router technology offers a free and totally painless way to leverage all the best networking apps," said Untangle's CTO, Dirk Morris. "There isn't any network reconfiguration or re-cabling, just download the software and you're off and running."

As has become an Untangle speciality, the product is free to use, with the company making money from a professional package of services that covers live support, Active Directory support, more policy management, and configuration backup. It's not clear whether a 25-seat network would need much of this extra handholding at any price, but some networks on the threshold between the Re-Router and the higher-end Untangle Gateway might welcome shopping for upgrades.

The Re-Router can be downloaded from Untangle's website free of charge.

Untangle, formerly called Metavize, attracted headlines earlier this year after lining up a 'Fight Club' of porn filtering products at the RSA Show, which set out to demonstrate the ease with which most could be bypassed. The controversial test was eventually won by Fortinet.

Monday, September 15, 2008

TechTown News: New computer lab at TechOne site and discount tech training

$75K presidential gift to fund computer training lab

A $75,000 grant from the Wayne State University Office of the President will allow TechTown to create a computer lab in TechOne, its business incubator facility. Former President Irvin D. Reid made the gift in August before leaving office after 10 years.

Discount training helps displaced workers

Pixel It Productions

Lori Autrey, executive director of Pixel It Productions, has news for her fellow Detroiters: Michigan has 84,000 job vacancies that are going unfilled because people lack the necessary skills. In response, the computer software application training company is offering “Pink Slip Productions” -- half-price classes for anyone who has been laid off in the past year and signs up at a special event on Friday, September 19. Classes that normally cost $199 will be offered for just $99 for those who enroll in person that day and show proof that they were laid off in the past year.

Hackers deface Large Hadron Collider Web site

Why deface scientific progress? Humans are more likely to destroy the planet with war than with this scientific experiment. And if we did create a Black Hole, time would come to a complete stop on the planet, and for us the incident could take centuries to complete the destruction of Earth. :-) Maybe we could then travel to the other side of the universe using our newly created Black Hole.


Hackers deface Large Hadron Collider Web site
By Ellen Messmer

Hackers have broken into the network of the Swiss particle-physics laboratory operating the Large Hadron Collider experiment, which has just begun smashing atoms in the hope of finding the theorized Higgs particle, thought to give elementary particles their mass.

The hackers, calling themselves the “Greek Security Team,” defaced the CERN Web site with comments made in Greek, according to the Telegraph of London, which reported the incident today. CERN has now taken down this public-facing Web site, and according to the Telegraph, CERN spokesman James Gillies said, “There seems to be no harm done.”

It’s not believed that the Greek Security Team hackers -- thought to be targeting the Compact Muon Solenoid Experiment (CMS) that is using the Large Hadron Collider to make new discoveries about particle physics -- were able to penetrate further into the CERN network of control systems for the giant collider.

The collider is undertaking a ground-breaking experiment to find key particles of matter that scientists hope will help explain the evolution of the universe. The Telegraph quotes CERN spokesman Gillies as acknowledging the hackers made the point that CMS is hackable.

The Greek Security Team is not a generally known hacker group, according to Graham Cluley, senior technology consultant at Sophos who is in the U.K. and following developments regarding the security breach at the CERN facility.

The message the hackers left on the defaced Web page in Greek hasn’t been translated yet, but Cluley said his initial impression is that this hacker activity against the famous particle-physics experiment was done mainly to gain publicity for the hackers, not to disrupt the scientists’ work.

“I’d be surprised if they’re trying to disrupt this,” said Cluley. “Of course, if they’re Greeks, we hope they’re not planting Trojans, which Greeks have been known to do historically.”

The Telegraph reports the scientists working with the Large Hadron Collider have received threats via e-mail and phone because some in the public are worried about speculation that the facility could trigger a black hole and swallow up the earth or otherwise cause calamities.

Greed is bad. All the financial collapses of the last few months were caused by greed. I bet you there will be no jail time for any of the "good old boys," just a slap on the wrist from the Bush Administration. It is time to stop this madness!! The mortgage collapse, bank failures, and now the financial giants falling by the wayside, but no one held accountable. Senator McCain has the nerve to blame the people for the state of the economy! At this rate, there may be no economy for the next President to fix. Are Americans so scared to elect a Black man that they are willing to destroy everything?
Is the economic collapse just bad timing, or a planned event to prevent history from being made?

Lehman Brothers Files for Chapter 11 Bankruptcy



Lehman Brothers filed for bankruptcy protection, but the Chapter 11 filing does not include its broker-dealer operations and other units, including Neuberger Berman.

Lehman is looking at selling its broker-dealer operations, and is still in advanced discussions with a number of potential buyers of its investment management division.

Bankruptcy represents the end of a 158-year old company that survived world wars and the collapse of Long-Term Capital Management but could not survive the global credit crunch.

Investors in recent weeks had grown increasingly jittery about Lehman's $46 billion of mortgages and asset-backed securities, as well as its credit rating and its ability to raise capital.

Officials at the Federal Reserve and U.S. Treasury are taking steps to mitigate risk to the system and assure the orderly functioning of the U.S. markets when they open Monday. And the Federal Reserve has also agreed to accept lower-quality assets in return for loans from the government.

Banks Set up $70 Billion Borrowing Facility

Additionally, ten Wall Street banks have agreed to set up a collateralized borrowing facility, committing to fund it with $7 billion each.


The banks are Bank of America, Barclays, Citibank, Credit Suisse, Deutsche Bank, Goldman Sachs, JP Morgan, Merrill Lynch, Morgan Stanley, and UBS. These banks have said they are committed to fund $7 billion each for a $70 billion collateralized borrowing facility.

The banks add that they are working together to assist in maximizing market liquidity through ongoing trading relationships, dealer credit terms and capital committed to markets. This will also facilitate the orderly resolution of OTC derivatives exposures between Lehman and its counterparties.

All ten banks say they intend to use the Federal Reserve's expanded primary dealer credit facility this week. The banks say their actions reflect the "extraordinary market environment."

Wall Street Prepares for Grim Monday

Wall Street traders headed back to their offices Sunday afternoon to prepare for the market impact of a pre-packaged bankruptcy and the unwinding of Lehman's balance sheet of approximately $700 billion.

One Wall Street trader involved in the discussions with officials from the Federal Reserve said every firm had determined their exposure to Lehman by Sunday morning, and were preparing for some Federal assistance in unwinding the trades.

But officials from the Fed said they won't be involved in any such unwind — they told the Wall Street firms to work among themselves to determine how best to settle trades with Lehman. (For more discussion of the Lehman situation, see video).

Bank of America sent a note to derivatives traders Sunday saying "Banks, brokers started netting Lehman trades from 2 p.m. today … trades netted are contingent on Lehman bankruptcy by midnight." The note continued "If no Lehman bankruptcy, netting of trades to be cancelled," meaning Bank of America's assumption of Lehman’s side of trades would end.
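
The "netting" in that note is the standard step of collapsing many offsetting trades into a single net position per counterparty and instrument, shrinking the book that has to be settled. A toy version, using an invented trade schema (the real books are vastly richer):

```python
from collections import defaultdict

def net_trades(trades):
    """Collapse offsetting trades into one net position per
    (counterparty, instrument) pair. Each trade is an invented toy tuple:
    (counterparty, instrument, signed_notional)."""
    net = defaultdict(float)
    for counterparty, instrument, notional in trades:
        net[(counterparty, instrument)] += notional
    # Positions that cancel exactly disappear from the book entirely.
    return {pair: amount for pair, amount in net.items() if amount != 0}

book = [
    ("LEH", "IRS-10Y", 50_000_000),
    ("LEH", "IRS-10Y", -50_000_000),  # offsetting trade nets to zero
    ("LEH", "CDS-AUTO", 10_000_000),
]
print(net_trades(book))  # only the un-offset position remains
```

The smaller the netted book, the less counterparties stand to lose if the other side files for bankruptcy, which is exactly the pressure-reduction described in the quote below.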

"It’s a way of lessening the pressure before Wall Street opens up tomorrow. The more they can reduce the total brokerage book for Lehman, the less heart-ache there will be for counterparties if Lehman files," said Carlos Mendez, senior managing director of ICP Capital in New York.

The International Swaps and Derivatives Association called a special session from 2 p.m. to 4 p.m. but traders said that was purely symbolic. They intended to trade through the night.

The cost of insuring the bonds of investment banks blew out in trading on Sunday.

Barclays Pulls Out

Earlier in the day, the United Kingdom's Barclays Bank pulled out of talks to buy Lehman. Barclays, which was considered the lead candidate to buy Lehman, reportedly was unable to agree on credit guarantees to shield it from potential losses.

The Fall of Lehman Brothers




Top Wall Street executives arrived Sunday morning for another round of talks to resolve the Lehman crisis, and sources said the group continued to work on how to handle the possibility of a deal not getting done before Monday.

By mid-morning, Federal Reserve Chairman Ben Bernanke was said to have been involved in several conversations by phone from Washington with officials meeting at the New York Federal Reserve. In addition, Bernanke was said to have made several calls already to foreign central bankers who are monitoring the proceedings carefully.

New York Federal Reserve President Tim Geithner and Treasury Secretary Hank Paulson were already at the New York Fed by the time executives from top Wall Street firms began to arrive.

Work went on through the night on a deal drafted Saturday to have a consortium of banks backstop Lehman's bad assets and sell off the rest of the bank to Bank of America and Barclays. But sources said key parts of the deal remained controversial Sunday morning. As reported, the banks backstopping the bad loans were said to be balking at the amount of capital required of the banks and the sense that they were supporting a good deal for Barclays and Bank of America.

The larger group has been broken up into several working groups to devise responses to different possible outcomes, among them how markets can prepare for the possibility that Lehman might not find a buyer before Monday.

Friday, September 12, 2008

Video explanation of Web 2.0

Sub-Saharan universities train in Web 2.0 tools

By Edris Kisambira

Representatives from 12 universities in eight sub-Saharan countries are meeting in Uganda to train in the use of Web 2.0 tools and open-source software.

The training in Web 2.0 tools is aimed at enabling the 35 lecturers and professors to carry out collaborative research in the field of agriculture, while the training in open-source software seeks to popularize the medium among higher institutions of learning.

"We are here to learn about Web 2.0 tools to enable us to do collaborative research projects, to teach using these tools and to also do a number of other collaborative projects without having to meet physically," Nodumo Dhlamini, ICT director at Zimbabwe's Africa University, said in an interview. "At the end of the day, we want to impact on the rural farmer, to make a difference and make ourselves relevant, as well."

Those involved in the training hail from universities in Ethiopia, Kenya, Malawi, Mozambique, Tanzania, Uganda, Zambia and Zimbabwe.

The tools that members are familiarizing themselves with include wikis, blogs, Skype and social-networking sites like Facebook.com.

"Right now, we have set up a wiki so that when we go back to our respective countries, any collaborative activity will be posted there so that we can share in what everybody is doing," Dhlamini said.

"We are creating more of an educational network to tap into resources," she added, "and this will also open us up to the rest of the world so that people out there know what we are doing."

The training in open-source software is designed to create awareness among educational institutions of a cheaper and more secure alternative to the pirated software that many universities have installed.

Dhlamini explained that startup costs for proprietary software are high for large universities, yet the schools can end up with fake software, which is susceptible to virus attacks.

"It is not easy to migrate institutions of higher learning at once, but we are telling the members who are here to adopt it at individual levels as a starting point so that they can then sell it on a bigger scale to the university after realizing its benefits," she said.

Dhlamini said that Africa has been left behind in the adoption of open-source software due to the digital divide. Nevertheless, she said, the high cost of software makes it necessary for African universities to take advantage of the opportunity.

"Free open-source software can help us tailor-make software that will address our needs," Dhlamini said.

Also as a part of the weeklong training program, participants have been asked to initiate digital content projects to fulfill Africa's educational needs, as most online content is developed elsewhere.

"We are set to acquire optic fiber, which will guarantee broadband speeds, but what will happen once fiber is in place?" wondered Nicholas Kimolo of Kenya's Floss4Edu. "What will run on that fiber? Will we continue running content from outside Africa?"

He said that Africa runs the risk of losing local knowledge and culture if the issue of local digital content is not addressed.

The world is becoming a knowledge society, Kimolo said, and Africa must start participating in the knowledge economy.

Wednesday, September 10, 2008

Universe still here after CERN collider was fired up today??

Hopefully this is the same universe that was here before the collider came online. If not, please note any differences and post them to the internet. :-))

Tuesday, September 09, 2008

The "real" Star Trek PADD

An e-book reader as thin as a magazine. Black and white today; hopefully color in a few months.

Monday, September 08, 2008

Friday, September 05, 2008

Citibank Had A Program To Take Money From Customers?

From Techdirt.com. This is an outrage. And people talk about the Mayor of Detroit being crooked!! I guess it is OK for the "Good old Boys" to take money and get NO jail time.

Citibank Had A Program To Take Money From Customers?

from the positive-balance-means-it's-ours dept

A few years back, I accidentally added an extra zero to a bill I paid for phone service. The company automatically credited the account, and a quick call got them to send a check for the overpaid amount. I know others who have accidentally paid a bill twice, or simply overpaid a bill because they didn't have the exact amount of the bill handy and weren't able to look up the specifics. In most cases, the companies in question would just credit the difference. However, it turns out that Citibank had a different idea. It apparently decided that if you overpaid a bill, you really were just donating free money to Citibank executives' bonus fund. The company actually ran "sweeping" software that would scan customer accounts for a positive credit and simply wipe it off the account, transferring the money to Citibank's general account. This wasn't just a small thing, either -- it went on for more than a decade, and the whistleblower who brought it up was fired. And, if you thought I was joking about the executive bonus fund, an executive from Citibank told investigators: "the sweep program could not be stopped because it would reduce the executive bonus pool." Of course, now the state of California has "convinced" Citibank

Thursday, September 04, 2008

Mayor down for the count, now the domino effect.


Mayor Kilpatrick has resigned and pleaded guilty to the felony charges. Police Chief Ella Bully-Cummings has also announced her retirement. Well, let the dominoes start to fall. Expect more announcements of resignations or retirements from City offices.

Next up: the Detroit Public School system and $400 million in misused funds. Then the debacle of City Council and its misconduct. The Mayor was only the tip of the iceberg of legal trouble and misconduct. But will voters learn to stop voting on name recognition and vote on the issues and character???? (Especially for City Council members.)

LINUX vs Windows

10 fundamental differences between Linux and Windows

by Jack Wallen


I have been around the Linux community for more than 10 years now. From the very beginning, I have known that there are basic differences between Linux and Windows that will always set them apart. This is not, in the least, to say one is better than the other. It’s just to say that they are fundamentally different. Many people, looking from the view of one operating system or the other, don’t quite get the differences between these two powerhouses. So I decided it might serve the public well to list 10 of the primary differences between Linux and Windows.

#1: Full access vs. no access

Having access to the source code is probably the single most significant difference between Linux and Windows. The fact that Linux is licensed under the GNU General Public License (GPL) ensures that users (of all sorts) can access (and alter) the code to the very kernel that serves as the foundation of the Linux operating system. You want to peer at the Windows code? Good luck. Unless you are a member of a very select (and elite, to many) group, you will never lay eyes on the code making up the Windows operating system.

You can look at this from both sides of the fence. Some say giving the public access to the code opens the operating system (and the software that runs on top of it) to malicious developers who will take advantage of any weakness they find. Others say that having full access to the code helps bring about faster improvements and bug fixes to keep those malicious developers from being able to bring the system down. I have, on occasion, dipped into the code of one Linux application or another, and when all was said and done, was happy with the results. Could I have done that with a closed-source Windows application? No.

#2: Licensing freedom vs. licensing restrictions

Along with access comes the difference between the licenses. I’m sure that every IT professional could go on and on about licensing of PC software. But let’s just look at the key aspect of the licenses (without getting into legalese). With a Linux GPL-licensed operating system, you are free to modify that software and use and even republish or sell it (so long as you make the code available). Also, with the GPL, you can download a single copy of a Linux distribution (or application) and install it on as many machines as you like. With the Microsoft license, you can do none of the above. You are bound to the number of licenses you purchase, so if you purchase 10 licenses, you can legally install that operating system (or application) on only 10 machines.

#3: Online peer support vs. paid help-desk support

This is one issue where most companies turn their backs on Linux. But it’s really not necessary. With Linux, you have the support of a huge community via forums, online search, and plenty of dedicated Web sites. And of course, if you feel the need, you can purchase support contracts from some of the bigger Linux companies (Red Hat and Novell for instance).

However, when you rely on the peer support inherent in Linux, you are at the mercy of turnaround time. You could have an issue with something, send out e-mail to a mailing list or post on a forum, and within 10 minutes be flooded with suggestions. Or those suggestions could take hours or days to come in. It seems all up to chance sometimes. Still, generally speaking, most problems with Linux have been encountered and documented. So chances are good you'll find your solution fairly quickly.

On the other side of the coin is support for Windows. Yes, you can go the same route with Microsoft and depend upon your peers for solutions. There are just as many help sites/lists/forums for Windows as there are for Linux. And you can purchase support from Microsoft itself. Most corporate higher-ups easily fall victim to the safety net that having a support contract brings. But most higher-ups haven't had to depend upon said support contract. Of the various people I know who have used either a Linux paid support contract or a Microsoft paid support contract, I can't say one was more pleased than the other. This of course raises the question "Why do so many say that Microsoft support is superior to Linux paid support?"

#4: Full vs. partial hardware support

One issue that is slowly becoming nonexistent is hardware support. Years ago, if you wanted to install Linux on a machine you had to make sure you hand-picked each piece of hardware or your installation would not work 100 percent. I can remember, back in 1997-ish, trying to figure out why I couldn’t get Caldera Linux or Red Hat Linux to see my modem. After much looking around, I found I was the proud owner of a Winmodem. So I had to go out and purchase a US Robotics external modem because that was the one modem I knew would work. This is not so much the case now. You can grab a PC (or laptop) and most likely get one or more Linux distributions to install and work nearly 100 percent. But there are still some exceptions. For instance, hibernate/suspend remains a problem with many laptops, although it has come a long way.

With Windows, you know that most every piece of hardware will work with the operating system. Of course, there are times (and I have experienced this over and over) when you will wind up spending much of the day searching for the correct drivers for that piece of hardware you no longer have the install disk for. But you can go out and buy that 10-cent Ethernet card and know it’ll work on your machine (so long as you have, or can find, the drivers). You also can rest assured that when you purchase that insanely powerful graphics card, you will probably be able to take full advantage of its power.

#5: Command line vs. no command line

No matter how far the Linux operating system has come and how amazing the desktop environment becomes, the command line will always be an invaluable tool for administration purposes. Nothing will ever replace my favorite text-based editor, ssh, and any given command-line tool. I can't imagine administering a Linux machine without the command line. But for the end user — not so much. You could use a Linux machine for years and never touch the command line. Same with Windows. You can still use the command line with Windows, but not nearly to the extent as with Linux. And Microsoft tends to hide the command prompt from users. Without going to Run and entering cmd (or command, or whichever it is these days), the user won't even know the command-line tool exists. And if a user does get the Windows command line up and running, how useful is it really?
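A few everyday examples of what that administrative command line buys you. Everything below uses only standard tools; the remote host name is hypothetical:

```shell
# Disk usage for every mounted filesystem, human-readable
df -h

# How many local accounts exist (one /etc/passwd line per account)
grep -c ':' /etc/passwd

# The five most recently modified files under /var/log
ls -t /var/log | head -n 5

# Remote administration is one command away ("webserver" is a hypothetical host):
#   ssh admin@webserver 'df -h'
```

Chaining small tools like this is exactly the kind of workflow the Windows command prompt was never designed around.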

#6: Centralized vs. noncentralized application installation

The heading for this point might have thrown you for a loop. But let’s think about this for a second. With Linux you have (with nearly every distribution) a centralized location where you can search for, add, or remove software. I’m talking about package management systems, such as Synaptic. With Synaptic, you can open up one tool, search for an application (or group of applications), and install that application without having to do any Web searching (or purchasing).

Windows has nothing like this. With Windows, you must know where to find the software you want to install, download the software (or put the CD into your machine), and run setup.exe or install.exe with a simple double-click. For many years, it was thought that installing applications on Windows was far easier than on Linux. And for many years, that thought was right on target. Not so much now. Installation under Linux is simple, painless, and centralized.
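To make the contrast concrete, here is what that centralized search/install/remove cycle looks like from the terminal on a Debian-family distribution (Synaptic is simply a graphical front end to the same APT system; the package name is just an example). These commands need root privileges and a network connection, so treat them as a sketch:

```shell
# Search the central package catalog for what you want
apt-cache search "image editor"

# Download and install it, with dependencies resolved automatically
sudo apt-get install gimp

# Uninstall it just as easily
sudo apt-get remove gimp

# Refresh the catalog itself
sudo apt-get update
```

Red Hat-family distributions offer the same model through yum rather than APT; the point is the single catalog, not the particular tool.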

#7: Flexibility vs. rigidity

I always compare Linux (especially the desktop) and Windows to a room where the floor and ceiling are either movable or not. With Linux, you have a room where the floor and ceiling can be raised or lowered, at will, as high or low as you want to make them. With Windows, that floor and ceiling are immovable. You can’t go further than Microsoft has deemed it necessary to go.

Take, for instance, the desktop. Unless you are willing to pay for and install a third-party application that can alter the desktop appearance, with Windows you are stuck with what Microsoft has declared is the ideal desktop for you. With Linux, you can pretty much make your desktop look and feel exactly how you want/need. You can have as much or as little on your desktop as you want. From simple flat Fluxbox to a full-blown 3D Compiz experience, the Linux desktop is as flexible an environment as there is on a computer.

#8: Fanboys vs. corporate types

I wanted to add this because even though Linux has reached well beyond its school-project roots, Linux users tend to be soapbox-dwelling fanatics who are quick to spout off about why you should be choosing Linux over Windows. I am guilty of this on a daily basis (I try hard to recruit new fanboys/girls), and it’s a badge I wear proudly. Of course, this is seen as less than professional by some. After all, why would something worthy of a corporate environment have or need cheerleaders? Shouldn’t the software sell itself? Because of the open source nature of Linux, it has to make do without the help of the marketing budgets and deep pockets of Microsoft. With that comes the need for fans to help spread the word. And word of mouth is the best friend of Linux.

Some see the fanaticism as the same college-level hoorah that keeps Linux in the basements for LUG meetings and science projects. But I beg to differ. Another company, thanks to the phenomenon of a simple music player and phone, has fallen into the same fanboy fanaticism, and yet that company’s image has not been besmirched because of that fanaticism. Windows does not have these same fans. Instead, Windows has a league of paper-certified administrators who believe the hype when they hear the misrepresented market share numbers reassuring them they will be employable until the end of time.

#9: Automated vs. nonautomated removable media

I remember the days of old when you had to mount your floppy to use it and unmount it to remove it. Well, those times are drawing to a close — but not completely. One issue that plagues new Linux users is how removable media is used. The idea of having to manually "mount" a CD drive to access the contents of a CD is completely foreign to new users. There is a reason this is the way it is. Because Linux has always been a multiuser platform, it was thought that forcing a user to mount media to use it would keep the user's files from being overwritten by another user. Think about it: On a multiuser system, if everyone had instant access to a disk that had been inserted, what would stop them from deleting or overwriting a file you had just added to the media? Things have now evolved to the point where Linux subsystems are set up so that you can use a removable device the same way you use it in Windows. But it's not the norm. And besides, who doesn't want to manually edit the /etc/fstab file?
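For readers who have never seen it, here is a minimal sketch of that manual mount cycle, plus the kind of /etc/fstab entry that permits it. The device and mount-point paths are illustrative and vary by distribution:

```shell
# Mount a CD, read it, and release it
sudo mount /dev/cdrom /mnt/cdrom
ls /mnt/cdrom
sudo umount /mnt/cdrom

# A matching /etc/fstab entry; "user" lets non-root users do the mount,
# and "noauto" keeps the drive from being mounted automatically at boot:
#   /dev/cdrom  /mnt/cdrom  iso9660  ro,noauto,user  0  0
```

Modern desktop environments automate all of this, but the mount/umount model is still what runs underneath.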

#10: Multilayered run levels vs. a single-layered run level

I couldn’t figure out how best to title this point, so I went with a description. What I’m talking about is Linux’s inherent ability to stop at different run levels. With this, you can work from either the command line (run level 3) or the GUI (run level 5). This can really save your socks when X Windows is fubared and you need to figure out the problem. You can do this by booting into run level 3, logging in as root, and finding/fixing the problem.

With Windows, you’re lucky to get to a command line via safe mode — and then you may or may not have the tools you need to fix the problem. In Linux, even in run level 3, you can still get and install a tool to help you out (hello apt-get install APPLICATION via the command line). Having different run levels is helpful in another way. Say the machine in question is a Web or mail server. You want to give it all the memory you have, so you don’t want the machine to boot into run level 5. However, there are times when you do want the GUI for administrative purposes (even though you can fully administer a Linux server from the command line). Because you can run the startx command from the command line at run level 3, you can still start up X Windows and have your GUI as well. With Windows, you are stuck at the graphical run level unless you hit a serious problem.
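On a SysV-init system, the rescue workflow described above boils down to a couple of commands (shown as a sketch; these need root, and the inittab line is the traditional place to set the default run level):

```shell
# Drop from the GUI (run level 5) to multi-user text mode to repair X
telinit 3

# ...fix the broken X configuration, then start the GUI for this session only
startx

# To make run level 3 the default on every boot, edit /etc/inittab
# and set the initdefault line to:
#   id:3:initdefault:
```

A headless server left at run level 3 keeps its memory for real work, and startx is always available on the rare day you want a desktop.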

Your call…

Those are 10 fundamental differences between Linux and Windows. You can decide for yourself whether you think those differences give the advantage to one operating system or the other. Me? Well I think my reputation (and opinion) precedes me, so I probably don’t need to say I feel strongly that the advantage leans toward Linux.

Tuesday, September 02, 2008

Computer worm strikes International Space Station

Oh no. they have attacked the starbase. The BORG are here...

It sounds like the proposal for a Hollywood sci-fi movie, but it has been revealed that astronauts in orbit around the Earth have been battling computer viruses. Keep your feet on the ground about the threat, and read more about the incident now.

http://www.sophos.com/blogs/gc/g/2008/08/27/computer-worm-strikes

Comcast sets monthly bandwidth limit for customers

By Grant Gross

Comcast, the largest provider of cable-based broadband service in the U.S., will limit residential customers to 250 gigabytes of bandwidth a month beginning Oct. 1, the company announced late Thursday.

Comcast will contact customers who go above the 250GB limit and ask them to curtail their use, Comcast said. If a customer goes over the monthly limit again during the following six months, Comcast will suspend service for a year.

Currently, Comcast contacts high-bandwidth customers and will suspend their accounts if they don't curb their use, but it has not set a firm bandwidth limit until now. Most customers contacted about their bandwidth usage agree to limit their activity, according to Charlie Douglas, Comcast's director of communications.

Earlier this month, the U.S. Federal Communications Commission struck down Comcast's past network management practice of slowing BitTorrent peer-to-peer traffic in an effort to reduce congestion. The FCC ruled that Comcast was violating so-called net neutrality principles by targeting a certain kind of Internet traffic.

The new bandwidth cap will affect less than 1 percent of Comcast customers, Douglas said. Those customers "are using so much bandwidth that they are degrading the experience of other users," he added. "Two-hundred-and-fifty gigabytes is an extremely large amount of data."

Some high-bandwidth users have asked Comcast to identify a specific cap so they know where the line is, Douglas added. Some other broadband providers also warn customers about excessive bandwidth use.

An average Comcast customer uses two to three gigabytes of bandwidth a month, Comcast said. To reach the 250GB limit, a customer would have to do one of the following: send 50 million e-mails, download 62,500 songs or download 125 standard-definition movies, the company said in its announcement.
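Those equivalences imply per-item sizes worth sanity-checking. A quick back-of-the-envelope calculation (binary units and integer arithmetic assumed; this math is mine, not Comcast's):

```shell
# Per-item sizes implied by Comcast's 250 GB examples
cap_mb=$((250 * 1024))            # 250 GB expressed in MB
cap_kb=$((cap_mb * 1024))         # ...and in KB

echo "per e-mail: $((cap_kb / 50000000)) KB"  # 50 million e-mails -> ~5 KB each
echo "per song:   $((cap_mb / 62500)) MB"     # 62,500 songs -> ~4 MB each
echo "per movie:  $((cap_mb / 125)) MB"       # 125 movies -> 2048 MB (2 GB) each
```

Roughly 5 KB per e-mail, 4 MB per song and 2 GB per standard-definition movie — all plausible 2008-era figures, which suggests the company's examples are internally consistent.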

Comcast has also looked at charging high-bandwidth users additional fees, and it still has not ruled out doing so in the future, Douglas said.

Comcast is also looking at "de-prioritizing" heavy users' traffic during times of network congestion. The plan Comcast is considering would slow heavy users' traffic for up to 20 minutes during times of the most congestion.

Comcast will notify customers of the new bandwidth limits using several methods, including banner ads at Comcast.net and notices sent with monthly bills, the company said. Some net neutrality advocates criticized Comcast for not telling customers of its previous network management plan to slow P-to-P traffic at times.

Some net neutrality advocates have said Comcast's new network management plan of targeting individual users is preferable to blocking Web applications. But others have suggested that those efforts may amount to penalizing its best customers.