12 Technology Giants Who Left Us in 2011

We all know that Steve Jobs died this year. So did Dennis Ritchie, one of the creators of the Unix operating system that powers Apple's computers. Also departed were Jack Goldman, the man who came up with the idea to start Xerox PARC, where Jobs got his graphical user interface; Paul Baran, an Internet pioneer who developed packet switching; and Nobutoshi Kihara, who was known as Sony's "Mr. Walkman" long before Jobs was flogging the iPod.

So here are 12 technology giants who left us this year, and the amazing legacies they left behind.

Steve Jobs

Steven Paul Jobs, 56, died Wednesday at his home with his family. The co-founder and, until last August, CEO of Apple Inc. was the most celebrated person in technology and business on the planet. No one will take issue with the official Apple statement that "The world is immeasurably better because of Steve."

Dennis Ritchie
Dennis Ritchie is one of those little-hailed computer science pioneers whose work has ended up influencing everyone in technology. If your computer isn't running some component of Unix, the operating system he helped build, it's probably using software written in C, the language he created.

"It’s really hard to overstate how much of the modern information economy is built on the work Dennis did," said Rob Pike, a Google distinguished engineer who once worked across the hall from Ritchie at Bell Labs.

Gentle and far-from-flashy, Ritchie was in some ways the opposite of Jobs. He watched Unix and C take off because of their technical superiority, and he never cashed in on the success of his incredibly popular software. He stayed at Bell Labs for 40 years because he liked working with scientists and being able to stumble into canisters of liquid helium while at work.

Photo: Dennis Ritchie (standing) and Ken Thompson at a PDP-11 in 1972. Courtesy of Bell Labs


Ken Olsen
The first computer to run Unix was a PDP-7, and the company that built it was Digital Equipment Corp. (DEC), which was co-founded by Ken Olsen.

Olsen wasn't exactly a fan of Unix, but he was a computer pioneer back in the '50s -- starting up DEC with Harlan Anderson in an old wool mill in Maynard, MA. Thirty years later, Fortune magazine called him "arguably the most successful entrepreneur in the history of American business."

Back in the '80s, DEC was one of the hottest technology companies around. As DEC's longstanding president, Olsen built DEC into the country's number-two computer company, before it was finally swallowed by Compaq, which was itself consumed by Hewlett-Packard. DEC was a victim of the PC revolution that ultimately killed off the company's lucrative PDP and VAX systems.

Photo: Rick Friedman/Corbis



Paul Baran
There's a myth that the ARPAnet -- the precursor to today's Internet -- was designed specifically to withstand nuclear strikes. In fact, the first ARPAnet researchers were really just trying to figure out a way to make two computers talk to each other. We know this because Paul Baran -- the former RAND researcher who invented packet switching -- told us so back in 2001.

Baran was a prolific and wide-ranging thinker and inventor who started companies and developed technology for printers, modems, satellite transmissions, and even metal detectors, the LA Times wrote in his obituary. His son, David Baran, told the Times that the Los Angeles Police Department used to slip Paul Baran guns so he could test a metal detector he'd designed. That metal detector was later adopted by the Federal Aviation Administration, the Times said.

But packet switching may be his most important legacy. It's a way of chopping data into little bursts that can then be reassembled by the computers on the other end. It's how the Internet works. Incidentally, two other researchers -- Donald Davies, who actually coined the term "packets," and Leonard Kleinrock -- have also been credited with inventing packet switching. That's also how the Internet works.
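
The core idea is easy to demonstrate. Here's a minimal Python sketch, assuming illustrative fixed-size packets tagged with sequence numbers -- no real protocol's header format is implied:

import random

PACKET_SIZE = 8  # bytes of payload per packet (illustrative choice)

def packetize(message: bytes) -> list:
    """Chop a message into (sequence_number, payload) packets."""
    return [
        (seq, message[i:i + PACKET_SIZE])
        for seq, i in enumerate(range(0, len(message), PACKET_SIZE))
    ]

def reassemble(packets: list) -> bytes:
    """Rebuild the message, regardless of the order packets arrived in."""
    return b"".join(payload for _, payload in sorted(packets))

message = b"Packets may take different routes and still arrive intact."
packets = packetize(message)
random.shuffle(packets)  # simulate out-of-order delivery across the network
assert reassemble(packets) == message

Because each packet carries its own sequence number, the network is free to route packets independently; the receiving end just sorts them back into order.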

Photo: President George W. Bush presents Paul Baran a 2007 National Medal of Technology and Innovation. AP/Charles Dharapak



Jacob Goldman
You could argue that the PC as we know it wouldn't exist without Jack Goldman. After all, he was the guy who in 1969 came up with the idea of setting up an independent research lab for copier-maker Xerox Corp. Xerox had just shelled out an astonishing $918 million for Scientific Data Systems, and Goldman knew that computers were the future.

He wanted to set up a pure research facility -- modeled on AT&T's Bell Labs -- far away from corporate meddling, where researchers could dream up future technologies. And that's exactly what they did, inventing the graphical user interface, Ethernet, the laser printer, and object-oriented programming.

How important was Xerox PARC? Walter Isaacson nails it in a scene from his Steve Jobs biography. When Apple's CEO accuses Bill Gates of stealing the Apple graphical user interface, Gates says it was "more like we had this rich neighbor named Xerox and I broke into his house to steal the TV set and found out that you had already stolen it."

Goldman, a former Ford Motor Company researcher, didn't run Xerox PARC (the Palo Alto Research Center). That was physicist George Pake's job. But without Goldman, it never would have happened.

Photo: Xerox



John McCarthy
He coined the term "artificial intelligence," but to programmers, John McCarthy was more impressive for creating Lisp -- one of the oldest programming languages still in use -- and for being a major player in the development of time-sharing computer systems.

He also influenced several generations of computer geeks, both at the Massachusetts Institute of Technology (MIT), where he got his start, and at the Stanford Artificial Intelligence Lab (SAIL), which he founded in the early 1960s.

An AI purist, McCarthy was decidedly unimpressed when IBM's Deep Blue beat chess champion Garry Kasparov in 1997. "He believed in artificial intelligence in terms of building an artifact that could actually replicate human-level intelligence, and because of this, he was very unhappy with a lot of AI today," Stanford AI Lab professor Daphne Koller told us earlier this year. "He wanted AI to pass the Turing test."

Photo: Stanford



John R. Opel
In 1980, when IBM Chairman John Opel was a board member with the United Way, he got into a discussion with fellow board member Mary Gates. Opel's company was getting set to introduce its personal computer and was looking for an operating system. Just by coincidence, Mary Gates knew of someone who had a line on an OS: her son, Bill.

The rest, as they say, is history.

A lifelong IBMer, Opel ushered IBM's PC into the market, taking over as CEO of the company in 1981, and when he left four years later, IBM was in pretty good shape. The U.S. Department of Justice had dropped a long-running antitrust investigation and revenue had nearly doubled.

Photo: IBM


Ashawna Hailey
It's easy to forget that AMD has been slugging it out with Intel since the early 1970s. In fact, the company's first Intel-compatible chip, the 9080, was built in 1974 by a team led by Ashawna Hailey.

At the time, Ashawna was known as Shawn. She changed her gender, and her name, after retiring from the computer industry and reinventing herself as an Independent Entertainment Professional.

In a 1997 interview, Hailey remembered her salad days with the AMD team, saying, "We always said we started the microprocessor war between AMD and Intel, which we quite literally did."

Photo: Sarah Maren/Rainforest Action Network/Flickr



Jean Bartik
A lot of people know that the ENIAC (Electronic Numerical Integrator and Computer) was the world's first electronic computer. Less well-known is the fact that most of the ENIAC programming was done by a crackerjack team of six women.

The last living member of that team, Jean Bartik, died this year.

A math major at Northwest Missouri State University, she got the job in 1945 after seeing an advertisement in a math journal.

She and the five other women mastered ENIAC's less-than-friendly user interface to do the programming that helped the giant computing machine calculate artillery trajectories.

For most of Bartik's life, hers was an unheralded contribution to computer science, but that's changed in the past few years. In 2008, she was made a fellow of the Computer History Museum. A year later she received the prestigious IEEE Computer Society Pioneer Award.

Photo: Jean Bartik (left) at ENIAC's console with co-worker Frances Bilas. U.S. Army/Army Research Laboratory's Technical Library.



Nobutoshi Kihara
They called him "the Wizard of Sony" and "Mr. Walkman," but he'd probably rather have been remembered as "Mr. Betamax."

Betamax was one of hundreds of inventions that Nobutoshi Kihara came up with while he was Sony's chief inventor, and its video-standards-wars loss to rival format VHS is something that Kihara never quite got over. How did he feel about it? "My blood boils,” is the quote used in his New York Times obituary.

Despite the Mr. Walkman moniker, Kihara didn't actually create the Walkman. But without his groundbreaking work on magnetic tape recorders, videotape, and digital photography, the Walkman -- and, indeed, Sony as we know it -- would simply not exist.

Photo: Nobutoshi Kihara, right, works with Sony co-founder Masaru Ibuka, left, in 1963. Sony Corp.


Robert Galvin
He ran Motorola for nearly 30 years, ushering in the world's first portable telephone and turning the small two-way radio manufacturing company he inherited from his father into the world's leader in cellular phones.

"He probably single-handedly provided this firm with more leadership and guided it through more innovation than any other single person in our 83-year history," Motorola Solutions CEO Greg Brown told USA Today after Galvin's death.

Photo: Motorola



Charles Walton
Charles Walton patented the technology used in RFID chips in the 1970s. Back then, RFID was too expensive to compete with its main rival -- bar-code scanning -- but today, the tiny wireless radio devices are used by retailers everywhere to keep track of their inventory.

There is most likely an RFID tag in your passport, and the devices are almost everywhere nowadays. You can find them in everything from casino poker chips to luggage tags to barnyard animals.

But Walton didn't make a ton of money from RFID. The technology didn't really take off until after his patents had expired. Still, he made a few million bucks -- enough to keep doing the technology work that he loved, according to VentureBeat. "I feel good about it and gratified I could make a contribution," he said about his invention.