A computer virus has infected the cockpits of America’s Predator and Reaper drones, logging pilots’ every keystroke as they remotely fly missions over Afghanistan and other warzones.
The virus, first detected nearly two weeks ago by the military’s Host-Based Security System, has not prevented pilots at Creech Air Force Base in Nevada from flying their missions overseas. Nor have there been any confirmed incidents of classified information being lost or sent to an outside source. But the virus has resisted multiple efforts to remove it from Creech’s computers, network security specialists say. And the infection underscores the ongoing security risks in what has become the U.S. military’s most important weapons system.
“We keep wiping it off, and it keeps coming back,” says a source familiar with the network infection, one of three who told Danger Room about the virus. “We think it’s benign. But we just don’t know.”
Military network security specialists aren’t sure whether the virus and its so-called “keylogger” payload were introduced intentionally or by accident; it may be a common piece of malware that just happened to make its way into these sensitive networks. The specialists don’t know exactly how far the virus has spread. But they’re sure that the infection has hit both classified and unclassified machines at Creech. That raises the possibility, at least, that secret data may have been captured by the keylogger, and then transmitted over the public internet to someone outside the military chain of command.
Drones have become America’s tool of choice in both its conventional and shadow wars, allowing U.S. forces to attack targets and spy on foes without risking American lives. Since President Obama assumed office, a fleet of approximately 30 CIA-directed drones has hit targets in Pakistan more than 230 times; all told, these drones have killed more than 2,000 suspected militants and civilians, according to the Washington Post. More than 150 additional Predator and Reaper drones, under U.S. Air Force control, watch over the fighting in Afghanistan and Iraq. American military drones struck 92 times in Libya between mid-April and late August. And late last month, an American drone killed top terrorist Anwar al-Awlaki — part of an escalating unmanned air assault in the Horn of Africa and southern Arabian peninsula.
But despite their widespread use, the drone systems are known to have security flaws. Many Reapers and Predators don’t encrypt the video they transmit to American troops on the ground. In the summer of 2009, U.S. forces discovered “days and days and hours and hours” of the drone footage on the laptops of Iraqi insurgents. A $26 piece of software allowed the militants to capture the video.
The lion’s share of U.S. drone missions are flown by Air Force pilots stationed at Creech, a tiny outpost in the barren Nevada desert, 20 miles north of a state prison and adjacent to a one-story casino. In a nondescript building, down a largely unmarked hallway, is a series of rooms, each with a rack of servers and a “ground control station,” or GCS. There, a drone pilot and a sensor operator sit in their flight suits in front of a series of screens. In the pilot’s hand is the joystick, guiding the drone as it soars above Afghanistan, Iraq, or some other battlefield.
Some of the GCSs are classified secret, and used for conventional warzone surveillance duty. The GCSs handling more exotic operations are top secret. None of the remote cockpits are supposed to be connected to the public internet. Which means they are supposed to be largely immune to viruses and other network security threats.
Predator and Reaper crews use removable hard drives to load map updates and transport mission videos from one computer to another, and the virus is believed to have spread through these drives. Use of such drives is now severely restricted throughout the military, but the base at Creech was one of the exceptions, until the virus hit. Drone units at other Air Force bases worldwide have now been ordered to stop using them.
In the meantime, technicians at Creech are trying to get the virus off the GCS machines. It has not been easy. At first, they followed removal instructions posted on the website of the Kaspersky security firm. “But the virus kept coming back,” a source familiar with the infection says. Eventually, the technicians had to use a software tool called BCWipe to completely erase the GCS’ internal hard drives. “That meant rebuilding them from scratch” — a time-consuming effort.
The Air Force declined to comment directly on the virus. “We generally do not discuss specific vulnerabilities, threats, or responses to our computer networks, since that helps people looking to exploit or attack our systems to refine their approach,” says Lt. Col. Tadd Sholtis, a spokesman for Air Combat Command, which oversees the drones and all other Air Force tactical aircraft. “We invest a lot in protecting and monitoring our systems to counter threats and ensure security, which includes a comprehensive response to viruses, worms, and other malware we discover.”
However, insiders say that senior officers at Creech are being briefed daily on the virus.
“It’s getting a lot of attention,” the source says. “But no one’s panicking. Yet.”
They number in the thousands now, mostly college kids and young adults. They’ve occupied prime public downtown spaces. They’re marching, holding up signs for the cameras, handing out literature, making and listening to speeches. Drumming and chanting, heard for blocks around, is perpetual background noise. A sea of blue plastic tarps dots the area, covering mattresses, sleeping bags and the meager possessions of protesters in what has become something of a tent city.
The dissidents have moved in. The police have already used pepper spray and batons and made hundreds of arrests. The nation’s leaders, and those who aspire to high office, are now being forced to weigh in.
The whole world is watching, and waiting. There is no way of knowing when, or perhaps more troublingly, how this will all end.
But this isn’t Cairo or Tunis, where earlier this year mass protests ignited the “Arab Spring” and the Tunisian and Egyptian people overthrew dictators in a matter of (semi) peaceful weeks. This is Liberty Square in downtown Manhattan, the center of American finance. The movement is “Occupy Wall Street.” The sound bite is “we are the 99%.” The demands are … unclear.
OWS — #occupywallstreet in the parlance that really matters in the age of Twitter — started small. On September 17, the first day of the protests, the fledgling group’s one loudspeaker was stolen mischievously by a passer-by, organizers were clueless when asked how food would be distributed, and police were content just to watch and wait. The feeling for some was that there wouldn’t be a real revolution unless they were actually being attacked. In the dismayed words of one Anonymous member there, “this is a hippie jam fest.”
But the demonstration grew and matured over three weeks, and is now spreading to other cities, even overseas. There is talk of an “Occupy the FED” demonstration — and the similarities to earlier Middle East uprisings don’t end there: Many got the word to join the protest via social media, and organizers are maintaining momentum and keeping tabs on things the same way.
“We get 20 tweets every 10-15 seconds, so it’s hard to keep up. You aren’t even finished reading the 20 and you get 20 more and it just keeps going. This is how it’s been for 18 days,” Eric Gibbs, one of the volunteers manning what appeared to be the command center of the movement, told Wired.com when we visited this week.
“What impresses me about media coverage of #occupywallstreet is how inattentive it is to a sleeping factor: the social media ignition moment,” said one of those tweets, from New York University journalism professor and social media watcher Jay Rosen.
It’s impossible not to make the Arab Spring analogy, especially when the organization’s own site openly encourages a favorable comparison. “We are using the revolutionary Arab Spring tactic to achieve our ends and encourage the use of nonviolence to maximize the safety of all participants,” reads the site. And, of course, the words “Arab Spring” link to the Wikipedia page, in case you missed all that earlier this year. As in the Arab Spring, there is no charismatic leader, just a sound bite and a plurality of populist voices. This was successful in North Africa and is proving successful here.
Gibbs stands in front of four computers pushed together on a card table, all equipped with webcams and with Livestream and Twitter pages open on three of the four. The fourth is being used to write a speech for the march. From below yet another tarp comes the buzz of a generator powering this DIY tech hub, kept out of sight because the police have prohibited generators.
Gibbs readily confesses his inexperience in political activism, but believes that most people here are like himself, called to action via the internet. In his case, it was via occupywallst.org’s chat room, where he found himself in a room of foreigners who wanted to know what was going on. They told him to get downtown.
Gibbs opened a Twitter account on day one of the occupation, and re-posted #OccupyWallStreet’s activity into the chat. Now, at day 20, he is one of the many voices coming from Liberty Square, keeping the world informed, play-by-play.
It’s hard to hear him over the din of what might be called the embodiment of their social media strategy: A comrade is shouting into his computer, answering questions from the Livestream chat room run from the Occupy Wall Street site. His pace is rapid-fire and his responses are passionate.
This is really how the world is watching: On the marches, they bring these computers, USB hot-spots plugged in, and show the world what is happening, as it happens. “Before we didn’t even have the webcams; we were just marching around with the computer facing the crowd. It was kind of weird,” Gibbs laughed.
Twitter and Facebook proved to be, and continue to be, a way to move information quickly, personally, and largely under the radar. Feeds and pages have materialized all over the country in solidarity with the OWS movement, each group now reaching a different, more localized audience.
“It’s much more diffuse and potent, and can be spread faster,” said Mark Bray, a volunteer who was standing under a cardboard sign with the word “press” scrawled onto it. “People can get the information without having to go out of their way. You can bump into it on Facebook without having to go to a specific website. In my experience, in just a short period of time it’s really changed things and brought this whole thing a lot more life.” This is certainly true, and as with the movements they seek to emulate, social media coverage of the protests and the arrests has uncovered a latent anger felt by many in the country who never thought to object until they saw others who held similar beliefs.
What doesn’t necessarily connect are the goals. The press is having a much harder time interpreting Occupy Wall Street than it did the movements in Tunisia and Egypt. One reason is that this time the narrative is unclear.
In the Arab Spring the goal was clear: Overthrow dictatorships that had been in place for decades.
How does one stop corporate greed in the United States, exactly?
But if the outcome isn’t clear — to anyone — the tipping point of frustration seems easy enough to pinpoint.
“It’s capitalism for us, and socialism for the capitalists, I think is a fair way to describe it,” Michael Lewis, financial reporter and author of Boomerang, told CBS Sunday Morning.
“Remembering that I'll be dead soon is the most important tool I've ever encountered to help me make the big choices in life. Almost everything — all external expectations, all pride, all fear of embarrassment or failure — these things just fall away in the face of death, leaving only what is truly important. Remembering that you are going to die is the best way I know to avoid the trap of thinking you have something to lose. You are already naked. There is no reason not to follow your heart.” — Steve Jobs, at a Stanford University commencement ceremony in 2005.
Time and time again, Jobs lived up to these words. He was an innovator, a phenomenon of the personal computing revolution. Jobs was a driving force behind bringing the PC into the home, and later became the man who shrank it down and made it portable.
A true purist at his core, Jobs endlessly strove for product perfection in order to deliver exactly what consumers wanted. And as he once famously said, “People don’t know what they want until you show it to them.”
To try to understand what made Steve Jobs a visionary, Wired.com takes a look back at the life of the man.
Jobs was destined to be a tinkerer from the start. Shortly after his birth in 1955, he became the adopted son of Paul and Clara Jobs, the former a machinist by trade and the latter an accountant. He was a bright child, though excitable (one teacher had to “bribe” him with candy to regularly attend her grade school classes).
After the family moved to Palo Alto in his adolescence, Jobs enrolled in a high school electronics course. The course led to a summer gig at the Hewlett-Packard factory, and would eventually introduce him to a friend who would shape his computing future for years to come.
Steve Jobs first met the man who would become his longtime collaborator and business partner, Steve Wozniak, through a mutual friend in Jobs’ high school electronics class. Woz and Jobs soon became fast friends, and together began attending meetings of the Homebrew Computer Club.
Eventually, Jobs took a stint as a technician for Atari, and worked with Wozniak on optimizing the circuit board for the game Breakout.
Woz eventually spun his work with HP into a full-time job. But when Woz found that the Homebrew hacker-types were more interested in his hardware computer designs than HP was, Jobs saw an opportunity.
Jobs and Woz founded Apple Computer on April 1, 1976, selling their own systems based on Woz’s designs to other computing geeks like themselves. “Steve was very fast-thinking and wanted to do things,” Wozniak once said. Jobs “always wanted to be an important person,” according to Woz, and he “wanted to do it by having a company.”
"I was lucky toget into computerswhen it was a very young and idealistic industry. There weren't many degrees offered in computer science, so people in computers were brilliant people from mathematics, physics, music, zoology, whatever. They loved it, and no one was really in it for the money," Jobs toldFortune.
Apple skyrocketed. In the four years from the company's inception in a garage to its 1980 IPO, Jobs went from virtual unknown to being worth hundreds of millions of dollars.
Jobs' name soon became a symbol of the up-and-coming, technologically savvy entrepreneur. His face was everywhere Apple’s brand could be seen, from Apple’s own advertising to the cover of Time magazine.
But Jobs was a sometimes controversial leader. During the middle of his tenure, he was rumored to give preferential treatment to the Macintosh team — makers of a product he considered one of his greatest creations — at the risk of alienating other product teams at Apple.
Amid internal political struggles and high tension with board members, in 1985 Jobs was fired from the very company he helped create.
While the subsequent years would prove tumultuous for Steve’s professional career, his personal life seemed to thrive. After delivering a lecture to a class of Stanford MBAs, Jobs met Laurene Powell, the woman who would eventually become his wife.
“I was in the parking lot [after the lecture], with the key in the car,” Jobs said. “I thought to myself: If this is my last night on earth, would I rather spend it at a business meeting or with this woman? I ran across the parking lot, asked her if she'd have dinner with me. She said yes, we walked into town and we've been together ever since.”
Shortly after they were married, Powell became pregnant with the couple’s first child, Reed Paul, named after Jobs’ alma mater (Reed College, where he spent only one full semester) and his father.
"I didn't see it then, but it turned out that getting fired from Apple was the best thing that could have ever happened to me. The heaviness of being successful was replaced by the lightness of being a beginner again, less sure about everything. It freed me to enter one of the most creative periods of my life,” Jobshas famously said of the time.
Jobs used that creative momentum to form NeXT Computer, which, despite multiple failures of its own, Apple eventually acquired. Jobs then became the de facto CEO of the company.
And so Jobs was able to return to Apple rejuvenated, re-inspired, and ready to “Think Different,” as the company’s marketing campaign of the time professed. Although directed toward consumers, the full text of the campaign seemed to echo Jobs' personal beliefs, almost a toast to who Jobs was. It read:
Here’s to the crazy ones. The misfits. The rebels. The troublemakers. The round pegs in the square holes. The ones who see things differently. They’re not fond of rules. And they have no respect for the status quo. You can quote them, disagree with them, glorify or vilify them. About the only thing you can’t do is ignore them. Because they change things. They push the human race forward. And while some may see them as the crazy ones, we see genius. Because the people who are crazy enough to think they can change the world, are the ones who do.
Jobs even foreshadowed Apple’s future game-changing directional shift toward mobile computing in a statement toFortunein 1996. “If I were running Apple, I would milk the Macintosh for all it’s worth — and get busy on the next great thing. The PC wars are over. Done. Microsoft won a long time ago.”
"Myjob is to not be easy on people.My job is to make them better. My job is to pull things together from different parts of the company and clear the ways and get the resources for the key projects. And to take these great people we have and to push them and make them even better, coming up with more aggressive visions of how it could be."
In 1998, Apple debuted the first iMac, a distinct departure from the "un-innovative beige boxes" of competitors, and the first in a string of innovative products that brought a degree of cool back to computing. Jobs also immediately canned a number of flailing projects such as the Newton and Cyberdog.
Instead, Jobs tasked the company to work on software, using NeXTSTEP to develop what would become Mac OS X. Jobs said, “We realized that almost all — maybe all — of future consumer electronics, the primary technology was going to be software. And we were pretty good at software.”
Jobs was endlessly dedicated to creating products that were not only functional; they were fun, sexy. Apple’s products are well designed, from the user interface to the industrial design, down to every minuscule detail.
“In most people’s vocabularies, design means veneer. It’s interior decorating. It’s the fabric of the curtains of the sofa. But to me, nothing could be further from the meaning of design. Design is the fundamental soul of a human-made creation that ends up expressing itself in successive outer layers of the product or service.”
Steve Jobs has made technology accessible to everyone, not just technophiles, nerds and developers. You don’t need to know how to program; you don’t have to install extra software or custom ROMs. The products just work, just as they should, straight out of the box.
When the iPhone was first announced on Jan. 7, 2007, and was finally available to consumers in June of that year, the world changed. Jobs ushered in the smartphone era for the masses.
“It’s taken the pain out of personal computing in so many different ways,” says Leander Kahney, editor and publisher of Cult of Mac (and a former Wired.com managing editor and news editor).
The iPhone, iPod and iPad are so intuitive that children as young as one or two use them easily, swiping to unlock, tapping open their favorite app, and playing a game or their favorite song. The user interface and experience are just straightforward.
And one more thing...
“This will be the most important thing I’ve ever done,” Steve Jobs said prior to the unveiling of Apple’s tablet, the iPad. He admitted to having worked on the device since the early 2000s, before Apple had even thought of developing the iPhone.
The iPad may be one of Jobs’ greatest technical legacies, but it’s the transformation in the way we now think about interacting with data, and the way that we think about mobile computing, that is even more significant.
Jobs’ legacy is post-PC. We can now hold a fully functional computer in our hands, in our pocket, and use it for communication, learning, work, play or creation. Post-PC is what he was always working toward, what he hinted at in 1996 when he said that Apple should get busy on “the next great thing” after the Mac.
“It’s in Apple’s DNA that technology alone is not enough. That it’s technology married with liberal arts, married with the humanities, that yields us the results,” Jobs said at Apple’s iPad 2 event. “And nowhere is that more true than in these post-PC devices.”
It is Jobs' appreciation for and understanding of the liberal arts and humanities that makes Apple devices stand out from the crowd. From the beginning, with the Macintosh 128K’s graphical user interface, mouse, word processor, and paint program, Jobs tried to bring computing to the mainstream, nontechnical crowd.
"The reason that Apple is able to create products like iPad is because we always try to be at theintersection of technology and liberal arts, to be able to get the best of both."
Steve Jobs announced in 2004 that he had been diagnosed with a rare, treatable form of pancreatic cancer. He first found out about the tumor during a routine check-up in October 2003, and after discovering it was curable, tried nine months of alternative methods to treat the cancer. When that failed, he had surgery to remove the tumor, and sent out a company-wide memo, a portion of which is below.
"I have some personal news that I need to share with you, and I wanted you to hear it directly from me. I had a very rare form of pancreatic cancer called an islet cell neuroendocrine tumor, which represents about 1 percent of the total cases of pancreatic cancer diagnosed each year, and can be cured by surgical removal if diagnosed in time (mine was). I will not require any chemotherapy or radiation treatments.''
In 2008, Bloomberg accidentally published Jobs’ obituary, which prompted Jobs to poke a little fun during Apple’s September event with a slide that read, “The reports of my death are greatly exaggerated.”
But although his surgery was successful, Jobs was beginning to look ill. Jobs cited a hormone imbalance as the reason for his increasingly emaciated appearance. He ended up taking a five-month leave of absence in 2009 shortly after that announcement, saying, “During the past week I have learned that my health-related issues are more complex than I originally thought."
Jobs took another medical leave of absence starting earlier this year. Both times he left, then-COO Tim Cook took charge of day-to-day operations of the company.
In August 2011, Steve Jobs officially stepped down as CEO of Apple and took the position of Chairman of the Board. He wrote to his company in a brief resignation letter:
“I have always said if there ever came a day when I could no longer meet my duties and expectations as Apple’s CEO, I would be the first to let you know. Unfortunately, that day has come.”
Steve Jobs was a singular leader and innovator. And we’ll continue to see his legacy live on in his products and the company he started in a Palo Alto garage over 30 years ago.
Steven Paul Jobs, 56, died Wednesday at his home with his family. The co-founder and, until last August, CEO of Apple Inc. was the most celebrated person in technology and business on the planet. No one will take issue with the official Apple statement that “The world is immeasurably better because of Steve.”
It had taken a while for the world to realize what an amazing treasure Steve Jobs was. But Jobs knew it all along. That was part of what was so unusual about him. From at least the time he was a teenager, Jobs had a freakish chutzpah. At age 13, he called up the head of HP and cajoled him into giving Jobs free computer chips. It was part of a lifelong pattern of setting and fulfilling astronomical standards. Throughout his career, he was fearless in his demands. He kicked aside the hoops that everyone else had to negotiate and straightforwardly and brazenly pursued what he wanted. When he got what he wanted — something that occurred with astonishing frequency — he accepted it as his birthright.
If Jobs were not so talented, if he were not so visionary, if he were not so canny in determining where others had failed in producing great products and what was necessary to succeed, his pushiness and imperiousness would have made him a figure of mockery.
But Steve Jobs was that talented, visionary and determined. He combined an innate understanding of technology with an almost supernatural sense of what customers would respond to. His conviction that design should be central to his products not only produced successes in the marketplace but elevated design in general, not just in consumer electronics but in everything that aspires to the high end.
As a child of the sixties who was nurtured in Silicon Valley, his career merged the two strains in a way that reimagined business itself. And he did it as if he didn’t give a damn who he pissed off. He could bully underlings and corporate giants with the same contempt. But when he chose to charm, he was almost irresistible. His friend, Heidi Roizen, once gave advice to a fellow Apple employee that the only way to avoid falling prey to the dual attacks of venom and charm at all hours was not to answer the phone. That didn’t work, the employee said, because Jobs lived only a few blocks away. Jobs would bang on the door and not go away.
For most of his 56 years, Steve Jobs banged on doors, but for the past dozen or so very few were closed to him. He was the most adored and admired business executive on the planet, maybe in history. Presidents and rock stars came to see him. His fans waited up all night to gain entry to his famous “Stevenote” speeches at Macworld, almost levitating with anticipation of what Jobs might say. Even his peccadillos and dark side became heralded.
His accomplishments were unmatched. People who can claim credit for game-changing products — iconic inventions that become embedded in the culture and answers to Jeopardy questions decades later — are few and far between. But Jobs had not one, not two, but six of these breakthroughs, any one of which would have made for a magnificent career. In order: the Apple II, the Macintosh, the movie studio Pixar, the iPod, the iPhone and the iPad. (This doesn’t even include the consistent, brilliant improvements to the Macintosh operating system, or the Apple retail store juggernaut.) Had he lived a natural lifespan, there would almost certainly have been more.
A note left outside the Apple Store in San Francisco on Wednesday night. (Photo: James Merithew)
Behind any human being is a mystery: What happened to make him … him? When considering extraordinary people, the question becomes an obsession. What produces the sort of people who create world-changing products, inspire by example and shock by justified audacity, and tag billions of minds with memetic graffiti? What led to his dead-on product sense, his haughty confidence, his ability to simultaneously hector and inspire people to do their best work?
His gene pool was intriguing. His biological parents were Abdulfattah John Jandali, a Syrian immigrant, and a graduate student named Joanne Simpson. Unmarried when her son was born on February 24, 1955, Simpson gave him up for adoption. She later married Jandali and had another child, the award-winning novelist Mona Simpson. Jobs grew up in a middle-class suburb with two loving parents, Paul and Clara Jobs. (He had a sister, Patti, who survives him.) Though he did make a successful effort to find his birth mother, he never seemed to warm to the theory that his drive was a subconscious reaction to a conjectured rejection. He always spoke highly of the family that raised him. “I grew up at a time where we were all well-educated in public schools, a time of peace and stability until the Vietnam War got going in the late sixties,” he said.
The turmoil of those sixties was also part of his makeup. “We wanted to more richly experience why we were alive,” he said of his generation, “not just make a better life, and so people went in search of things. The great thing that came from that time was to realize that there was definitely more to life than the materialism of the late fifties and early sixties. We were going in search of something deeper.”
He went to Reed, a well-regarded liberal arts school known as a hippie haven, but dropped out after a semester, choosing to audit courses informally — including a class on calligraphy that would come in very handy in later years. Jobs also took LSD in those years, and would claim that those experiences affected his outlook permanently and positively. After leaving Oregon, he traveled to India. All of these experiences had an effect on the way he saw the world — and the way he would make products to change that world.
Jobs usually had little interest in public self-analysis, but every so often he’d drop a clue to what made him tick. Once he recalled for me some of the long summers of his youth. “I’m a big believer in boredom,” he told me. Boredom allows one to indulge in curiosity, he explained, and “out of curiosity comes everything.” The man who popularized personal computers and smartphones — machines that would draw our attention like a flame attracts gnats — worried about the future of boredom. “All the [technology] stuff is wonderful, but having nothing to do can be wonderful, too.”
In an interview with a Smithsonian oral history project in 1995, Jobs talked about how he learned to read before he got to school — that and chasing butterflies was his passion. School was a shock to him — “I encountered authority of a different kind than I had ever encountered before, and I did not like it,” he said. By his own account he became a troublemaker. Only the ministrations of a wise fourth grade teacher — who lured him back to learning with bribes and then hooked him with fascinating projects — rekindled his love of learning.
Meanwhile, his dad, Paul — a machinist who had never completed high school — had set aside a section of his workbench for Steve, and taught him how to build things, disassemble them, and put them back together. From neighbors who worked at electronics firms in the Valley, he learned about that field — and also came to understand that things like television sets were not magical objects that just showed up in one’s house, but designed objects that human beings had painstakingly created. “It gave a tremendous sense of self-confidence, that through exploration and learning one could understand seemingly very complex things in one’s environment,” he told the Smithsonian interviewer.
After his call to Hewlett, Jobs worked at HP as a teenager. He later had a job at Atari, when the video-game company was just getting started. Yet he did not see the field as something that would satisfy his artistic urges. “Electronics was something I could always fall back on when I needed food on the table,” he once told me.
That changed when Steve Jobs saw what a high-school friend, Steve Wozniak, was doing. Wozniak was a member of the Homebrew Computer Club, a collection of Valley engineers and hangers-on who were thrilled at the prospect of personal computers, which had just become possible with the advent of low-cost chips and electronics. “Woz” was among several in the group who were designing their own, but he had no desire to commercialize his project, even though it was groundbreaking in its simplicity and was one of the first to include color graphics.
When Jobs saw his friend’s project, he wanted to make a business of it. While other home-brewers were also starting companies, Jobs was unique in understanding that personal computers could appeal to an audience far beyond geeks.
“If you view computer designers as artists, they’re really into more of an art form that can be mass-produced, like records, or like prints, than they are into fine arts,” he told me in 1983. “They want something where they can express themselves to a large number of people through their medium, and their medium is technology and manufacturing.” Later he would refine this point of view by talking about Apple as a blend of engineering and liberal arts.
The most visible manifestation of this was the elegant case that housed the Apple II. Jobs paid a fledgling industrial designer named Jerry Manock $1,500 to design a plastic case in an earthy beige. (Manock wanted to be paid in advance because, he told author Michael Moritz, “They were flaky-looking customers and I didn’t know if they were going to be around when the case was finished.” Jobs talked him into waiting for his payment.)
“He told me about the prices he was getting for parts, and they were favorable to the prices HP was paying,” his friend Alan Baum said. Jobs would make these deals while Woz and a small team of teenage engineers worked in the Jobs family garage. Every so often Jobs would drop by and impose his views on the project. “He would pass judgment, which is his major talent, over the keyboards, the case design, the logo, what parts to buy, how to lay out the PC board so it would look nice, the arrangement of parts, the deals we chose … everything,” said Chris Espinosa, one of the original group. One other thing Jobs did was convince Wozniak to quit his job at HP and work full time for Apple. When Woz originally demurred, Jobs called all of Woz’s friends and relatives, putting on so much pressure that the gentle engineer capitulated. Once again, Jobs had gotten what he wanted.
Jobs gave thought to what kind of company he wanted Apple to be — once he told me his wish was to create “a $10 billion company that didn’t lose its soul.” He would call up the premier CEOs of Silicon Valley — Andy Grove, Jerry Sanders — and ask them if they would take him out to lunch so he could pick their brains. He later realized that he and Woz were objects of curiosity to people because they were so young. “But we didn’t think of ourselves as young guys,” he said. “We didn’t have a lot of time to philosophize,” he told me. “We were working 18 hours a day, seven days a week — having fun.”
People gathered at the Apple Store in San Francisco Wednesday night to light candles and leave flowers and notes in memory of Steve Jobs. (Photo: James Merithew)
The Apple II was a hit, and so was the company. But unlike Bill Gates, who founded Microsoft in the same period, Jobs did not run Apple. Realizing that the company might go farther if run by professional management, and not a barefoot 22-year-old with a Fidel beard and an abrasive personality, Apple hired a chief executive for adult supervision. Over the next few years, Apple became the most popular of the small field of personal computers, holding that mantle until IBM entered the field in late 1981. And on Dec. 12, 1980, Apple held an IPO — highly unusual for a company that young.
As Apple became a larger business, Jobs was somewhat adrift. “The question was, ‘How do I go about influencing Apple?’” he explained in 1983. “Well, I can run around telling people things all day, but that’s not going to result in what I really want. So I thought a really good way to influence Apple would be by example — to be a general manager here at Apple.”
In 1979, as part of the effort to develop a more advanced machine called the Lisa, Jobs led a team of engineers on an excursion to Xerox PARC. He later described it as “an apocalypse.” He immediately declared that the principles of the Xerox Star — mouse-driven navigation, windows, files and folders on the screen — be integrated into the Lisa, an effort that jacked up the cost of the machine almost five-fold. But Jobs’ management style consistently offended the Lisa team, and he looked elsewhere in the company for a group to lead. He found what he was looking for in a skunkworks project off the campus led by a talented computer scientist named Jef Raskin. The small team was working on a low-cost computer to be called Macintosh. “When Steve started coming over, Jef’s dream was shattered on the spot,” said Mac team member Joanna Hoffman.
The Macintosh was a turning point for Jobs, who worried about being branded as the guy who founded Apple, but not much more. He was a relentless, even punishing leader. But his passion earned him the loyalty of the small young team. He encouraged them to think of themselves as rebels. “It’s better to be pirates than to join the Navy,” he told them. A skull-and-crossbones flag flew over their office building.
While the Lisa was inspired by Xerox’s “graphical user interface,” the Macintosh took it a step farther. It worked with even more simplicity, was faster, and had a distinctive shape — inspired by the Cuisinart food processor, an appliance Jobs admired. When I interviewed Jobs about the Macintosh in November 1983, he explained to me that while the Lisa team wanted to make something great, “the Mac people want to do something insanely great.”
During that interview I asked Jobs to explain why he sometimes gave harsh, even rude assessments of his employees’ work. (Though in some respects Jobs became more mellow later in life, such blunt criticism became a trademark.) “We have an environment where excellence is really expected,” he said. “What’s really great is to be open when [the work] is not great. My best contribution is not settling for anything but really good stuff, in all the details. That’s my job — to make sure everything is great.” Even though Jobs made life hell at times for the brilliant young engineers of the Mac team, they generally regard the experience as the highlight of their professional careers, a magic moment. And indeed, the Macintosh experience provided a template for the culture of many startups, down to the lavish perks provided to the workers.
On Jan. 24, 1984, Jobs publicly unveiled the Macintosh. Two days earlier, a stunning, cinematic Super Bowl ad for the computer had galvanized the nation; many consider it the greatest commercial in history. The Mac was a sensation. It also cemented Jobs as a national figure, the subject of major features in Newsweek and Rolling Stone. (Though he was disappointed that Rolling Stone did not put him on the cover. Jobs actually called publisher Jann Wenner to plead his case. Wenner told him, “Don’t hold your breath.” “I said, ‘All right, but you ought to think about this more,’” Jobs futilely recounted. Later, Jobs’ demands for magazine covers would be eagerly accommodated.)
The Macintosh was arguably the most important personal computer in history. It introduced a style of computing that persisted for decades. (Sadly for Apple, most people experienced the graphical user interface via Microsoft Windows computers, not the Macintosh.) It made computers sexy.
But the Mac did not initially sell as well as expected. This failure, as well as Jobs’ managerial shortcomings, put Jobs in jeopardy at the company he founded. For several weeks, he conducted a backroom battle with John Sculley, the former CEO of Pepsi he had personally recruited to run Apple in 1983. (Jobs had famously challenged Sculley by asking, “Do you really want to sell sugar water for the rest of your life?”) But Sculley outmaneuvered Jobs by winning the backing of the board. And on May 31, 1985, he fired Steve Jobs.
The ouster was cathartic for Jobs. “You’ve probably had somebody punch you in the stomach and it knocks the wind out of you and you can’t breathe. That’s how I felt,” he told Newsweek. But he regained his breath by starting Next, a company that designed and sold next-generation workstations. The Next computer, a striking jet-black cube, never caught on (though Tim Berners-Lee would write the code for the World Wide Web on it), but its innovative operating system turned out to be of lasting value, and Jobs kept the company going as a software concern.
During those years, Jobs took on a second company besides Next. A struggling computer graphics studio founded by George Lucas was looking for a white knight, and Steve Jobs took the role. It was to be called Pixar. Under Jobs’ guidance, Pixar morphed from a software company into a movie studio. It produced the first full-length computer-animated feature, “Toy Story,” the first of a series of monster hits for the studio.
Running Pixar was a step in Jobs’ growing maturity. He was wise enough to focus on the deal-making and let the creative movie-makers, like director John Lasseter, do their work. He also got valuable experience in Hollywood. Eventually, he sold Pixar to Disney in 2006 for $7.4 billion.
But it was that other company, Next, that brought Jobs back to the company he co-founded. Apple needed a powerful new operating system, and Next could provide one. Apple bought Next, but Apple’s troubles went far deeper. People were writing the company’s corporate obituary. In 1997, the board of directors fired CEO Gil Amelio and turned to the company’s co-founder to revitalize it. One of the first things Jobs did was forge a deal with Apple’s blood rival, Microsoft.
While Jobs emphatically stated that he was only filling an interim role at Apple — “I hope we can find a terrific CEO tomorrow,” he said that August — he took to it so enthusiastically that it was no surprise that he removed the lowercase “i” from his iCEO title in 2000. By then he had made Apple profitable again.
A turning point was his introduction of the iMac in May 1998. Almost a year after taking control of Apple, Jobs called and invited me to spend a few days with him as he launched his first big project. I got a glimpse of the exacting preparations he made for a launch, monitoring every detail. (He nixed the sound of a clarinet on the soundtrack to a video clip because it sounded “too synthetic.”) When an employee showed him some work at one point he said simply, “This is a ‘D,’” and turned away. But at the launch itself, he was the picture of poise.
The iMac was a huge success, an all-in-one machine that sent the message that simplicity, beauty and power would be behind Apple’s comeback. He also simplified Apple’s product line to four computers — consumer and pro versions of desktop and laptop. “Focus does not mean saying yes, it means saying no,” he explained. “I was Dad. And that was hard.”
But with each iteration of computers, Apple was gaining fans. The one exception was Jobs’ introduction of a monitorless machine called the Cube. It was perhaps the most beautiful computer ever. But in this case, Jobs let his aesthetic instincts overwhelm his sense of the marketplace. It was a rare failure.
In 2000, he explained how competitors still didn’t understand Apple’s mix of art and science. “When people look at an iMac, they think the design is really great, but most people don’t understand it’s not skin deep,” he said. “There’s a reason why, after two years, people haven’t been able to copy the iMac. It’s not just surface. The reason the iMac doesn’t have a fan is engineering. It took a ton of engineering and that’s true for the Cube and everything else.”
In October 2001, Apple introduced a music player, the iPod. It broke ground as the first successful pocket-size digital music player. Because Jobs had a tremendous ability to locate and hire brilliant talent, his team produced it in less than a year. The process was indicative of the way Apple ran. Though Jobs could be overwhelming in pushing his point, he understood that, ultimately, his products would not work if their best ideas were discarded. In the case of the iPod, hardware designer Tony Fadell knew how to get his best prototype approved by Jobs — he showed his boss three different designs, with one clearly superior, giving Jobs a chance to berate two efforts before saying, “That’s more like it!” to the last.
Sometimes, Jobs would dig in and back down only when the marketplace spoke. Again, the iPod was an example. Originally, he felt that the iPod should work only with Macintosh computers. But its instant popularity led him to agree with some of his employees who had been arguing for a Windows version. When the iPod became available to the entire population, it really took off. Apple has sold over 300 million iPods.
“If there was ever a product that catalyzed what’s Apple’s reason for being, it’s this,” Jobs said to me of the iPod. “Because it combines Apple’s incredible technology base with Apple’s legendary ease of use with Apple’s awesome design … it’s like, this is what we do. So if anybody was ever wondering why is Apple on the earth, I would hold this up as a good example.”
What’s more, to support the iPod, Jobs began the iTunes music store, the first successful service to legally sell music over the internet. Though the record labels were notoriously conservative about such deals, “They basically trusted us and we negotiated a landmark deal,” Jobs told me. The iTunes store would sell billions of downloaded songs.
The iPod was a turning point for Apple and Jobs. Competitors never figured out how to top it. Every year, he would come out with a new set. One year he stopped selling the most popular model, the iPod mini, in favor of a totally new model called the Nano. The product line would be laid out on a table. He’d talk about which color he liked best. Often he’d pick one up. “Isn’t that amazing?”
This satisfied him deeply because Jobs loved music. His heroes were Bob Dylan and the Beatles. I once asked him if his dream was to get Paul McCartney to perform one of those sweet two-song live sets that often closed his keynotes. “My dream,” he joked, “is to bring out John Lennon.”
While Jobs reveled in his professional spotlight, he was more circumspect about his private life. He distrusted most reporters, ever since a 1982 Time article mocked his pretensions and exposed his darker side. Jobs, who thought Time was going to make him Man of the Year (it chose “the personal computer” instead), was wounded. “I don’t mind if people don’t like me,” he said in late 1983. “Well, I might a little … but I really mind it when somebody uses their position at Time magazine to tell 10 million people they don’t like me. I know what it’s like to have your private life painted in the worst possible light in front of a lot of people.” Twenty years later, he would still be complaining about that article. (The writer, Michael Moritz, later became a powerful venture capitalist, funding Yahoo and Google.) But Jobs would not comment on subsequent accounts of his life that detailed not only rude professional behavior but his original refusal to support his first child (later he accepted paternity).
Jobs was a proud, proud father of four children, three from his marriage to Laurene Powell. He was protective of them — whenever he shared a story about one of his children in an interview, he cautioned that the remark was to be off the record. (His widow and all four offspring survive him.) But he clearly took huge pride in parenthood.
It was July 2004 when Steve Jobs learned he had a rare form of pancreatic cancer. He originally treated the disease without sharing much about it with the public. Critics wondered whether Jobs and Apple had skirted corporate disclosure regulations by not revealing more information. After what seemed to be a successful initial surgery, Jobs would vary from his circumspect stance just once, in his address to the Stanford graduating class of 2005. That speech, by the way, might be the best commencement address in history. When designing computers, Jobs and his team built the one they wanted for themselves. And now he gave a speech that Steve Jobs would have wanted to hear if he had graduated from college.
“No one wants to die, even people who want to go to Heaven don’t want to die to get there,” he told the Stanford graduates. “And yet, death is the destination we all share. No one has ever escaped it. And that is as it should be, because death is very likely the single best invention of life. It’s life’s change agent; it clears out the old to make way for the new … Your time is limited, so don’t waste it living someone else’s life.”
Steve Jobs never did that. After his cancer treatment, he took Apple’s biggest risk yet — developing a phone. Of course, it would not be just any mobile phone, but one that combined the media savvy of the iPod, the interface wizardry of the Macintosh, and the design style that had become his trademark.
As with all his products, Jobs was fanatical in monitoring every detail — including the press reaction. I was among the few journalists who got to test it before its release. Soon after I received the unit, I was walking down Broadway and my test unit got a call from “Unknown.” It was Jobs, ostensibly wanting to know what I thought, but actually making sure I understood how amazing it was. I acknowledged that it was extraordinary, but mentioned to him that maybe nothing could match the expectations he had generated. People were calling it the “Jesus phone.” Didn’t that worry him? The answer was no. “We are going to blow away the expectations,” he told me.
The iPhone did just that — especially after Jobs put aside his initial view that only a limited number of developers would be permitted to write applications for it. Apple’s App Store eventually included hundreds of thousands of programs, giving Apple a key advantage. As Apple’s current CEO boasted only Tuesday, the iPhone is the world’s most popular phone.
In 2008, observers noted that Jobs had lost an alarming amount of weight, and looked ill. People wondered whether the cancer had recurred. In what looks in retrospect to be misdirection, Apple released a statement calling it a “bug.” When I ran into him in Palo Alto around that time, Jobs brought up the subject, elaborating in detail about how he was suffering a temporary malady unconnected with his cancer. But he got thinner, and seemed weaker, and took a leave of absence.
Despite his health problems, Jobs kept Apple on a steady pace of innovation. When he returned to Apple — after a liver transplant which was acknowledged only months later — his first appearance was an iPod event. “This is nothing,” he told me after the show. “Wait till you see what’s next.”
He was talking about the iPad, the tablet computer that he introduced in early 2010. Expanding on the touch-based interface of the iPhone, Jobs had pulled off a vision of computing that many (including his rival Microsoft) had been attempting for decades. The iPad instantly established tablet computing as a major category, and as with the iPod, competitors could not match it.
Earlier this year, he took a second medical leave of absence. Tim Cook, the operational wizard who had been appointed chief operating officer, would take charge of day-to-day operations. Jobs would still be involved in product design and strategic direction, but freed of everyday responsibilities.
Jobs came and went to Apple as he was able, driven in a town car to One Infinite Loop in Cupertino, centerpiece of the campus of the company he built, only a few blocks from where he had gone to school. He would walk past the receptionist and take the elevator to his fourth-floor suite, which included his office, a small staff, and a large boardroom where he had overpowered music executives, raked employees over the coals, and approved products that millions adored. With no daily chores to perform, no crowded appointment book, there could be a strange and tranquil sense of timelessness, even as he helped shape products in progress, and dreamed up new ones.
It seemed Jobs had come to terms with his fate. He would spend time with his family and do what he could at Apple.
In June he gave his last “Stevenote,” talking about iCloud. One could have hoped that he would give many more. But on August 24, he sent a note to Apple’s board that he could not resume the CEO role.
He took the role of executive chairman and reported that he would continue to participate in product decisions and strategy. But clearly he was headed toward the end that came today, quietly surrounded by the people who loved him and knowing that many millions of people who never met him would miss him desperately. As he told the Stanford students:
Death is very likely the single best invention of life. It’s life’s change agent; it clears out the old to make way for the new.
The full legacy of Steve Jobs will not be sorted out for a very long time. When employees first talked about Jobs’ “reality distortion field,” it was a pejorative — they were referring to the way he got you to sign on to a false truth by the force of his conviction and charisma. But at a certain point the view of the world from Steve Jobs’ brain ceased to be distorted. It became an instrument of self-fulfilling prophecy. As product after product emerged from Apple, each one breaking ground and changing our behavior, Steve Jobs’ reality field actually came into being. And we all live in it.