Interview: Jim Rygiel (Three-time Academy Award winner for LOTR)

Now, as I was posting my VFX story (right below this one), I remembered an interview that I had done with Jim Rygiel, who was the VFX supervisor for the LOTR trilogy. It was my biggest international story at that time. And it was an amazing feeling interacting with Rygiel, as I am a great fan of his work, especially in LOTR. Not to mention that he was very candid and would not shy away from calling a spade a spade. He even spoke about the future of Indian VFX. The story was published on CIOL: (http://www.ciol.com/content/news/2006/106030604.asp)
*************************
The Lord of Visual Effects on VFX!

Excitement returned to the movies sometime in the late seventies and early eighties, when Star Wars, Jaws and E.T. scared, spooked and thrilled audiences worldwide. Steadily, the computer took its rightful place in the movie industry, making the impossible very much possible.

Over the years, the aliens became creepier, the monsters more monstrous and extraordinary events became more lifelike. Real and big are the two words that come to mind when one thinks of computer-generated (CG) effects: everything seemed so very real, and that too on a big scale. The Lord of the Rings (LOTR) trilogy epitomizes the progress of the visual effects industry; it is a landmark, something that can be compared to Armstrong's landing on the moon. Jim Rygiel was the visual effects supervisor for the three films directed by Peter Jackson. He was awarded the 'Best Visual Effects' Academy Award (or Oscar) for three consecutive years for LOTR, a record of sorts.

Jim started his career in 1980 by joining Pacific Electric Pictures, one of the earliest companies to employ computer animation for the advertising and film markets. He has worked as a visual effects supervisor on films like Species, Outbreak, Air Force One, Cliffhanger, Batman Returns, Alien III, Ghost, Anna and the King, 102 Dalmatians and, of course, the LOTR trilogy. He holds a degree in architecture and a master of fine arts (MFA) degree. He is currently working on films like Click. Jim spoke about how the visual effects industry is shaping up and what goes on behind the screen (especially in LOTR) to Shashwat Chaturvedi from Cyber Media News in an email interview. Excerpts:

You come from a fine arts background; how do visual effects fit into your profile? In visual effects, one is constantly trying to make the fantastical as believable as possible: more monsters, more realistic dragons, etc. Would you refer to visual effects as an art form?
I have always had the concept that anything I do in life is art. The craft and aesthetic of painting a house, cooking a fantastic dinner, or even doing one's finances takes special talents and skills that are crafted over time (it is like art). Similarly, visual effects are a combination of art, science, mechanics, and even some politics!

The Lord of the Rings trilogy has been heralded as the hallmark movies of a generation; how hard was it to make them a reality? What was the single most arduous task involved?
It was interesting working on Lord of the Rings; when I first approached it I didn't know what to expect. I knew that I could not possibly do the movie in New Zealand, and planned to take it all back to Los Angeles. However, within the first 45 minutes after landing, I was swayed by the technical advancements and filmmaking skills that Peter Jackson had built up in New Zealand. The biggest task was to get the entire crew moving forward in production mode.

In LOTR a lot of innovative technology was used, right from motion capture and keyframing to the proprietary software Massive (developed by Weta Digital). Can you briefly describe the various techniques that were used and the challenges in implementing them?
Well, we always tried to push the technology. We used virtual reality cameras (based on motion capture) to help us pre-visualize sequences. Motion capture ran for about three years straight, as we had to capture thousands of motions for all of the different characters across the three films. The motion capture was then fed into the Massive software, which drove the many different captured cycles. Our miniatures department ran for approximately 1,000 days, shooting all of the various pieces, which would be composited with our live action and CG effects.

You have supervised visual effects in many Hollywood blockbusters like Last Action Hero, Cliffhanger, Batman Returns, Outbreak, etc. How was your experience with LOTR different from these?
Usually when you work on a film there are one or two different types of effects. For instance, in Batman Returns we did some digital penguins that performed a few moves, and we also did some digital compositing, but it was basically the same thing. With LOTR, almost every shot had something different going on: cave trolls, flying fell beasts, giant Mumakil elephants, miniatures, live action, CG effects, it had it all!

What was the experience of creating Gollum (Smeagol)? Was it completely based on motion capture of actor Andy Serkis?
Gollum was an evolutionary concept. When we first started thinking about him we had a completely different vision for him; he was a bit more alien-looking than his current self. As the shooting progressed, Peter Jackson felt he needed someone to play the eyeline character for Frodo and Samwise, so he got Andy Serkis. Andy began by playing the eyeline character and reciting the lines back to the actors. As he honed his part, he was eventually reciting all of the lines for Gollum, so Peter said, hey, let's use Andy's voice for Gollum since he was doing it so well. As Andy got more into the part, he started hopping around like Gollum, and as Peter was editing the film, Andy was so amazing as Gollum that Peter asked us to copy his motions as closely as possible because he was perfect for the part. So we did a bit of what we call rotomation, which is a process where you bend the digital character to match, frame by frame, the real character (Andy). We also did some motion capture and keyframe animation, but I believe the best performance came out through the use of rotomation.
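To make the rotomation idea concrete, here is a minimal, purely illustrative Python sketch of per-frame pose matching. This is not Weta's pipeline: the planar two-joint "character", the synthetic tracked data and the finite-difference solver are all hypothetical stand-ins for what, in production, would be a full 3D rig matched against filmed footage of the actor.

import numpy as np

def forward_kinematics(angles, bone_lengths):
    # Joint positions of a simple planar two-bone chain for given joint angles.
    positions, point, total_angle = [np.zeros(2)], np.zeros(2), 0.0
    for angle, length in zip(angles, bone_lengths):
        total_angle += angle
        point = point + length * np.array([np.cos(total_angle), np.sin(total_angle)])
        positions.append(point.copy())
    return np.array(positions)

def match_frame(target_joints, bone_lengths, init_angles, steps=200, lr=0.05):
    # "Bend" the digital chain: nudge joint angles by finite-difference gradient
    # descent until the chain lines up with the tracked actor joints for one frame.
    angles = init_angles.copy()
    for _ in range(steps):
        base = np.sum((forward_kinematics(angles, bone_lengths) - target_joints) ** 2)
        grad = np.zeros_like(angles)
        for i in range(len(angles)):
            trial = angles.copy()
            trial[i] += 1e-4
            err = np.sum((forward_kinematics(trial, bone_lengths) - target_joints) ** 2)
            grad[i] = (err - base) / 1e-4
        angles = angles - lr * grad
    return angles

# Toy "footage": tracked joint positions of the actor over three frames.
bone_lengths = [1.0, 0.8]
tracked = [forward_kinematics(np.array(a), bone_lengths)
           for a in ([0.2, 0.3], [0.4, 0.5], [0.6, 0.4])]

solved = np.zeros(2)
for frame, target in enumerate(tracked):
    solved = match_frame(target, bone_lengths, solved)  # previous frame seeds the next
    print("frame", frame, "angles ~", np.round(solved, 3))

The point of the sketch is only the loop structure: solve the pose frame by frame, seeding each solve with the previous frame's answer, which is what lets the digital character follow the actor's performance.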

Of the Cave Troll, Gollum, Treebeard, and Shelob, whom do you find the most fascinating and why?
Well, Treebeard and the Ents gave us some struggle in creating the characters: the animators had to bring forth the age-old grandfatherly wisdom of the giant trees, and yet they had to have great strength and a bit of whimsy. However, it was the Balrog in the first film that was the most challenging and fascinating in terms of technical execution.

To get the feeling that 30-foot waves of fire and smoke were emanating through the character, we shot small flame elements and then used a particle system to create the 30-foot wall of fire that moved with and around the Balrog.
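The approach described above, filmed flame elements driven by a particle system, can be sketched in a few lines. The toy Python below is illustrative only: the emitter shape, forces and particle counts are invented placeholders rather than the actual Balrog setup, and in production each particle would be textured with the filmed flame elements instead of being a bare point.

import numpy as np

rng = np.random.default_rng(0)

class FireParticles:
    def __init__(self):
        self.pos = np.empty((0, 3))   # particle positions
        self.vel = np.empty((0, 3))   # particle velocities
        self.age = np.empty(0)        # seconds alive

    def emit(self, emitter_points, count=200):
        # Spawn new particles near random points on the character's silhouette.
        picks = emitter_points[rng.integers(len(emitter_points), size=count)]
        jitter = rng.normal(scale=0.1, size=(count, 3))
        self.pos = np.vstack([self.pos, picks + jitter])
        self.vel = np.vstack([self.vel, rng.normal(scale=0.2, size=(count, 3))])
        self.age = np.concatenate([self.age, np.zeros(count)])

    def step(self, dt=1.0 / 24, buoyancy=2.5, lifetime=1.5):
        # Advance one film frame: buoyancy lifts the fire, turbulence breaks it up,
        # and particles older than their lifetime are retired (they would fade out).
        turbulence = rng.normal(scale=0.5, size=self.vel.shape)
        self.vel += (np.array([0.0, buoyancy, 0.0]) + turbulence) * dt
        self.pos += self.vel * dt
        self.age += dt
        alive = self.age < lifetime
        self.pos, self.vel, self.age = self.pos[alive], self.vel[alive], self.age[alive]

# Hypothetical emitter: points spread over a roughly 30-foot (about 9 m) tall figure.
silhouette = np.column_stack([rng.normal(scale=1.0, size=500),
                              rng.uniform(0.0, 9.0, size=500),
                              rng.normal(scale=1.0, size=500)])

fire = FireParticles()
for frame in range(48):        # two seconds of footage at 24 frames per second
    fire.emit(silhouette)
    fire.step()
print("live particles after two seconds:", len(fire.age))

Because the emitter points would move with the animated character, the resulting sheet of particles moves "with and around" the figure, which is the effect Rygiel describes.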

What software did you use for LOTR? What was the hardware? What kinds of innovations were involved, like props, etc.? And how big a challenge was it to supervise a big and diverse team of visual effects artists?
We used Maya software for our animation, particle systems, and general scene setup. Shake was primarily used for compositing the elements together, we had farms of Intels and IBMs (about 6,000 on the last film) for our rendering power, and we used RenderMan for the final rendering of the CG effects. The artists themselves were fantastic; there was great diversity from around the world. On the last film, I supervised 450 global artists (we called ourselves digital migrant workers) as many of them went from job to job around the globe. Language was a barrier, but we all spoke a global CG language (Maya, Shake).

What are your views on audience habituation? People are becoming so accustomed to special effects in films like War of the Worlds, Jurassic Park, King Kong, etc. that effects wizards are locked in an upward spiral, an endless special-effects arms race, with demands for bigger explosions, uglier villains, and ever-expanding battle scenes.
From a job standpoint, I love it, as it keeps me employed. But in my book, story is always king! In all too many films we have seen amazing, crazy, mind-bending effects that have either gotten lost or suffered because of the lack of story. I love working on mega blockbusters as well as small intimate films; in both types of films my job is the same, that is, to help the director tell his story.

You have been in the industry for over two decades now; where do you think the special effects/animation industry is headed? What is the road ahead, so to say?
The nice thing about digital filmmaking is that it is limitless. Anything that can be thought can be achieved. There will be new venues, audience participation films, virtual reality films, and in the future, who knows, maybe even holographic films. Things are definitely getting more exciting in the visual effects world.

Tell us about your interactions with Peter Jackson; how was it working with him?
Well, it's amazing when you are working with a genius. Peter is as nice in person as he is in interviews; he absolutely knows what he wants and lets you know exactly what that is, there is no beating around the bush. He is very attentive to your feedback as an individual. It was truly amazing to work with a man like him.

Which movie has impressed you the most (in terms of special effects), and have you seen a film where the special effects were so overwhelming that it lost the plot in the end?
I guess I would have to say 2001, even though it was very simplistic in its approach (it was basically flying matte paintings). It was earth-shattering at the time in terms of its look. We were brought up on bad 50s sci-fi films, and then along comes a movie that actually showed us what living in space was going to be like, bringing the mundane-ness of everyday life into space: for instance, a Pan Am spaceship bringing you up to a Howard Johnson's hotel in space, and the AT&T space phone.

The Matrix 2 and 3 became overwhelming for me; the effects were fantastic, but it became too much of a good thing. I think it was a problem again with the story vs. the effects.

Have you ever seen an Indian movie? If yes, what are your views on it?
I have never been to India but would love to come over sometime. I occasionally turn on the Indian channel on television and watch the spectacle of the Indian film. I have no idea what is going on, but it is quite beautiful to watch all of the dances and costumes.

India is renowned globally for its IT prowess; do you think India can ever replicate the kind of work done by Weta in New Zealand?
Yes, absolutely! However, the machines are nothing; all of the IT power in the world is worthless unless you can get good artists to run those machines. The aesthetic needs to be addressed, much like the difference in aesthetics between the Indian film and the Hollywood blockbuster. However, I feel that the world is getting smaller, and I am actually doing some work in Mumbai with a company called Frame Flow; I used the same Indian producer for some work on Lord of the Rings.

Feature: State of VFX in India


It has been 25 years since Spielberg's E.T. was released, and sadly Indian cinema has not really been able to recreate that magic. It has nothing to do with technology; today Indian animators have worked on movies like Spiderman, Superman, Narnia, etc. According to me, there are two real aspects to it: one, the vision of the maker, and second, the ability. While there has been some improvement on the latter, nothing much seems to have changed on the former. Take the case of the much-hyped Roshan films (the first one was a copy of E.T.); they seem to completely take the 'story-telling' aspect for granted. It is all style, no real story. I really don't know when things will change; we need someone like Peter Jackson, who completely revolutionised how VFX was used. He is a true magician, putting life into inanimate things through the use of computers. Quite some time back, I had taken stock of the Indian VFX industry on CIOL: what is happening and what is possible. It is still valid today....
(http://www.ciol.com/content/news/2006/106012102.asp)
-----------------------------------------------------------------------

Will VFX arrive in India, Day After Tomorrow?



While Hollywood blockbusters like Day After Tomorrow saw a giant tidal wave submerge the Statue of Liberty, Indian films seem to be quite content to blow up a plane or two using computer effects...

Saturday, January 21, 2006


It's evening. Ann Darrow, a vaudeville artist, and the giant ape are sitting on the tallest peak of Skull Island, gazing at the setting sun. The view from the top is serene and beautiful, a world that is almost magical, too good to be real.

But that world does exist, on one of the workstations at Weta studios in New Zealand, a 'Matrixian' sort of world made of 0s and 1s. Visual special effects -- or VFX as they are popularly known -- are constantly blurring the divide between the real and the surreal; nothing seems impossible anymore, as the recent King Kong movie showed. Computers have brought the magic back to the movies.

Right from the very start, filmmakers have been trying to make movies that defy conventional reality, and in the process stretching the very limits of technology. George Méliès made the first sci-fi movie, A Trip to the Moon, in 1902, inventing something known as trick photography.

The next big thing was the original King Kong, made in 1933; Merian Cooper pioneered the use of stop-motion model effects. George Lucas' Star Wars in the 1970s opened the realm of possibilities with the use of robotics and computer effects. Steven Spielberg brought to life aliens in E.T., dinosaurs in Jurassic Park, sharks in Jaws and alien machines in War of the Worlds.

Finally, Peter Jackson went a notch higher: the Lord of the Rings trilogy proved what modern high-end computing can achieve. And if that was not enough, he put life into the giant ape King Kong. VFX in Hollywood is getting bigger by the day; every year, big blockbuster movies are released that rely heavily on VFX to pull in the audiences.

Bombay Dreams

In contrast, VFX in India is fairly primitive; we have barely achieved in Koi Mil Gaya what Spielberg had in E.T. way back in 1982! Abhishek De, producer (VFX), Maya Entertainment, is of the opinion that “we might be late, but we are catching up fast enough.” He cites the example of a film done by his company, Jajantaram Mamantaram: “It had over 90 minutes of special effects, fairly large by even international standards.”

Merzin Tavaria, creative director/VFX supervisor, Prime Focus, says, “VFX in India is coming up in a big way. Over the past 2-3 years, filmmakers have been ready to experiment and explore new possibilities. These are promising signs for the VFX industry in India.” Merzin talks about a project he is currently working on, titled Love Story 2050; he promises it will be a “real biggie” in terms of the VFX used.

Color of Money

What are the shackles that keep Indian VFX artists from achieving the same as their brethren in Hollywood? “Cost,” says Abhishek, adding, “a visual effects sequence can cost anything from Rs. 5,000 to Rs. 50,000 per second, or more, depending upon the complexity of the shot. Sadly, typical Hindi movie budgets are not in a position to commit that kind of money for VFX.” Thus, most Indian makers are content to just make their heroes do a somersault or leap from a rooftop with the help of VFX, or at best blow up an obnoxious aircraft. All this work just about scratches the surface of what is possible with VFX.

According to Merzin, it has more to do with ignorance: “Directors in India are slowly waking up to the possibilities. We are coaxing makers to dream big and creating a market for VFX in India.” Recently, Prime Focus was in the news due to its association with a film, Vaah! Life ho to Aisi, that reportedly had a VFX budget of Rs. 60 million.

The Rise of the Machine

Are we lacking in technology or machines? Do we have the same hardware as ILM or Weta does? Pankaj Kedia, regional sales manager (South East Asia & India), Autodesk Media and Entertainment Division, assures that we do. “All the latest technology and software used internationally is available in India and the studios here are also adopting them. The Indian market is growing quite fast, in fact it is the largest growing worldwide market for Autodesk solutions,” he says.
Two-thirds of the top-grossing films in Hollywood since 1993 have used Autodesk's effects and editing technology; films like Day After Tomorrow, Lord of the Rings, King Kong, etc. Pankaj also believes that there is a growing realization of VFX possibilities among Indian makers: “Sanjay Gadhvi (director of Dhoom) is more convinced about the use of VFX after using it in Dhoom, so we will see a lot more effects in Dhoom 2. Similarly, Rakesh Roshan is also going all out for Krrish after experimenting with VFX in Koi Mil Gaya,” he adds.
Yet, not every studio can afford Autodesk's latest machines: an Inferno system costs approximately Rs. 30 million, a Flame system close to Rs. 18 million, and a Flint system approximately Rs. 8-9 million. According to Pankaj, the VFX biggies in India are Prime Focus, VCL, Prasad EFX, Maya Entertainment, etc.

A Man Apart

Weta Digital was just another studio in Wellington, New Zealand, a decade or so ago. But that was before a maverick maker by the name of Peter Jackson decided to rewrite history. He embarked upon one of the most ambitious projects of all time: bringing to life J.R.R. Tolkien's epic Lord of the Rings. Now, Weta is a VFX powerhouse, giving the best studios in Hollywood a run for their dollar. That's the difference a single man's will can make. Do we need a desi Peter Jackson, who could dream big and then make it come true as well? “Of course, that would help,” says Abhishek from Maya, adding, “we indeed need makers for whom cost does not matter, only the vision does.”

The View from Beyond

Jesh Krishna Murthy has worked on films like Batman Returns, Lara Croft Tomb Raider, The Cell, etc. He is currently setting up shop in India, launching a company called Anibrain, targeting Hollywood and the Indian market. He is quite effusive on the subject, “VFX is not about machines or software; it has more to do with pushing the limits. I do not see many people in India doing that. And unless we really push ourselves hard, like developing software solutions, plug-ins, etc., we will never reach the level Hollywood has.”

Another person of a similar opinion is Jim Rygiel, three-time Academy Award winner for the Lord of the Rings (LOTR) trilogy. He was the VFX supervisor for the three LOTR films and is an industry veteran, having worked on films like Ghost, Cliffhanger, 102 Dalmatians, etc. He is of the view that Indian companies can replicate the success of Weta Digital: “however machines are nothing, so that makes all of the IT power in the world worthless unless you can get good artists to run those machines. The aesthetic needs to be addressed, much like the difference in aesthetics between an Indian film and a Hollywood blockbuster,” he says.

In conclusion, we have the machines, we have software, and we have the required talent. What we need now is the will: The will and self-belief to do anything.

Feature: CIA's take on computer, IT, Internet and so much more...

Quite recently, the CIA declassified a whole set of confidential documents, known as the 'Family Jewels'. It is certainly not the first time the CIA has done that; over the years there have been many such occasions. But this time, there was a lot of hype around it. Being an IT journalist, I was quite curious whether the gents at the CIA were keeping track of the way the humble computer evolved. So, I got cracking and was pleasantly surprised to discover that the agency really kept close tabs on the 'wonder of computers'.
Not only that, I chanced upon a lot of documents that detailed how the Soviets were failing in their IT designs, how India was emerging as an IT powerhouse and why China would not really succeed in its modernization efforts. The problem was that all these documents were scanned as images (at times quite unreadable), so I had to take printouts of quite a few that seemed interesting (I have a fat dossier now) and manually type the text in.
It was quite an effort, and I was a wee bit worried that some journo sitting across the globe at Time, or even the BBC, might be working on the same concept. Anyway, after many 'long nights' the story was done and was published on CIOL. The story gives an idea not only of what the CIA thought about computers, but also of how that thinking evolved over the years. I am quite confident that it makes for an interesting read.... (http://www.ciol.com/content/2170798471.aspx)
------------------------------------------------------------------------------

CIA on computers, technology and Indian IT

Culled from the declassified documents of the CIA, here is a look at how technology and computers have been shaping our world over the years. Of course, there is a lot of Bond, James Bond-like action involved...

Wednesday, July 21, 2007

The left-facing bald eagle perches atop a 16-point compass star. Metaphorically, the star denotes the search for intelligence and information outside the dominion of the US. Additionally, the compass star rests upon a shield, symbolic of defense and fortification. The emblem, or seal, of the Central Intelligence Agency (CIA), the premier surveillance and security agency of the US government, more or less says it all.

Shrouded in a cloak of secrecy, the CIA carries out covert work across different continents and countries. Not many know what all the agency does, and there is little that it doesn't do, according to the few who happen to know and have spoken out. Established just after the Second World War, in 1947, the CIA's mandate is to obtain and analyze information about foreign governments, corporations, and persons. The agency, or 'the company' as it is often referred to, is also mandated to carry out propaganda and handle public relations for the US administration. And finally, it is also consigliere and caporegime of the reigning US President, ever ready to do his bidding.

Over the years, there have been many accusations and charges against the CIA, ranging from recruitment of Nazi scientists and SS officers to hiding information about UFOs; experiments on unwitting American citizens to concerted efforts to assassinate Cuban President Fidel Castro. There have also been allegations that the Agency had alliances with opium lords in Burma, Thailand and Laos; an assassination program in Vietnam; complicity in the toppling of Salvador Allende in Chile; recruiting and training Economic hit men, the arming of opium traffickers and religious fanatics in Afghanistan including a certain Osama bin Laden; the training of murderous police in Guatemala and El Salvador; and involvement in drugs-and-arms shuttles between Latin America and the US, the list just keeps going on.

But then, most of this is just speculation, and there is little proof of it, unless and until the agency itself admits it, albeit obliquely. Over the past few years, the Agency has been declassifying records under the Freedom of Information Act. These declassified records range from elaborate reports to short summaries on a range of issues, right from the Cold War to Desert Storm. All these documents can be accessed from the CIA's website (http://www.foia.cia.gov/), using simple search functionality. Quite a few documents, dubbed the 'Family Jewels', were declassified recently.

Now, it is a well-known fact that the agency has also been an avid technology user; it was even supposed to own quite a few U2 planes and reconnaissance satellites. Its obsession with technology is two-fold: one, for its own use, and second, what other nations might be using to get an edge over the US. Right from the onset, the agency has been making confidential reports and assessing technology. The big push for the same came in the fifties with the launch of the Sputnik satellite by the erstwhile USSR.

So, how could the agency miss the irrepressible computer, the miracle machine that was evolving more or less around the same time? It certainly did not. On searching the term 'computer' on the CIA's website, one gets around 517 declassified documents, from the oldest, on the German textile industry using computers as a tool, published in August 1945, to the latest, on the WMD search in Iraq, dating from September 2004.
Studying the documents is like taking a trip down history, a proverbial time machine. One gets to know what people really thought about this wonder of a machine, what the possibilities were according to people way back then, and how things have changed.

A competent ‘mechanical slave’
In 1960, Joseph Becker authored a paper on “The Computer – Capabilities, Prospects and Implications”. The report talks about how “intelligence negotiates for the services of an obtrusive, demanding, but enormously competent mechanical slave.” That is how the computer is defined: a competent mechanical slave. First up, Becker distinguishes between two types of computers: the analog and the digital. The analog, he tells us, is more or less a mechanical device that “gives measurements of a continuum, notably time, direction, distance, or velocity, processes them mathematically as desired and displays the results in some measurable form.” A speedometer is an example of an analog computer.
A digital computer is like an abacus, he states. Digital computers deal with discrete numbers. “Numbers may be used to represent the letters of the alphabet or verbal symbols; the digital computer is the machine that has the major promise for handling the verbal data of intelligence,” he states.

The paper gives an instance of a digital computer: “Electronic computers like Remington Rand’s UNIVAC or IBM’s 700 series process numbers at speeds measured in millionths of a second, have an immense storage space or “memory,” function precisely and accurately, and can process letters of the alphabet when these are numerically coded, treating them internally as if they were numbers,” states Becker, as if wonder-struck. In fact, it is this wonder that really strikes you. Don’t we get impressed when we see a robot like Aibo doing things? Imagine what people must have thought way back then.

Becker talks about how there is a notion that “digital computers are endowed with near-human or even super-human qualities”, and discusses how a seasoned computer operator will argue that on occasion the machine has a ‘personality of its own’, and how “his emotional involvement with the machine is such that research is being done in man-machine relationships to arrive at the right mix of human factors for happy and efficient work with a machine as colleague or subordinate.” It seems that the computer back then had a sort of halo to it. But he states emphatically that computers do not “think” but are driven through a pre-determined set of operations.

In the future, Becker states, these computers would be extensively used in civilian and military work due to their high reliability and faster processing. He lists the different components of the EDP system as: input equipment, the computer, storage area, control mechanism and finally output devices. Becker also talks about a ‘recent innovation’, i.e., automatic programming. He talks about IBM’s Fortran language, which makes a programmer’s job quite a bit easier, as it contains only “38 statements”.

Becker also talks about how hardware technology continues to forge ahead, with the use of ‘cryogenic techniques and micro-miniaturization of circuitry’. “In the days to come very complex circuitry can thus be built into a cube of stacked wafers the size of a lump of sugar,” he says.

Yet, Becker’s prediction about the future of the digital computer is the real attraction. “Looking ahead, designers foresee the day when refrigerated computers the size of a portable TV set will operate on wall socket power. It is symptomatic that one of the serious design problems facing the computer engineers is that of minimizing the length of connecting wires, which becomes more and more critical as components get smaller and signal speeds approach the speed of light,” he states.

Looking back, one can say that he was not much off the mark; quite soon, the use of transistors curtailed the need for connecting wires, and then of course there were the ICs. In the report, Becker further details the applications of the computer and hints at how the CIA could benefit immensely from computerization, namely through having a central digitized repository of documents. He also talks about the use of computers for automated translation of documents in foreign languages and for studying photos in a better way. He also gives a sort of warning: “if we profit from the experience of industry during the past few years we should be prepared for some radical changes in the organizational structure as a result of the introduction of machines.” It could be that he was talking of labor substitution. All in all, the paper is quite detailed in its examination of the mechanized phenomenon, and he sure was not much off the mark.

The clandestine and conniving Soviets
One thing is for sure: after the Sputnik debacle, the CIA kept a hawk's eye on all that was going on behind the Iron Curtain. Throughout the years, there were numerous reports and assessments on how well or how badly the Soviets were faring. Sometimes they would be quite premonitory in nature, talking about how the USSR was rapidly deploying technology, and specifically computers, to good use. But more often than not, they debunked the Soviet efforts; it must have been quite a relief to the supreme commander at the White House.

One such report from 1977, titled “Soviet RYAD Computers: A Program in Trouble,” talks about how the Soviets were unable to create clones of the IBM 360 series. “The USSR continues to experience serious delays in the development, production, installation and effective use of its RYAD computers, which form the cutting edge of the Kremlin’s modernization program,” the author states. The reason, according to the report, was fairly simple: Soviet computers had been designed mainly for scientific applications and had insufficient internal memories.

The report also mentions how the Soviets had tried to clandestinely copy the IBM 360 series, and how they were funding different hardware initiatives across Eastern Europe, namely in Bulgaria, Poland, East Germany, and elsewhere. Even then they were failing, quite miserably so to speak.

That was from the hardware point of view; another report goes on to state how the USSR was also trying to source Western software and computer applications that could be replicated in Moscow and other places. And just as in hardware, they were failing miserably in software as well. “The leadership of the Soviet software R&D community recognizes -- as revealed in open-source literature -- that the USSR’s software industry is technically backward. The Soviets know that, if this problem is not corrected, Western countries will be able to extend their domination of advanced information-processing technologies for decades to come,” the author mentions.

Theorizing on possible reasons, the author states that the lack of success with hardware might be holding back the software industry: “The USSR is struggling with limited success to acquire and put into use the computer hardware that would be vital to a modern software industry.” According to the report, to overcome this limitation, the Soviets were trying to source software from Western nations clandestinely. They were trying their hands at data source codes, simulation software, algorithms for application development, utility programs, etc. Later on, in another paper, there is a big hint that India might have had a role to play in the technology transfer to the communist state.

India: an ally or a threat?
The Americans loathed the idea of the Russians (rather, the Soviets) playing around with their machines or software. Under no circumstances did the US government want the Soviets to get hold of IBM machines or software of any kind. Thus, for them, every state that had an affable relationship with the Soviets was a threat. And India happened to be one such state.
In June 1985, a report was published under the title “India: A Growing Role in Technology Transfer.” Even after declassifying it, the agency has censored close to 70 per cent of the report (there is still a lot they don't want the Indians to know). In the report, the author not only talks about the danger of India passing on the bits, bytes and ICs to the Soviets, but also, in a small way, captures the change that was taking place at that time.

There is a lot of talk of how prime minister Rajiv Gandhi was taking India high-tech and introducing a slew of reforms to that end. “Gandhi has introduced sweeping economic reforms that promise to encourage imports of sophisticated western technology and boost India’s ability to manufacture its own high-tech products. As a result, India will offer an increasingly attractive target for Soviet science and technology collectors in the years ahead,” the report warns.

It also talks about how, after decades of slow and uneven growth, India had developed a sizeable heavy industrial sector and an impressive economic infrastructure, and was poised for an unprecedented expansion into high-tech areas such as electronics and computers.

“We believe the scope for such growth is vast – India boasts a broad range of scientific activity and a large reservoir of highly educated and technically trained manpower,” the author states, adding that “Gandhi is eager to make Indian high tech goods competitive in world markets, and the development of computer software exports is receiving particular emphasis.”
The alarm bells were ringing due to the close Indo-Soviet relationship. But there was an interesting observation as well: that even the Soviets might not like India becoming more techno-savvy. “For the Soviets, India’s growing interest in expanded ties with the West and its quest for advanced technology present both a challenge and an opportunity,” the author mentions.
It seems that the American government had shared its concern with its Indian counterparts, as there is a mention of Indian security practices as well: “Indian officials believe their security procedures are adequate to deny the Soviets access to Western technology within the government, the military and public sector enterprises.”

Six years after the report was published, the USSR ceased to exist and the Indian economy was liberalized. Thus one cannot really say whether the American fears ever materialized.

The coming of the Internet
The CIA was also tracking the evolution of the Internet with interest. The main concern of the agency was quite obvious: the Soviets could use it to steal information. A report, “Soviet and East European Computer Networking: Prospects for Global Connectivity,” published in 1990, lists out those concerns. The agency's concern was triggered by the entry in April 1990 of the USSR, Bulgaria, Czechoslovakia, Hungary, and Poland into the European Academic and Research Network (EARN). “The entry is likely to have a profound effect on scientific communities throughout the Soviet Union and Eastern Europe. The Soviet Akademset network, for example, will become considerably more valuable if it provides a reliable bridge to vast foreign networks,” the author states.

The report also tells the story of the genesis of the Internet and talks about the “primary networks in the West”; it is quite informative in a brief manner. “The best example of a government-subsidized research network is the Internet, which grew out of the US ARPANET project. Starting in the late 1960s, computers at US Government, military, and commercial organizations conducting government-sponsored research were interconnected to permit the transfer of research data and maximize the utility of computers at various sites of the network. ARPANET spawned similar networks, such as the US National Science Foundation-sponsored NSFNET to provide remote access to its supercomputer centers, and these networks themselves interconnected (inter-networked) – the result was termed the Internet. The Internet carries only unclassified information, and it now extends to many foreign countries,” the author states.

In the same report, there is an interesting story about what could be the first “primed” hacking attempt. The report details a Soviet attempt, dubbed the ‘Hanover Hacker’ case, to hack US systems to gain information. The interesting thing was that they employed the help of programmers in West Germany (which was then an ally of the US) for the act.
“We have already learned about one Soviet effort to carry out illicit technology transfer via the Western research networks, the case of the ‘Hanover Hacker’. According to press reports, West Germans, acting at the direction of Soviet handlers, engaged in a form of espionage over the Internet. Breaking into user accounts on computers in the United States and Western Europe by exploiting poor network security – including easily guessed or default passwords – the Germans stole proprietary or otherwise sensitive data. Although none of the data was classified, some of it could be considered ‘technical data’ that might have been restricted from export to the USSR by COCOM,” the author states.

But he seems to be quite baffled by the Soviet attempt. “According to the information provided to the press, the Soviets paid a considerable amount of money for the information and stolen passwords they received. The sum of the damage caused by the hackers – even bearing in mind the possible effects of the transfer of technology – does not seem to be substantial enough to warrant the expenditure. The operation’s exposure, however, could be considered a fluke (the hacking was not detected by any intelligence or police agency in any of the affected countries – it was discovered and pursued by a US academic researcher) and the Soviets may have anticipated far more productive future collection from the use of stolen passwords,” he states.
Beyond that, there is not much on the Internet; maybe there are oodles of reports still waiting to be declassified.

Unmade in China
It was not only the Soviets that the Americans were worried about; they were also keeping an eye on the fledgling dragon, so to speak. China, too, was looked upon with distrust and was not thought to be capable of succeeding in its modernization efforts.

A report published in 1986, titled “China: Science and Technology Modernization,” is quite pithy in its evaluation: China, though aspiring to be a big technology powerhouse, could not really become one. “China has had little success in civilian applications of high-priority, advanced-technology sectors such as semiconductors, computers, and telecommunications. This failure will in our view hold back advancement in other high-priority areas, such as the development of automated production capabilities and communications networks for business and governments,” says the author.

Even then, the authors do note the energy and focus of the Chinese authorities in their modernization efforts. There is often talk of how senior government officials were quite serious about S&T modernization and had set stiff goals that had to be achieved.

The goals were to commercialize technology, increase the acquisition of foreign S&T, and finally break down the barriers between civilian and military research and production.
But it was a tough task, according to the authors, as working conditions tended to be poor and key skills – such as computer specialists – were still in short supply. The big plus for the modernization effort was the hundreds of Chinese students returning after being educated in Western universities. They had been exposed to Western technology and could be a great help in the change.

Yet there was some amount of wariness: “China’s S&T modernization program creates both opportunities and problems for the United States.” The good thing was that there were already signs of a much more “open China.” But a powerful China could also be quite assertive. “China’s growing S&T capabilities and accompanying improvements will increase Beijing’s ability to project power against the nations on its periphery,” the author states.

There is talk of how China might be able to somewhat narrow the gap in selected industrial technologies with the industrialized world over the next decade. “The degree of narrowing is uncertain, but catching up fully with the West in any of these areas is very unlikely.” The general outlook for progress in high-priority areas was summarized as below:

Microelectronics: Gap will widen
Computers: Gap will widen
Telecommunications: Gap will remain or widen
Automated manufacturing: Gap likely to persist
Transportation: Gap will remain fairly uniform
Energy: Gap will narrow
Special structural materials: Gap will narrow
Biotechnology: Gap will narrow

“Of course, China will try and avoid becoming overly dependent for technological assistance on the United States or any other country,” the author adds.

In retrospect, one can safely say that the author (or authors) could not have been more off the mark. The transformation of China went pretty well, so much so that foreign firms are shifting almost all their manufacturing to China. The country is no longer dependent on technological assistance but is a global powerhouse in its own right.

The world in 2015
Among the many reports, there is one that is quite interesting, titled “Global Trends 2015: A Dialogue About the Future with Non-government Experts.” The report was published in December 2000 and does some crystal-ball gazing into the future. It is quite an entertaining and speculative view of the future, and as we are currently somewhere in between (in terms of time), it still makes for an interesting read.

“The integration of information technology, biotechnology, materials sciences, and nanotechnology will generate a dramatic increase in innovation. The effects will be profound on business and commerce, public health, and safety,” the report notes.
It also states that in the following years, the time between the discovery and the application of scientific advances will continue to shorten.

There are many observations on political and security trends, economic dynamism, regional interaction and regional trends. But let's talk specifically about the IT front. “Local-to-global Internet access holds the prospect of universal wireless connectivity via hand-held devices and large numbers of low-cost, low-altitude satellites. Satellite systems and services will develop in ways that increase performance and reduce costs,” the author states.

The report goes on to list the countries that will play an important role in the IT revolution. “Among the developing countries, India will remain in the forefront in developing information technology, led by the growing class of high-tech workers and entrepreneurs. China will lead the developing world in utilizing information technology, with urban areas leading the countryside. Beijing’s capacity to control or shape the content of information, however, is likely to be sharply reduced,” it says.

“Discoveries in nanotechnology will lead to unprecedented understanding and control over the fundamental blocks of all physical things. Developments in this emerging field are likely to change the way almost everything is designed and made. Self-assembled nanomaterials, such as semiconductor ‘quantum dots,’ could by 2015 revolutionize chemical labeling and enable rapid processing for drug discovery, blood content analysis, genetic analysis, and other biological applications,” the author adds. There is also the mention of how by 2015, information technology will make major inroads in rural as well as urban areas around the globe.

But there is also a rider: “The rising tide of the global economy will create many economic winners, but it will not lift all boats. The information revolution will make the persistence of poverty more visible, and regional differences will remain large.”
Quite some thought is given to Indian IT. Obviously, it must have been the Y2K success that woke the Agency up to the potential of Indian IT.

“India’s economy, long repressed by the heavy hand of regulation, is likely to achieve sustained growth to the degree reforms are implemented. High-technology companies will be the most dynamic agents and will lead the thriving service sector in four key urban centers – Mumbai, New Delhi, Bangalore, and Chennai. Computer software services and customized applications will continue to expand as India strengthens economic ties to key international markets,” the author adds.

On China, the author is unwilling to bet, saying that “estimates of China beyond five years are fraught with unknowables.” But he is emphatic on Russia’s decline: “By 2015, Russia will be challenged even more than today to adjust its expectations for world leadership to the dramatically reduced resources it will have to play that role,” he states.

So this is how the world of 2015 was seen: India continues to grow stronger thanks to IT, and so does China, but not Russia. We are halfway there; it remains to be seen whether in the next few years the authors will be proven right or wrong (they seem to have miscalculated Russian resilience; Vladimir Putin seems to be doing quite well for himself and his nation, as of now).

Computers @ CIA
After talking about computers across the globe, one chances upon a rare report that gives an idea of how the computers at the CIA itself had evolved. A report, “30 and Thriving,” published in 1991, talks about the technology changes at the agency. At the onset, there was the AIWAC, the agency's first digital computer: a total-batch-processing system used for U2 measurements that worked on paper tape. Next came the IBM 407 and 1401, which generated “blip sheets” -- massive paper printouts with up to seven carbon copies of historical data and blank data-entry forms. The 1401 was an improvement, as it had four magnetic tape drives and “8,000 bytes of memory”!

In 1962 came the Univac 490 systems, with 32,000 words of memory, used for measurement and information processing. Further advancement came with the Univac 494, which brought the wonders of remote-batch processing. The agency went in for the Univac 1100 series in 1975, and in 1986 these were complemented with Sun terminals; it was then that the “light pens were replaced by the mouse.”

As of 1991, the year the report was published, the agency had two Unisys 1100/93s and one Unisys 1100/91. “Together,” the author says proudly, “these systems offer 160 million bytes of memory (approximately 152 MB) and 146 billion bytes (approximately 135 GB) of storage and process 48,000 transactions a day.”

Another document honors Albert D. “Bud” Wheelon, responsible for the creation of the Directorate of Science & Technology at the CIA. Wheelon was mandated to create the department in a jiffy, after the Cuban missile crisis in 1962. In the report there is a quote from a person who eulogizes Wheelon and talks of how research at the CIA and other agencies has helped the progress of science.

“S&T research contributes to the processing and analysis of the vast amounts of information which the agency collects daily. Research in improved data extraction, automatic database generation, machine translation and information retrieval – in multiple languages – has been a major effort supported by interagency groups, often led by S&T officers, and in several cases has led to commercial products. How else will the analyst or policymaker get the right amount of information in the form he wants on his desk when he needs it?” he says.

“The forerunner of the Pentium chip came into being because some visionary officers in the S&T who believed in a radical concept – the RISC processor – worked with industry in the early 1980s to see that it got a fair chance – now look where we are,” he adds, while giving other examples, like tools for urban planners, 3D software and facial recognition tools, that have come from the agency's stables.

So this is how the world of technology has evolved over the years. Surely there are hundreds of thousands of documents and reports that have yet to see the light of day. At least we can be sure of one thing after reading all these “Family Jewels”: there is so much more that we really don't know. But we do know that the bald eagle is keeping an eye on all of us, making discreet notes.

Feature: On war and technology

I was reading about the BrahMos missile today; it is indeed India's finest achievement in warfare technology (the disappointments have been no less: ALH, LCA, etc.). That future wars will be fought not by men but by machines is a reality; take Desert Storm as an example, or even Kosovo or Afghanistan. Technology and war go hand in hand. All this reminds me of a story that I had done a long time back, in fact when the troops had just invaded Iraq, that touched upon the man-machine connect. The story was published in the Financial Express. Read on..... (http://www.financialexpress.com/fe_full_story.php?content_id=31145)

---------------------------------------------------
Tech Rules But You Still Need A Man To Fight The Battle
Some 500 years before Christ, Babylon near modern-day Baghdad was a bustling centre. It is often referred to as the cradle of human civilisation. King Nebuchadnezzar, after conquering Jerusalem, built a city of unprecedented charm complete with sturdy fortresses, moat, drawbridges, temple of Baal, tower of Babel and of course the Hanging Gardens, one of the seven ancient wonders. It was sheer technology, way ahead of its times, that created these monuments.

And ironically, technology is again playing a crucial role in Iraq, this time in its destruction. War and technology go arm in arm. The invention of gunpowder was responsible for the colonial ambitions of Britain. The U-boats, torpedoes and flying bombers stoked the Third Reich’s expansionist ambitions. And technology is once more in the limelight with Operation Iraqi Freedom: the satellite-guided Cruise and Tomahawk missiles, the Joint Direct Attack Munitions (JDAMs), and the works.

From the 1991 Gulf War to Gulf War II, there has been a distinct shift in the way wars are waged, a shift in favour of technology. Not only have weapons evolved in terms of destructive power, they are also smart now. For instance, Cruise missiles can hover around the target for an hour or so, waiting for a ’go’ from the command centre. The whole operation is being run from thousands of miles away, in Virginia, US and Bahrain.

War is a logistics nightmare and technology comes to its aid. With the help of technology, there is much better co-ordination of the allied force, notwithstanding cases of ‘friendly fire’. It is now possible to manage and employ a variety of means of invasion — B-52 bombers flying down from the UK, Stealth bombers flying in from Diego Garcia, Apache helicopters from Kuwait, and missiles being fired from aircraft carriers in the Persian Gulf. All this is due to the intensive computerisation programme undertaken by the US defence department. There has been a seamless integration of sensors, communication devices and the weaponry systems in a single network. And this is the difference between the Gulf War in 1991 and the current one.

Says Lt General (Retd) Vinay Shankar, “In 1991 Gulf war, the weapons were in an experimental stage. Today, they are refined and calibrated. Take the example of the Patriot missile systems, they weren’t working then, but today they are bringing down the Iraqi missiles with amazing regularity.”

Any war is waged in three stages: surveillance, target acquisition and destruction. Spy satellites, UAVs (unmanned aerial vehicles) and AWACS have been hovering over the Gulf region for ages now. The command centre has created a databank of targets like Saddam’s palaces and Ba’ath party offices, running into thousands, and every strike is collated with the databank. Even the missile path for each Cruise and Tomahawk missile is charted on a computer, keeping in mind the shortest possible route, traditional airline flight paths and even high-rise buildings.
The other important aspect is target acquisition. In 1991, Saddam could fire Scud missiles at Tel Aviv, Israel, with impunity from mobile launchers as it would take hours for the allied forces to precisely pinpoint where it was launched from and then destroy it. But, now, with the AWACS hovering at 35,000 feet over Iraq, it is a matter of minutes before the launcher is located, and its coordinates sent to the nearest F-16 fighter jet on a sortie.
Says Air Marshal (Retd) VK Bhatia, “In the past, the pilot had to make an eyeball contact with the target before firing, thus factors like visibility and weather had a bearing on the strike. But with the current precision-guided range of weaponry, these factors have been made redundant. In fact, with the current GPS (global positioning system) guided missiles, the target can be changed at the very last minute.”

This has been made possible by the new approach of the US defence sector - the way it is embracing proven and cheaper technology. Pre-1991, the establishment went in for exclusive and tailor-made systems. But now, they are adopting cheaper, proven technology off the shelf.

“Operation Iraqi Freedom will change the way wars will be fought in the future. This kind of precision is unassailable, striking a target while buildings on either side are spared. In fact, I feel, it is the best advertisement for American armament manufacturers. No wonder they refer to the bombardment as ‘Shock ’n Awe’. These weapons are way ahead of their counterparts, even advanced nations like France, the UK and Russia cannot match the weapon systems of the Americans,” quips Jane Defence Weekly’s correspondent Rahul Bedi.

But technology comes at a cost. The precision munitions are as much as 30 times costlier than the ‘dumb’ munitions, but by virtue of their being accurate, they curtail the need for heavy bombardment.

Where does India compare with the military might of the American forces? “Nowhere,” says Lt General Shankar adding, “We are still generations behind, in terms of technology. The country will have to spend money in an intelligent manner to leapfrog into the big club, lest we be another Iraq.”

But Air Marshal Bhatia says that the armed forces are embracing technology much more readily. Take the 1999 Kargil War for instance, where the laser-guided pounding of Tiger Hill by the Indian Air Force helped in the quicker expulsion of the enemy. The Mirages were able to pound targets due to the technological advancements made.

The war has spilled over onto the World Wide Web too. During the previous Gulf War, the Internet was at a nascent stage. Today, it is uniting people across geographies for and against the war. Blogs have become a major source of information. Take, for instance, the blog by a supposed American soldier, Smash (http://www.lt-smash.us/), reporting from the front. On the other hand, there is Salam Pax (http://www.dear_raed.blogspot.com/), a resident of Baghdad, who provides an insight into these turbulent times. Scores of US websites have been defaced by anti-war protesters, while pro-war hackers have ensured that Al Jazeera’s English site (http://www.english.aljazeera.net/) does not run at all.

But as the battle reaches the urban landscape of Baghdad, the technological edge enjoyed by the allied forces may be more or less nullified, feel analysts. The Republican Guards are more attuned to the weather conditions and the terrain, and then it will be a victory of the fittest.
A centimetre off-track on a computer becomes miles on the field. As the old military man says, you can batter the enemy with missiles, but you still need a man to fight the battle.

Feature: Microsoft & Novell pact

Proprietary vs. open source is an interesting debate for any tech journo, so when Microsoft closed a pact with Novell last year, how could I resist penning my thoughts on this contentious issue? The story was published in Dataquest magazine and on CIOL.com and got quite a few comments from people all over the world. That's the best compliment for any journalist.... (http://www.ciol.com/content/search/showarticle1.asp?artid=91191)

---------------------------------------------------

Divide to rule?

The Microsoft-Novell pact has really jolted the IT Industry across the globe. What are the reasons? What could be the implications? There are no real answers; only a few guesses and dollops of hope.

Raymond Noorda must certainly be turning in his grave. It has been barely a month since he left for the pearly abode, and already his legacy has been undone. For over a decade, Noorda fought a relentless battle against the company at Redmond. Noorda was a former CEO of Novell, and to him, William Henry Gates III was an unscrupulous usurper who needed to be stopped at all costs. Novell and Microsoft were bitter enemies, nothing less and possibly more.

Thus, Microsoft came out with LAN Manager to beat Novell’s NetWare, and Novell went on a buying spree, acquiring WordPerfect, for instance, to take on the might of Microsoft in the desktop space. Sadly, Novell wasn’t David and burnt itself hollow in its battle with the Goliath. In 1993, Noorda parted ways with Novell to establish the Canopy Group, which invested in a whole lot of companies working in the open source space. Novell dragged on.

A decade or so later, Novell did a course correction and, in 2003, jumped on the open source bandwagon with the acquisition of SUSE (a few months before acquiring SUSE, Novell had acquired Ximian, an open source application developer). Despite the shift, Novell could never regain its past glory. It remained a distant second to the other open source major, Red Hat. That was the state of affairs until, a few days back, Novell decided to sell out.

Embrace, Extend, Exterminate
Since the eighties, Microsoft has been at loggerheads with some or the other of the IT majors. It is renowned for the subversive tactics it employs to nullify opposition. “Embrace, Extend, Exterminate” is supposedly the corporate philosophy it lives by. In its three decades of existence, innumerable companies have either been gobbled up by it or simply run out of existence. Gates (and now Steve Ballmer, the CEO) do not look kindly upon competition.

Sun, Oracle, Apple, IBM, you name it, all have been detractors of Microsoft. Google was one of the few companies able to steal a march over Microsoft and establish itself as a leader in the online space. Yet, one of Microsoft’s favorite bugbears has been a product with a cute penguin as its mascot: Linux. The open source movement is anathema to Microsoft. The company propagates proprietary systems and is loath to give anything away for free, let alone open up its own code.
‘Halloween documents’ is the name given to internal Microsoft memos that were leaked to the open source community in 1998. They are a revealing commentary on how Microsoft perceives competition, mainly Linux kernel-based operating systems. The memos dub open source software ‘a growing long-term threat to Microsoft's dominance of the software industry.’

The documents supposedly go on to acknowledge that certain parts of Linux were superior to the versions of Windows available at the time, and outline a strategy of "de-commoditiz[ing] protocols & applications", that is, basing networks and documents on proprietary standards so that they can interoperate only with machines running a Microsoft OS. That was at the turn of the millennium.

From competition to coopetition
Noorda, in his heyday, had popularized the term coopetition, i.e., cooperative competition. This philosophy is supposedly the basis of the Microsoft-Novell pact. The pact has been touted as a symbiotic breakthrough. Yet, on closer analysis, there seems to be fairly little that Microsoft is getting out of the deal. But then, remember what your kindergarten teacher repeatedly asked you to learn by rote: ‘appearances can be deceptive.’

Novell is in rag-tag shape; SUSE Linux was certainly not a match-winner. The deal with Microsoft seems to be a godsend for Novell. First is the cash inflow: Microsoft will be paying Novell a total of $380 million, which includes payment for SUSE Linux Enterprise Server subscription certificates as well as money for patent cooperation. Microsoft has also extended the olive branch; it will not sue Novell’s customers for patent infringement. It will also market Novell’s Linux version to its existing customers.

The new friends will also collaborate on the development of new technologies in the space of virtualization, management and document format compatibility (remember the Halloween documents). So now Linux, or more importantly SUSE Linux, and Windows will be interoperable. A great victory of sorts for the open source movement, or is it?

The devil is in the details
How Microsoft gains from the whole deal is a question that is rankling many minds. Going by Microsoft’s track record, it should not be too hard to extrapolate. The open source movement was turning out to be quite a formidable challenge for Microsoft (the likes of Google, Amazon and other Wall Street firms were using open source systems), though there was still some spadework to be done.

As of now, there are two main players servicing the open source market: Red Hat and SUSE Linux (Novell). While Red Hat has close to 80 per cent share of the market, Novell makes up most of the rest. There are a few smaller distributions like Ubuntu, Xandros and Linspire.

Strangely, just a few days before the Microsoft-Novell announcement, Oracle had decided to market its own version of an open source system, quite similar to Red Hat’s. Thus, Red Hat could give Microsoft a formidable challenge in the days to come. Now, in one stroke, the open source market has been divided into two camps: one blessed by Microsoft and, at the other end, the baiters. While it is quite fashionable for open source developers to chant the ‘Win-Down’ slogan, corporates and organizations would prefer a more peaceful and cooperative model. The preference for interoperable systems could boost Novell’s sagging fortunes and eat into Red Hat’s share (embrace and extend).

“Once the details of the agreement are clear, I anticipate Red Hat will react in some way or the other. There could be more surprises in store in the days to come,” Bhavish Sood, principal analyst at Gartner, tells CyberMedia News. He adds that there is not much clarity on the technology roadmap yet. “The goals are all good. What we await is a clear-cut strategy roadmap of how Microsoft and Novell will go about achieving them.”

The irony was not lost when Ballmer said at the press conference, “we’re here to announce a set of agreements that will really help bridge the divide between Linux and Windows.” Did Linux really need a bridge built in Redmond? Meanwhile, Ron Hovsepian, CEO, Novell, talked about how he had initiated the talks with Microsoft and how, in the end, “this announcement gives our customers interoperability and peace of mind all in one.”

Indian speaking
According to analysts and market sources, Indian players are quite excited at the prospects for the future. There is significant support for open source systems in India, and companies could now go in for heterogeneous systems, combining both Windows and Linux. “A majority of servers in India are already on the Windows platform; this would give certain users the liberty to go in for multiple environments, using SUSE Linux,” says Doug Hauger, chief operating officer, Microsoft (India).

Hauger also pooh-poohs the ‘embrace, extend, exterminate’ talk. “All this talk does not really make logical sense. No one owns or controls GPL (General Public License), so where is the talk of exterminating it? It all seems quite humorous,” says Hauger.

He agrees that Microsoft could look at a broader initiative in the future, involving more players like Red Hat. “This pact has really broken new ground. What I find most exciting is how mindsets will change in the days to come. The days of religious fervor (oh, I do not like Microsoft!) are over and have been replaced with technical and technological talk. This is the evolution towards a mature marketplace, a place where technology will take precedence over everything else,” he adds. Meanwhile, the Novell India team seems to be in a celebratory mood already. According to sources, the top management is currently in Paris for ‘official work’.

Sleeping with the enemy
The late Noorda had supposedly thwarted two acquisition attempts by Microsoft, after a failed merger attempt. If Gary Rivlin’s "The Plot to Get Bill Gates" is to be believed, Noorda liked to refer to Gates as "Pearly" and Ballmer as "the Embalmer." According to Noorda, Pearly promised you the heavens, while the Embalmer dug your grave.

Hopefully, it is a different Microsoft and a different strategy this time, and history will not repeat itself. Just one final piece of advice for Hovsepian: when you dine with the devil, make sure you do not end up on the menu. May Noorda’s soul rest in peace. Amen!

Interview: Alex Burns (COO, WilliamsF1 Team)

Alex had come down to India sometime back to sign a contract with Tata Technologies; it was then that I met him. He seemed quite eager and enthusiastic about his trip to India, and I had suggested some 'typical Maharashtrian' delicacies, like missal pav. He couldn't try them due to paucity of time, but promised that he would the next time round. Am still waiting to hear from him. This interaction was published in the Dataquest Magazine: (http://dqindia.ciol.com/content/cio_handbook07/GlobalCIO/2007/107022801.asp)
*************************

'IT is making us a whole lot quicker... to work at speeds similar to our race models'

The date May 1, 1994, has a special significance in the sport of F1 racing. It was the day when Brazilian Formula 1 driver Ayrton Senna da Silva died in a crash at the San Marino Grand Prix in Imola, Italy. He was racing for the Williams-Renault team and was in a winning position when his car crashed into an unprotected concrete barrier. Senna's death brought the dangers of the sport into the limelight.

Post-1994, all Formula 1 racing teams have put massive safety procedures in place to ensure that such an event does not occur again. IT plays a very critical role in this respect, as companies are using the latest computational technology for better car design.

WilliamsF1 has been on the racing circuit for around three decades and is renowned for the FW models it releases year after year. The company recently signed an agreement with Lenovo, which will be one of the main sponsors of the team. Sometime back, Alex Burns, chief operating officer, WilliamsF1, came down to India to visit Tata Technologies' facilities in Pune. In an interaction with Shashwat Chaturvedi from Dataquest, Burns talks about how his team is using IT and why outsourcing might not be such a bad term after all. Excerpts.

What role does IT play in the development of an F1 car?
Technology is a critical element behind the success of any F1 team. Today, without exception, every F1 team is investing heavily in the latest technology to get the best out of its models. Take the case of the FW28, used in the 2006 season: we used over 4,500 CAD drawings during the design phase. We are heavily dependent on computational fluid dynamics, telemetry and other technologies, not only to develop an F1 car but also to run it well.

To be frank, the FIA has introduced rules over a period of time that effectively slow down the cars. This is driven by safety needs, because otherwise all the teams would be trying to increase speed aggressively. There is a whole list of tests and reports that one has to complete before launching a model, including wind tunnel tests and crash analysis tests. Today, with the latest cutting-edge applications, this is not only cheaper than the traditional way but also a whole lot quicker. All this is only due to IT at work.

Once the model is up and running, what is the role that IT plays?
Running an F1 car is a highly data-intensive job. For instance, over a grand prix race weekend, close to 7GB of data is generated. This data needs to be meticulously analyzed, and the design changes need to be implemented quickly. This also requires speed; at times, we work at speeds similar to our race models. When the car is running, there are thousands of sensors attached all across the body, reporting on different parameters. Making sense of all this data and implementing changes quickly is a job that is best done with the help of IT.

We function at the very edges of technology. Our work is quite akin to the space industry. The components have a short life, we are constantly testing and incorporating changes. It is a very dynamic industry.

There is a general feeling that F1 racing has become overly technology driven, that the cars are more like computers. Your take?
I do agree that there is a general feeling of an overuse of technology, but you need to understand the reasons behind it. Since racing is a very dynamic and speed-driven sport, any small error can be huge, not only in financial terms but also in terms of risk to the driver. Thus, one has to ensure all the safety and security that one possibly can; this is where IT is extensively used. And to that end, I support the use of technology. But, at the end of the day, the car is just an entity in the hands of the driver, and it depends on the individual skill of the driver to steer the car to the premier spot.

What is the reason behind your engagement with Tata Technologies and the benefits of outsourcing?
The very same reasons that are driving a host of companies around the world, namely time and money. Developing an F1 model is big money and, as I said earlier, a lot of this goes into technology costs. Using the skills and facilities of companies like Tata Technologies, we intend to shorten the development time and also decrease costs. We have a production cycle from September to March; that's when we develop models for the next racing season. We will be working with Tata Technologies (Incat) on CAD models, etc., for the FW29. Hopefully, as time goes by, we will increase our engagement with Tata Technologies. The quality of skill at Tata Technologies is high, and they have the ability and the wherewithal to put in the requisite numbers if need be for a project. It has the makings of a great marriage. We can do things faster and also in a cost-effective manner due to our association with Incat.

A word or two on the upcoming FW29?
In many ways, the FW28 did not really meet the expectations of the WilliamsF1 team. We have learnt a lot from our outings in the 2006 season, and we are going to apply those lessons to the design of the FW29. As we are turning to the Toyota engine, hopefully things will be very different in the coming season.

Interview: Noble Coker (CIO, Disneyland HK)

One would be hard-pressed to find a person who does not know about Disneyland. But many do not know that behind all that magic there is a lot of IT and technology. It was amazing interacting with Noble Coker, the CIO of Hong Kong Disneyland; he is a man without airs and seems to love the challenges that keep cropping up all the time. This interaction was published on CIOL, the link is (http://www.ciol.com/content/developer/newsmakers/2007/107031601.asp)
*************************

Magic @ Disneyland

Noble Coker marched into the meeting room. This was his first interaction with his local team members as the CIO of the upcoming Hong Kong Disneyland. For the meeting, he had prepared an agenda and was ready to thrash it out with his colleagues. Strangely, throughout the meeting, the team kept mum. On being pushed by Coker, the team members would only show their appreciation for the project.

“They found it inconceivable to tell their boss that he was wrong, whereas that is very much the case in the US,” he recalls. This was Coker’s first culture shock, but he quickly learnt the ropes, and the next time round he was prepared with a solution.

To say that Coker is a fast learner would by no means be an exaggeration. After all, during his college days he learnt the Lao language from refugees who were being resettled in California, and later taught classes in the language "to pay my way through college." He joined PricewaterhouseCoopers as an analyst and was subsequently hired by Disney as a programmer. Rising up the ranks, he took up the challenge of overseeing the construction of the fifth Disneyland, in Hong Kong, and the rest, as they say, is history.

In a freewheeling interaction with Shashwat Chaturvedi from CyberMedia News, Coker talks about the “magical experience” at Disneyland and how IT makes it happen. Excerpts:

Can you tell us about the use of IT at Hong Kong Disneyland Resort? How much has been drawn from the other Disney parks in the U.S. and Europe and how has the HK Disney been unique in terms of technology adoption?
At Disneyland, the use of technology can be classified into four different categories. The first is business transactions: the use of IT in hotel reservations, merchandise sales, food point of sale, etc. The second category is communication: the use of email and IP telephony for internal communication. The third category, and a rather important one for us, is safety and security for our guests and visitors, covering everything from food to park monitoring. The fourth is entertainment; we use technology in a variety of ways to enrich the guest experience. Hence, we use IT extensively at our parks.

Giving a precise figure on how much technology is common between the Hong Kong and the international parks is a tough call, but I can hazard a guess that it must be around an 80:20 international and local mix, respectively. We have used technology in many unique ways in Hong Kong.

You have often emphasized creating a “magical experience” for the guests; can you share some instances of how technology is being used to create that “magical experience”?
When our guests walk out of our park, I would rather have them remember Disneyland for a great magical experience rather than a great technological one. I want people to say, “wow, how did they do this?”, and our technology is geared towards creating that very experience. Starting with the website, we have created an experience much like the one a guest will have at the park. At the park itself, we use a variety of tech applications. For instance, there is a wireless broadcast that synchronizes timing throughout the park; this ensures all the different elements, like the parade and the floats, work in perfect coordination.

We also have piped music running through the underlying infrastructure. Or take the case of the newly introduced FastPass at Disneyland. In the past, guests had to stand in long queues to be able to enjoy their rides. Now, with FastPass, they can register themselves for a slot later in the day and come back in that slot to enjoy their ride. This helps the guests spend more time on the rides rather than queuing up. Even our park attractions use technology to reinforce the ‘magical’ feeling. Take the case of Stitch Encounter, based on the Disney character Stitch. He dynamically interacts with the guests, and his responses are based on what the guest tells him.

One of the challenges (mentioned by you) was working with a multi-cultural team; how difficult or easy is it to work with diverse teams?
To be honest, working with multi-cultural teams can be extremely difficult if one is not sufficiently prepared for it. I committed a lot of mistakes and learnt from them. Such experiences force one to remove one's filters; filters that one acquires over time. The experience can be quite humbling. For instance, when I first came here, I was trying to achieve things without understanding the significance of different cultures. Typically, Americans have a bad habit of talking first and listening later, and of giving away a lot of content without much context. I quickly understood these issues and got down to working them out by understanding the people and learning more about their culture.

Do you think current-day CIOs pay much (more than required) attention to this aspect?
Though companies are going global, I still feel that, as CIOs, we do not pay much attention to cultural sensitivity issues. CIOs are naturally project driven and focused on getting the job done, and no one frets over such things. But I personally feel that resolving these issues can be critical to the success of a team.

What is the IT strategy roadmap for the future, i.e., technologies that are being tested for the future?
Going ahead, we have created a New Technology Group that has representatives from all the major Disney parks, in Hong Kong, France and the U.S. The group examines emerging technologies across the world and then applies them at the parks. For instance, some years back, mobile penetration in Hong Kong was much ahead of that in the U.S., so we perfected the mobile applications out here, and now they can be cross-deployed in the U.S. market.

Convergence across varying media will be a big thing in the days to come. Today, media is ubiquitous; there is a plethora of devices like iPods, mobile phones, etc. The challenge will be to deliver a multi-dimensional experience. Guests in the future will want a more enhanced experience, so when they visit the Tarzan tree house, they would like more information on Tarzan or even want to see a clip of the film. We are gearing up to deliver that enriched user experience. Disneyland will always be magical.

Interview: Scott Griffin (CIO, Boeing)

Scott Griffin is well respected in the industry for the way he has turned the tables at Boeing. For the past few decades, the battle between Boeing and Airbus has been about more than just 787s and A380s; it is a war that will continue for many years to come. And in such a scenario, the IT infrastructure can give a strategic advantage like none other. At least for now, Boeing seems to have its processes under control more efficiently than the European giant (which was beset by quite a few technological issues in the past). Griffin is ensuring that Boeing does not give away the lead. My interaction with him was published in the Dataquest Magazine, the link being: (http://dqindia.ciol.com/content/cio_handbook07/GlobalCIO/2007/107041201.asp)
*************************

'My biggest challenge is to speed up IT absorption to meet changing business requirements'

In 2006, Boeing overtook Airbus to become the world's largest civil aircraft company in terms of orders. In many ways, IT has been the driving force behind the company, which has over $60 bn in revenues.

Founded in 1916, Boeing was the first aircraft manufacturer to employ digital technology in designing an airliner, and under the aegis of the current CIO, Scott Griffin, it remains committed to technology. Griffin has been with the company for over two decades, working his way up through various departments. Currently, he is entrusted with the responsibility of not only ensuring that the 155,000 employees spread across the world are connected and productive, but also that the company stays a step ahead in the face of the onslaught from across the Atlantic, i.e., the Airbus A380. Scott Griffin, CIO, Boeing, shares his experience and vision with Shashwat Chaturvedi of CyberMedia News. Excerpts.

Boeing as a company has been evolving over the last many years, especially so in the last few, from an aircraft manufacturing company to an aerospace and defense technology firm. How is IT being used in this transition?
Information technology is the lifeblood of a technology company. It is used to create digital (3D) designs of parts, plans, tools and processes. It enables "design anywhere, build anywhere", design reuse, design partner collaboration and globalization of the supply chain, and provides tools for increasing productivity and growth.

The 777 was the first commercial airliner to be designed using CAD. How have technology systems evolved at the company, and how has IT been strategic to it?
777 was the first commercial, digital airplane. For the first time, we did not test the fit of the roughly 4 mn parts by building a mockup. We designed the parts and assembled them in the computer, using the Dassault Systemes CATIA CAD/CAM software. Then we checked their 'fit', using computer simulation.

Today, Boeing and our design partners do concurrent, 3D solid design of parts, plans, tools and processes. This has enabled us to take significant cycle time out of the design/build process, and will allow us to create derivative models with minimal effort.

With over 155,000 employees based across the globe, how do you ensure connectivity within the organization and what kind of IT infrastructure is in place?
Our infrastructure is global and standard. Boeing has customers, suppliers and partners in over 100 countries. The only way to provide a reliable IT infrastructure is to provide a standard IT infrastructure.

How has your role as a CIO changed at Boeing?
Boeing IT has become a proactive partner in helping the Boeing business units achieve their growth and productivity objectives. Boeing IT employees support the Boeing enterprise, but we also provide revenue-generating IT through our existing programs.

Which component or technology (enterprise) will take the major share of your company's budget pie in the future?
Collaboration systems and infrastructure will continue to be a key investment for us in the next few years.

Boeing has also been closely working with various Indian IT firms. How has the engagement been so far, and can you also touch upon the work that has been done on Indian soil?
In 1997, I became the CIO of Boeing Commercial Airplanes and within my first year, I had built partnerships with five Indian IT firms. We are still working with all of them today, and have added several more. My leadership team and I visit those IT partners in person at least once a year.

What is the IT roadmap for the future and can you touch upon some of the innovations that have been brought out by your teams at Boeing?
My systems strategy in support of Boeing is to 'buy and integrate.' Our intention is to buy commercial, off-the-shelf applications and integrate them into our architecture rather than write the applications ourselves. We intend to differentiate ourselves from our competitors in the way we use IT products, not in building the best IT products ourselves. That being said, Boeing has a healthy IT business, selling IT products to our customers, and we will continue to build IT solutions where we see external customer need.

Your views on the subject of cross-cultural teams?
Diverse and cross-functional teams provide the most innovative and timely IT solutions. Boeing is a global company, and our employees, suppliers and partners are diverse in terms of culture, nationality, and geography.

You have been associated with Boeing for over two decades. How has the journey been, and what would you term as your high points and the biggest challenges faced?
I have had fun dealing directly with airline and government customers. We are a customer-centric company, and it is challenging and rewarding to work directly with our customers. My biggest challenge, shared by my CIO peers, is speeding up the absorption of enabling information technologies in order to meet changing business requirements.