Saturday, September 13, 2008
Rel Com - MTN Deal
The problem arose when Mukesh Ambani stepped into the debate, stating that, under the agreement of January 12, 2006 (the split agreement between the two brothers), RIL has the right of first refusal on any sale of ADAG companies. Anil Ambani, owner of Reliance Communications, has reacted angrily, saying that RIL is not really interested in the company and that these claims are intended only to stop the MTN deal from happening.
It is now believed that top MTN officials are in Mumbai to meet the big bosses at Rel Com and strike the deal.
Friday, September 12, 2008
Neural Networks and Artificial Intelligence
- A biological neural network is a plexus of connected or functionally related neurons in the peripheral nervous system or the central nervous system. In the field of neuroscience, it most often refers to a group of neurons from a nervous system that are suited for laboratory analysis.
- Artificial neural networks were designed to model some properties of biological neural networks, though most applications are of a technical nature, as opposed to cognitive models.
Neural networks are made of units that are often assumed to be simple, in the sense that each unit's state can be described by a single number, its "activation" value. Each unit generates an output signal based on its activation. Units are connected to each other in a specific pattern, each connection having an individual "weight" (again described by a single number). Each unit sends its output value to all other units to which it has an outgoing connection, and through these connections the output of one unit can influence the activations of other units. A unit receiving connections calculates its activation by taking a weighted sum of the input signals, i.e. it multiplies each input signal by the weight of the corresponding connection and adds up these products. The output is then determined by the unit's activation function applied to this activation (e.g. the unit generates output, or "fires", if the activation is above a threshold value). Networks learn by changing the weights of the connections.

In biology, a neural network is composed of a group or groups of physically connected or functionally associated neurons. A single neuron can be connected to many other neurons, and the total number of neurons and connections in a network can be extremely large. Connections, called synapses, are usually formed from axons to dendrites, though dendrodendritic microcircuits and other connections are possible. Apart from electrical signalling, there are other forms of signalling that arise from neurotransmitter diffusion and influence electrical signalling. Thus, like other biological networks, neural networks are extremely complex.
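To make the description above concrete, here is a minimal sketch in Python of a single unit that takes a weighted sum of its inputs and "fires" when that sum crosses a threshold; the input values, weights and threshold are purely illustrative and not drawn from any particular network.

def unit_output(inputs, weights, threshold=0.5):
    # The unit's activation is the weighted sum of its input signals.
    activation = sum(signal * weight for signal, weight in zip(inputs, weights))
    # A simple threshold activation function: "fire" (1) if the activation
    # exceeds the threshold, otherwise stay silent (0).
    return 1 if activation > threshold else 0

# Illustrative values only: three input signals, one weight per connection.
print(unit_output([1.0, 0.0, 1.0], [0.4, 0.9, 0.3]))  # activation = 0.7, so the unit fires (prints 1)

Learning, in this picture, simply means nudging the weights so that the unit's outputs move closer to the desired ones over many examples.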
While a detailed description of neural systems currently seems unattainable, progress is being made towards a better understanding of the basic mechanisms. Artificial intelligence and cognitive modelling both try to simulate some properties of neural networks. While similar in their techniques, the former aims to solve particular tasks, while the latter aims to build mathematical models of biological neural systems. In the artificial intelligence field, artificial neural networks have been applied successfully to speech recognition, image analysis and adaptive control, and to the construction of software agents (in computer and video games) and autonomous robots. Most artificial neural networks currently employed for artificial intelligence are based on statistical estimation, optimisation and control theory. Cognitive modelling is the physical or mathematical modelling of the behaviour of neural systems, ranging from the individual neuron level (e.g. modelling the spike response of a neuron to a stimulus), through the neural cluster level (e.g. modelling the release and effects of dopamine in the basal ganglia), to the complete organism (e.g. behavioural modelling of the organism's response to stimuli).
Thursday, September 11, 2008
Ipv6 - The Next Generation Protocol
Definition
The Internet is one of the greatest revolutionary innovations of the twentieth century. It made the 'global village' utopia a reality in a remarkably short span of time. It is changing the way we interact with each other, the way we do business, the way we educate ourselves and even the way we entertain ourselves. Perhaps even the architects of the Internet did not foresee the tremendous growth rate of the network being witnessed today. With the advent of the Web and multimedia services, the technology underlying the Internet has come under stress.
It cannot adequately support many of the services now being envisaged, such as real-time video conferencing, interconnection of gigabit networks with lower-bandwidth links, high-security applications such as electronic commerce, and interactive virtual reality applications. A more serious problem with today's Internet is that it can interconnect a maximum of only about four billion systems, a small number compared to the number of systems projected to be on the Internet in the twenty-first century.
Each machine on the net is given a 32-bit address. With 32 bits, a maximum of about four billion addresses is possible. Though this is a large number, the Internet will soon have TV sets and even pizza machines connected to it, and since each of them must have an IP address, four billion becomes too small. The revision of IPv4 was taken up mainly to resolve this address problem, but in the course of refinement several other features were also added to make it suitable for the next-generation Internet.
This version was initially named IPng (IP next generation) and is now officially known as IPv6. IPv6 uses 128-bit addresses: the source address and the destination address are each 128 bits long. (IPv5, a minor variation of IPv4, is presently running on some routers.) At present, most routers run software that supports only IPv4. Switching over to IPv6 overnight is impossible, and the transition is likely to take a very long time.
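As a rough illustration of the difference in scale between 32-bit and 128-bit addressing, and of how IPv6 addresses are written, here is a short Python sketch using the standard ipaddress module; the addresses shown are documentation-range examples rather than real hosts, and the IPv4-mapped form is just one of the compatibility mechanisms alluded to below.

import ipaddress

# Address-space sizes implied by the bit lengths above (illustrative arithmetic).
print(2 ** 32)    # IPv4: 4,294,967,296 possible addresses
print(2 ** 128)   # IPv6: roughly 3.4 x 10^38 possible addresses

# An IPv6 address is conventionally written as eight 16-bit groups in hexadecimal.
# 2001:db8::/32 is a documentation prefix, so this address is purely an example.
addr = ipaddress.IPv6Address("2001:db8::1")
print(addr.exploded)        # 2001:0db8:0000:0000:0000:0000:0000:0001

# One transition mechanism embeds a 32-bit IPv4 address in the low bits of an
# IPv6 address (an "IPv4-mapped" address), so software can handle both families.
mapped = ipaddress.IPv6Address("::ffff:192.0.2.1")
print(mapped.ipv4_mapped)   # 192.0.2.1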
However, to speed up the transition, an IPv4-compatible IPv6 addressing scheme has been worked out. Major vendors are now writing software for various computing environments to support IPv6 functionality. Incidentally, software development for different operating systems and router platforms will offer major job opportunities in the coming years.

Robotic Surgery
Definition
The field of surgery is entering a time of great change, spurred on by remarkable recent advances in surgical and computer technology. Computer-controlled diagnostic instruments have been used in the operating room for years to help provide vital information through ultrasound, computer-aided tomography (CAT), and other imaging technologies. Only recently have robotic systems made their way into the operating room as dexterity-enhancing surgical assistants and surgical planners, in answer to surgeons' demands for ways to overcome the surgical limitations of minimally invasive laparoscopic surgery.
The robotic surgical system enables surgeons to remove gallbladders and perform other general surgical procedures while seated at a computer console and 3-D video imaging system across the room from the patient. The surgeons operate controls with their hands and fingers to direct a robotically controlled laparoscope. At the end of the laparoscope are advanced articulating surgical instruments and miniature cameras that allow surgeons to peer into the body and perform the procedures.
Now imagine: an army ranger is riddled with shrapnel deep behind enemy lines. Diagnostics from wearable sensors signal a physician at a nearby mobile army surgical hospital that his services are needed urgently. The ranger is loaded into an armored vehicle outfitted with a robotic surgery system. Within minutes, he is undergoing surgery performed by the physician, who is seated at a control console 100 kilometers out of harm's way.
The patient is saved. This is the power that the amalgamation of technology and the surgical sciences is offering doctors.
Just as computers revolutionized the latter half of the 20th century, the field of robotics has the potential to equally alter how we live in the 21st century. We've already seen how robots have changed the manufacturing of cars and other consumer goods by streamlining and speeding up the assembly line.
Big Bang machine passes first test
Cheers echoed around the Cern control room near Geneva on Wednesday as scientists celebrated the first test of the world's most powerful particle accelerator.
It took less than one hour for the inaugural beam of protons to be successfully guided around the 27-kilometre ring housing the Large Hadron Collider (LHC). Scientists hope to recreate conditions just after the so-called Big Bang, 13.7 billion years ago.
Experts say it is the largest scientific experiment in human history and could unlock many secrets of modern physics and answer questions about the universe and its origins. The LHC is also the biggest and most complex machine ever made.
After a series of trial runs, two white dots flashed on a computer screen at 10.36am local time, indicating that the particle beam the size of a human hair had travelled clockwise the full length of the tightly sealed chamber 100 metres beneath the Swiss-French border.
Scientists from the European Organization for Nuclear Research, known by its French acronym as Cern, later sent another beam around the chamber in the opposite direction.
"My first thought was one of relief," Lyn Evans, LHC project leader, told journalists after the beam finished its first lap. "It is a machine of enormous complexity and can go wrong at any time, but this morning has been a great start."
Evans didn't want to set a date but said he expected scientists could conduct collisions for their experiments "within a few months".
"Baby"
"The atmosphere here was absolutely electric," British PhD student Tom Whyntie told swissinfo. "It's unbelievable to think that we're finally here and the beam has gone all the way around."
The collider is designed to push the proton beam close to the speed of light, whizzing 11,000 times a second around the tunnel. If everything goes to plan, two beams will be fired in opposing directions and smashed together in four bus-sized detector chambers, creating showers of new particles that can be analysed by powerful computers.
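Taking the two quoted figures at face value, a quick back-of-the-envelope calculation (sketched in Python below) shows why the beam is said to travel close to the speed of light; both inputs are rounded numbers from the article, not precise machine parameters.

# Rough check of the quoted figures: a 27 km ring and about 11,000 laps per second.
ring_length_km = 27
laps_per_second = 11_000

beam_speed_km_s = ring_length_km * laps_per_second    # ~297,000 km/s
speed_of_light_km_s = 299_792.458

print(beam_speed_km_s)                                # 297000
print(beam_speed_km_s / speed_of_light_km_s)          # ~0.99, i.e. about 99% of the speed of light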
The Cern laboratory will start stepping up the power with the hope of reaching a new threshold of energy by the end of the year. Further increases are planned until the equipment runs at full power, probably by 2010.
"This is just the beginning of a long experiment," physicist Daniel Denegri told swissinfo.
"The baby has just been born, or I should say twins as there are two beams, but they have to grow up," he said. "Our main concern now is to increase the number of protons step by step and ensure we don't damage the machine. But it's like a new car, you don't step in it and start driving at 160km/h."
Black hole fears
Physicists around the world will be watching whether the collisions recreate on a miniature scale the heat and energy of the Big Bang, a theory of the origin of the universe that dominates scientific thinking.
The organisation has repeatedly refuted suggestions by some critics that the experiment could create tiny black holes of intense gravity that could suck in the whole planet.
"It's nonsense," said James Gillies, chief spokesman for Cern.
The organisation is backed by leading scientists such as Britain's Stephen Hawking in dismissing the fears and declaring the experiments to be absolutely safe.
Once the particle-smashing experiment gets to full speed, data measuring the location of particles to a few millionths of a metre, and the passage of time to billionths of a second, will show how the particles come together, fly apart or dissolve.
Higgs boson
It is in these conditions that scientists hope to find fairly quickly a theoretical particle known as the Higgs boson, named after the British scientist Peter Higgs, who first proposed it in 1964. The particle is sometimes called the "God particle" because it is believed to give mass to all other particles and thus to the matter that makes up the universe.
According to Denegri, there is a high chance that Cern will find the elusive particle.
"If we are lucky, it might take a year-and-a-half – or it could be up to five years. If nature chose to create the Higgs boson, we will find it," he said.
The Cern experiments could also reveal more about dark matter and dark energy – the invisible mass and energy believed to make up about 96 per cent of the universe – as well as antimatter and possibly hidden dimensions of space and time.
The SFr6 billion ($5.95 billion) project, first conceived in the early 1980s and organised by the 20 European member nations of Cern, has attracted researchers from 80 nations. Some 1,260 are from the United States, an observer country that contributed $531 million (SFr600 million).
But the high costs and years of hard work are definitely worth it, say the experts.
"Mankind has always been about exploration," Whyntie said. "With particle physics we are not reaching out but looking in to the fundamental building blocks of matter. We're exploring as far and tiny as we can go to understand how our universe works. What is beautiful about the LHC experiment is that it's an attempt to answer some of these fundamental questions that everyone asks themselves".
swissinfo, Simon Bradley in Geneva
Wednesday, September 10, 2008
Dell 2208WFP 22-inch UltraSharp Premium Widescreen LCD Monitor. 1680x1050 resolution. Height adjustable & pivotal stand. 4-port USB Hub. DVI/VGA/HDCP.
Panel Performance
Amazing 1680 x 1050 native resolution, a large 22-inch display, a blazing-fast 5 ms response time, and an incredible 1000:1 contrast ratio let you view images, documents, graphics and video with extreme detail, vivid color and fluid motion.
Pure, Natural Color
The UltraSharp 2208WFP features Dell TrueColor Technology for better color representation, resulting in deeper, more vibrant reds, greens and blues. With a dynamic contrast ratio of 1000:1, blacks are darker and vibrant colors pop off the screen for life-like movies, photos and games.
Wide Screen, Slim Panel
Widescreen means cinema-style viewing and an improved overall multimedia experience. Widescreen gives you more screen real estate so you can fit more on your screen at once, while the flat-panel design takes up very little space on your desk. Whether you are working or playing, wider and flatter truly is better.
The UltraSharp Advantage
UltraSharp monitors are designed to fit just about any situation and give you a comfortable viewing experience. Adjust the height, tilt the panel forward and backward, swivel it left-to-right, even pivot from landscape mode to portrait mode. The UltraSharp 2208WFP adjusts to fit your optimal viewing position. Four convenient USB 2.0 ports help you avoid the hassle of running out of USB ports on your PC or reaching around the back to access them.
The Dell Advantage
Dell monitors are sleek, with ultra-thin bezels, so they look great in any environment. More than just a pretty face, Dell monitors are designed and built to our highest standards, providing the quality and reliability you expect when you see the Dell logo. Each has been exhaustively tested and comes backed by a Dell Limited Warranty, so you can rest assured your investment is protected. The performance of the UltraSharp 2208WFP is one more reason why Dell is the world's number-one source for flat-panel monitors.
IBM Growth Sustained Worldwide
With over half of its revenue (58%) coming from outside the United States, International Business Machines Corporation (NYSE: IBM) has been more insulated from recent weakness in the U.S. economy than many of its peers.
Further, we are encouraged by IBM's focus on more profitable businesses, such as software and services, and by its exit from low-margin businesses. IBM has strengthened its position through strategic investments and acquisitions in emerging higher-value segments like SOA, Information on Demand, business process services, and open modular systems.