Timothy Berners-Lee, born 8 June 1955 in London, England, is among the most famous people in the ICT world. Berners-Lee invented the World Wide Web (WWW), making the first proposal for it in March 1989.
On 25 December 1990, with the help of Robert Cailliau and a young student at CERN (Conseil Européen pour la Recherche Nucléaire), he implemented the first successful communication between an HTTP client and server via the Internet. Berners-Lee's background:
Tim Berners-Lee's mother and father were both mathematicians who were part of the team that programmed Manchester University's Mark I, the world's first commercial stored-program computer, sold by Ferranti Ltd. One day when he was in high school, Berners-Lee found his father writing a speech on computers for Basil de Ferranti. Father and son talked about how the human brain has a unique advantage over computers, since it can connect concepts that aren't already associated. For example, if you are walking and see a nice tree, you might think about how cool the park is under the trees, and then think of your backyard, and then decide to plant a tree for shade behind your house. Young Berners-Lee was left with a powerful impression of the potential for computers to be able to link any two pieces of previously unrelated information.
- 1969 to 1973: Attended Sheen Mount primary school and Emanuel School in London.
- 1973 to 1976: graduated from Queen's College at Oxford University with a first-class degree in physics.
From 1977 he worked for two years as a software engineer with Plessey Telecommunications on distributed systems, message relays, and bar coding.
He then joined D.G. Nash, where he developed a multi-tasking operating system and typesetting software for intelligent printers.
In 1980, he joined CERN as a consultant. While at CERN, he proposed a project based on the concept of hypertext to facilitate sharing and updating information among researchers.
It was because CERN was so large and complex, with thousands of researchers and hundreds of systems, that Berners-Lee developed his first hypertext system to keep track of who worked on which project, what software was associated with which program, and which software ran on which computers. While there, he built a prototype system named ENQUIRE.
Berners-Lee named his first hypertext system Enquire, after an old book he found as a child in his parents' house called Enquire Within upon Everything, which provided a range of household tips and advice. The book fascinated young Tim with the suggestion that it magically contained the answer to any problem in the world. With the building of the Enquire system in 1980, and then the Web ten years later, Berners-Lee has pretty much dedicated his life to making that childhood book real.
1981 to 1984: left CERN and worked at Image Computer Systems Ltd, in Bournemouth, England as Technical Design Lead, with responsibility for real-time, graphics, and communications software for an innovative software program that enabled older dot-matrix printers to print a wide range of advanced graphics.
In 1984, he returned to CERN as a fellow. CERN was the largest Internet node in Europe, and Berners-Lee saw an opportunity to join hypertext with the Internet. He said: "I just had to take the hypertext idea and connect it to the Transmission Control Protocol and domain name system ideas and — ta-da! — the World Wide Web."
In March 1989, he completed a project proposal for a system to communicate information among researchers in the CERN High Energy Physics department, intended to help those having problems sharing information across a wide range of different networks, computers, and countries. The project had two main goals:
In 1990, with the help of Robert Cailliau, he produced a revision that was accepted by his manager, Mike Sendall. He used similar ideas to those underlying the ENQUIRE system to create the World Wide Web, for which he designed and built the first Web browser, which also functioned as an editor (WorldWideWeb, running on the NeXTSTEP operating system), and the first Web server (info.cern.ch), CERN HTTPd (short for HyperText Transfer Protocol daemon).
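The client-server exchange described above is still the basic shape of the Web today. The sketch below is purely illustrative (it is not CERN httpd, and the page content is invented): a tiny HTTP server and a client request, using only Python's standard library.

```python
# Illustrative sketch of the HTTP request/response model, using only
# Python's standard library. Not a reconstruction of CERN httpd.
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

class HelloHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"<h1>Hello, Web</h1>"           # hypothetical page content
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), HelloHandler)  # port 0 = any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# The "browser" side: one GET request, one HTML response.
with urlopen(f"http://127.0.0.1:{server.server_port}/") as resp:
    page = resp.read().decode()

server.shutdown()
print(page)
```

The browser and server here are just the two halves of the conversation Berners-Lee's 1990 software carried out for the first time.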
The first project Berners-Lee and Cailliau tackled was to put the CERN telephone book on the Web, making the project immediately useful and gaining it rapid acceptance. Some CERN staff started keeping one window open on their computer at all times just to access the telephone web page.
CERN had been connected to the ARPANET through EUnet since 1990. In August 1991, Tim posted a notice on the alt.hypertext newsgroup about where to download the web server and line-mode browser, making them available around the world. Web servers started popping up around the globe almost immediately. An official Usenet newsgroup, comp.infosystems.www, was soon established to share information.
Berners-Lee then added support for the FTP protocol to the server, making a wide range of existing FTP directories and Usenet newsgroups immediately accessible through a web page. He also added a telnet server on info.cern.ch, making a simple line browser available to anyone with a telnet client. The first public demonstration of the web server was given at the Hypertext 91 conference. Development of this web server, which came to be called CERN httpd, would continue until July 1996.
On 6 August 1991, the first Web site, built at CERN, was put on line. It explained what the World Wide Web was and how one could use a browser and set up a Web server.
In June 1992, CERN sent Berners-Lee on a three-month trip through the United States. First he visited MIT's Laboratory for Computer Science, then went to an IETF conference in Boston, then visited Xerox PARC in Palo Alto, California. At the end of this trip he visited an old friend, Ted Nelson, then living on a houseboat in Sausalito. Interestingly, Nelson had experience with film making, Berners-Lee had experience working with lighting and audiovisual equipment in amateur theater, and Tom Bruce, who developed the first PC web browser, Cello, also worked professionally as a stage manager in the theater.
On 30 April 1993, CERN released the web technology and program code into the public domain so that anyone could use and improve them.
In 1994, Berners-Lee founded the World Wide Web Consortium (W3C) at MIT. It comprised various companies that were willing to create standards and recommendations to improve the quality of the Web. Berners-Lee made his idea available freely, with no patent and no royalties due. The World Wide Web Consortium decided that its standards should be based on royalty-free technology, so that they could easily be adopted by anyone.
In 2001, Berners-Lee became a patron of the East Dorset Heritage Trust, having previously lived in Colehill in Wimborne, East Dorset, England.
In December 2004, he accepted a chair in Computer Science at the School of Electronics and Computer Science, University of Southampton, England, to work on his new project, the Semantic Web.
In June 2009, Prime Minister Gordon Brown announced that Berners-Lee would work with the UK Government to help make data more open and accessible on the Web, building on the work of the Power of Information Task Force.
He was also one of the pioneering voices in favour of Net Neutrality, and has expressed the view that ISPs should supply "connectivity with no strings attached" and should neither control nor monitor customers' browsing activities without their express consent.
I can't wait to have a 10Mb link in my house, to be able to download a 1GB file in about 10 minutes, and to be able to call my sister in the USA without a buffering signal.
I can't wait to watch some of my favorite series, like 24 and Boston Legal, online in HD quality.
Just imagine: with this marine cable you will be able to access your favorite sites, the ones you know are really heavy and so annoying to load that you actually have to keep yourself busy by making a cup of coffee or taking a bath for them to fully display. Sites like YouTube, OMG, CNN, and BBC will load in less than a second.
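The wish list above turns on the difference between megabits (how links are sold) and megabytes (how files are measured). A quick back-of-the-envelope helper, with illustrative numbers only, shows roughly what a 10Mb/s link actually delivers:

```python
# Rough download-time arithmetic. Link speeds are quoted in megabits
# per second; file sizes in (1024-based) gigabytes. Protocol overhead
# is ignored, so real downloads are somewhat slower.
def download_time_seconds(file_size_gb: float, link_mbit_per_s: float) -> float:
    """Time to move a file over a link, ignoring overhead."""
    size_bits = file_size_gb * 1024 * 1024 * 1024 * 8
    return size_bits / (link_mbit_per_s * 1_000_000)

t = download_time_seconds(1, 10)  # a 1GB file on a 10Mb/s link
print(round(t / 60, 1), "minutes")
```

At 10 megabits per second a 1GB file takes closer to a quarter of an hour than to ten minutes, but it is the right ballpark, and a world away from dial-up speeds.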
It's true: Africa is born again from the slow speeds. This has been made possible by three companies that have installed, or are still installing, marine cables across the Indian Ocean connecting East Africa to the rest of the world through India.
First company: SEACOM launched its services in Africa on July 23, 2009.
It announced that its 1.28 terabits per second (Tb/s), 17,000-kilometre submarine fibre-optic cable system linking southern and eastern Africa to global networks via India and Europe had been completed and commissioned.
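A cable that long also sets a floor on latency: light in optical fibre travels at roughly two-thirds of the speed of light in vacuum (an approximation; the exact figure depends on the fibre). A quick estimate for the quoted 17,000 km route:

```python
# Back-of-the-envelope one-way propagation delay for a fibre route.
# The 2/3 speed factor is a common approximation for optical fibre.
SPEED_OF_LIGHT_KM_S = 300_000
FIBRE_FACTOR = 2 / 3

def one_way_delay_ms(route_km: float) -> float:
    return route_km / (SPEED_OF_LIGHT_KM_S * FIBRE_FACTOR) * 1000

print(round(one_way_delay_ms(17_000)), "ms one way")  # ~85 ms
```

So even before any routing or congestion, a round trip over the full route costs on the order of 170 ms, still dramatically better than satellite links.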
The backhauls linking Johannesburg, Nairobi and Kampala with the coastal landing stations were established.
SEACOM promised unprecedented opportunities at a fraction of the current cost (did you hear that? A fraction of today's cost), as governments, business leaders, and citizens will be able to compete globally, using the network as the platform to drive economic growth and enhance the quality of life across the continent.
Second company: EASSy (Eastern Africa Submarine Cable System) is also in the final stages of delivering its marine cable to Africa. The project is funded by eastern African telecom companies and governments; countries such as South Africa, Mozambique, Madagascar, Tanzania, Kenya, Uganda, Burundi, Rwanda, Botswana, Somalia, Sudan, Mauritius, Zambia, the Comoros Islands, and Djibouti are driving it. We hope to have it fully up and running in less than 10 months (around mid-2010).
Third company: TEAMS (The East African Marine System) was spearheaded by the government of Kenya to link the country to the rest of the world through a submarine fibre-optic cable. This, of course, came as a result of Kenya thinking South Africa wanted to control the EASSy link. So far the first phase has been completed, according to E-marine. The cable, with an estimated length of 4,900 km, will connect East Africa to the rest of the world through the U.A.E.
By the end of 2010, Africa will have over 100Gb/s of bandwidth, which means we shall be in a position to have at least 1Mb/s in our offices and homes, and we will be that much more competitive when it comes to online business. Let us all keep our fingers crossed, because the best is yet to come.
Sony today announced two new Cyber-shot(R) cameras (DSC-TX1 and DSC-WX1 models) that provide unprecedented advances in low-light performance with approximately twice the sensitivity of cameras with traditional image sensors.
These Cyber-shot cameras are the first to employ Sony's new "Exmor R" back illuminated CMOS sensor technology to improve shooting in low-light scenarios, enhancing image clarity and drastically reducing grain.
"With these new "Exmor R" CMOS sensor cameras, Sony has vastly improved the customer experience for taking pictures with digital still cameras in low-light scenarios," said Phil Lubell, director of the digital imaging business at Sony Electronics. "We've all taken pictures in dimly lit situations, like blowing out candles on a birthday cake, and the results were grainy and unclear. By redesigning the way these cameras capture light, Sony is leading the industry by creating this easy way to take amazingly clear, vibrant photos in low lighting scenarios."
"Exmor R" Sensor Optimizes Low-Light Performance
Conventional image sensor architecture has required wires and other circuit elements to be positioned above the light-sensitive photo-diodes, limiting the imager's light-gathering capability. By positioning these elements behind the photo-diodes, Sony's "Exmor R" image sensors can gather more light, resulting in approximately twice the sensitivity compared to conventional sensors.
To further extend low-light shooting performance, the TX1 and WX1 cameras incorporate the hand-held twilight and anti-motion blur multi-shot modes introduced in Sony's breakthrough Cyber-shot DSC-HX1. Using "Exmor R" CMOS sensor's high speed, these modes capture six separate images in less than a second and utilize Sony's BIONZ(TM) processor to combine the shots into a single image of extraordinary detail and low noise.
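The multi-shot modes rest on a general principle: averaging N noisy frames of the same scene reduces random noise by roughly the square root of N. The toy simulation below illustrates that principle for a single pixel; the numbers are invented, and this is in no way Sony's actual BIONZ pipeline.

```python
# Toy illustration of multi-frame noise reduction: averaging six noisy
# readings of one pixel shrinks the random error versus a single shot.
# All figures are illustrative, not measurements of any real sensor.
import random
import statistics

random.seed(42)
TRUE_PIXEL = 100.0   # the "real" brightness of one pixel
NOISE_SIGMA = 10.0   # per-frame sensor noise (hypothetical)
FRAMES = 6           # the cameras combine six shots

def noisy_reading():
    return TRUE_PIXEL + random.gauss(0, NOISE_SIGMA)

# Compare single-frame error with six-frame-average error over many trials.
single_errors = [abs(noisy_reading() - TRUE_PIXEL) for _ in range(2000)]
averaged_errors = [
    abs(statistics.mean(noisy_reading() for _ in range(FRAMES)) - TRUE_PIXEL)
    for _ in range(2000)
]

print(statistics.mean(single_errors), statistics.mean(averaged_errors))
```

The averaged error comes out close to 1/sqrt(6) of the single-shot error, which is why six fast exposures can stand in for one long, blur-prone one.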
Combining the "Exmor R" technology with hand-held twilight and anti-motion blur modes delivers a breakthrough in low-light photography. Users can now capture images of stunning detail and low noise in scenes with no more than candlelight--without flash or the need of a tripod.
Innovative Sweep Panorama(TM) and High-Speed Shooting
In addition to their breakthrough low light performance, these new cameras also include Sony's Sweep Panorama and 10 frames per second burst shooting features, which were introduced with the Sony(R) DSC-HX1 camera. The TX1 and WX1 cameras offer these features in smaller, more compact bodies that match nearly any unique style.
Capturing wide landscapes is as easy as "press and sweep." Sweep Panorama mode lets you reach beyond the traditional wide-angle lens and capture breathtaking shots. Using the high-speed "Exmor R" CMOS sensor, the cameras shoot continuously while you sweep across the scene. Using the BIONZ imaging processor, they automatically stitch the pictures together to create one stunning panoramic photo.
The TX1 and WX1 Cyber-shot models can take up to 185 and 256-degree panorama shots respectively in one easy press-and-sweep motion with an image size of 7152 x 1080 (ultra wide horizontal).
Advanced Technology and Compact Design
While the HX1 camera is a well-rounded solution for customers who are looking for high-zoom and speed in a smaller size than a DSLR, the TX1 and WX1 cameras are made for an audience that wants advanced technology in an even more compact design.
With its slim profile of just 16.5mm, the 10.2 mega-pixel TX1 offers streamlined, distinguished curves for a sophisticated look appealing to the fashion-oriented who are also looking for great performance. This model features a new operation on the touch panel that lets you scroll through images with an effortless "flick" of your finger and directly access menus on the 3-inch Clear Photo LCD Plus(TM) display.
With a Carl Zeiss(R) Vario-Tessar(R) lens, the TX1 camera lets you focus as little as 0.4 inches from your subject for extraordinary close-up shots. The 4x telescopic zoom is perfect for capturing far-away subjects, and Sony's Optical SteadyShot(TM) image stabilization helps overcome camera shake.
The 10.2 mega-pixel WX1 camera has a 2.7-inch Clear Photo LCD Plus display and is just over three quarters of an inch thin--an ideal choice for DSLR owners who also want to carry a compact, high performance digital still camera.
The WX1 camera features a Sony G lens with an extraordinary wide angle 24-120mm 5x optical zoom. This lens' f/2.4 maximum aperture offers nearly twice the light gathering capability of conventional lenses, and works together with the "Exmor R" imager and low-light shooting modes to provide low-light photography beyond the abilities of other compact cameras.
Tech Savvy Cameras
These cameras include the most recent Sony technology: Intelligent Auto (iAuto) mode, which recognizes scenes, lighting conditions, and faces and adjusts settings for clearer images, more natural skin tones, and less blur; Face Detection, which detects up to eight faces and optimizes focus, flash, exposure, and white balance; and intelligent Scene (iSCN), which delivers nine Scene Selection modes to quickly adjust for specific shooting conditions.
Pet mode is a new Sony feature that minimizes blur when shooting moving pets. This new mode also reduces glowing pet red-eye.
Additionally, the cameras have technologies Sony Cyber-shot customers have come to expect. These include Smile Shutter(TM) technology that automatically captures a smile, dynamic range optimization (DRO) that improves exposure and contrast, intelligent Auto Focus that captures fleeting moments and HD video capability that records HD movies in 720p high definition MPEG4 format.
With HD video capability, these cameras record HD movies in 720p high definition MPEG4 format for stunning large-screen home movie playback. You can record up to 29 minutes (or up to 2GB file size) in 720p format.
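The two recording limits quoted above (29 minutes or 2GB, whichever comes first) imply an average bitrate, which determines which limit you actually hit. A rough check, assuming 1GB = 1024MB:

```python
# Sanity check on the recording limits: if a full 2GB file lasted the
# full 29 minutes, what average bitrate would that be?
def avg_bitrate_mbps(size_gb: float, minutes: float) -> float:
    return size_gb * 1024 * 8 / (minutes * 60)  # GB -> megabits, / seconds

rate = avg_bitrate_mbps(2, 29)
print(round(rate, 1), "Mb/s average")
```

Anything compressed much above that average rate runs into the 2GB file-size cap before the 29-minute clock does.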
Pricing and Availability
The TX1 camera will be available in silver, gray, pink and blue this September for about $380. The WX1 camera will be available in black this October for about $350. Pre-sales will start in August. The cameras and a range of accessories will be available online at sonystyle.com, at Sony Style(R) retail stores (www.sonystyle.com/retail), at military base exchanges and at authorized dealers nationwide.
A submarine communications cable is a cable laid beneath the sea to carry telecommunications between countries, continents, and islands.
The first submarine communications cables carried telegraphy traffic. Subsequent generations of cables carried first telephony traffic, then data communications traffic. All modern cables use optical fiber technology to carry digital payloads, which are then used to carry telephone traffic as well as Internet and private data traffic. They are typically 69 millimetres (2.7 in) in diameter and weigh around 10 kilograms per meter (7 lb/ft), although thinner and lighter cables are used for deep-water sections.
As of 2003, submarine cables link all the world's continents except Antarctica.
After William Cooke and Charles Wheatstone had introduced their working telegraph in 1839, the idea of a submarine line across the Atlantic Ocean began to be thought of as a possible triumph of the future. Samuel Morse proclaimed his faith in it as early as the year 1840, and in 1842 he submerged a wire, insulated with tarred hemp and India rubber, in the water of New York harbour, and telegraphed through it. The following autumn Wheatstone performed a similar experiment in Swansea bay. A good insulator to cover the wire and prevent the electric current from leaking into the water was necessary for the success of a long submarine line. India rubber had been tried by Moritz von Jacobi, the Prussian electrical engineer, as far back as the early 1800s.
Another insulating gum which could be melted by heat and readily applied to wire made its appearance in 1842. Gutta-percha, the adhesive juice of the Palaquium gutta tree, was introduced to Europe by William Montgomerie, a Scottish surgeon in the service of the British East India Company. Twenty years earlier he had seen whips made of it in Singapore, and he believed that it would be useful in the fabrication of surgical apparatuses. Michael Faraday and Wheatstone soon discovered the merits of gutta-percha as an insulator, and in 1845 the latter suggested that it should be employed to cover the wire which was proposed to be laid from Dover to Calais. It was tried on a wire laid across the Rhine between Deutz and Cologne. In 1849 C.V. Walker, electrician to the South Eastern Railway, submerged a wire coated with it, or, as it is technically called, a gutta-percha core, along the coast off Dover.
The first commercial cables
In August 1850, John Watkins Brett's Anglo-French Telegraph Company laid the first line across the English Channel. It was simply a copper wire coated with gutta-percha, without any other protection. The experiment served to keep alive the concession, and the next year, on November 13, 1851, a protected core, or true cable, was laid from a government hulk, the Blazer, which was towed across the Channel. The next year, Great Britain and Ireland were linked together. In 1852, a cable laid by the Submarine Telegraph Company linked London to Paris for the first time. In May, 1853, England was joined to the Netherlands by a cable across the North Sea, from Orford Ness to The Hague. It was laid by the Monarch, a paddle steamer which had been fitted for the work.
Transatlantic telegraph cable
The first attempt at laying a transatlantic telegraph cable was promoted by Cyrus West Field, who persuaded British industrialists to fund and lay one in 1858. However, the technology of the day was not capable of supporting the project; it was plagued with problems from the outset and was in operation for only a month. Subsequent attempts in 1865 and 1866 with the world's largest steamship, the SS Great Eastern, used a more advanced technology and produced the first successful transatlantic cable. The Great Eastern later went on to lay the first cable reaching to India from Aden, Yemen, in 1870.
Submarine cable across the Pacific
This was completed in 1902–03, linking the US mainland to Hawaii in 1902 and Guam to the Philippines in 1903. Canada, Australia, New Zealand and Fiji were also linked in 1902.
The North Pacific Cable system was the first regenerative (repeatered) system to completely cross the Pacific from the US mainland to Japan. The US portion of NPC was manufactured in Portland, Oregon, from 1989 to 1991 by STC Submarine Systems, later Alcatel Submarine Networks. The system was laid by Cable & Wireless Marine on the CS Cable Venture in 1991.
Transatlantic cables of the 19th century consisted of an outer layer of iron and later steel wire, wrapping India rubber, wrapping gutta-percha, which surrounded a multi-stranded copper wire at the core. The portions closest to each shore landing had additional protective armor wires. Gutta-percha, a natural polymer similar to rubber, had nearly ideal properties for insulating submarine cables, with the exception of a rather high dielectric constant which made cable capacitance high. Gutta-percha was not replaced as a cable insulation until polyethylene was introduced in the 1930s. In the 1920s, the American military experimented with rubber-insulated cables as an alternative to gutta-percha, since American interests controlled significant supplies of rubber but no gutta-percha manufacturers.
Early long-distance submarine telegraph cables exhibited formidable electrical problems. Unlike modern cables, the technology of the 19th century did not allow for in-line repeater amplifiers in the cable. Large voltages were used to attempt to overcome the electrical resistance of their tremendous length but the cables' distributed capacitance and inductance combined to distort the telegraph pulses in the line, severely limiting the data rate for telegraph operation. Thus, the cables had very limited bandwidth.
As early as 1823, Francis Ronalds had observed that electric signals were retarded in passing through an insulated wire or core laid underground, and the same effect was noticed by Latimer Clark (1853) on cores immersed in water, and particularly on the lengthy cable between England and The Hague. Michael Faraday showed that the effect was caused by capacitance between the wire and the earth (or water) surrounding it. Faraday had noted that when a wire is charged from a battery (for example when pressing a telegraph key), the electric charge in the wire induces an opposite charge in the water as it travels along. As the two charges attract each other, the exciting charge is retarded. The core acts as a capacitor distributed along the length of the cable, which, coupled with the resistance and inductance of the cable, limits the speed at which a signal travels through the conductor of the cable.
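Thomson's later analysis of this distributed-capacitor picture gives what became known as the KR law: for a cable with resistance and capacitance per unit length, the characteristic signal delay grows with the square of the length, so doubling a cable quadruples its sluggishness. The sketch below illustrates that scaling; the per-kilometre figures are purely illustrative, not measurements of any historical cable.

```python
# The KR-law scaling: delay ~ (resistance/km) * (capacitance/km) * L^2.
# Per-km values below are illustrative placeholders, not real cable data.
def rc_delay(r_per_km: float, c_per_km: float, length_km: float) -> float:
    """Characteristic RC delay of a repeaterless cable (seconds)."""
    return r_per_km * c_per_km * length_km ** 2

short_cable = rc_delay(r_per_km=3.0, c_per_km=2e-7, length_km=1000)
long_cable = rc_delay(r_per_km=3.0, c_per_km=2e-7, length_km=2000)
print(long_cable / short_cable)  # ~4: twice the length, four times the delay
```

This quadratic penalty is why a transatlantic cable was so much harder than the short Channel crossings that preceded it, and why operators were reduced to tapping out only a few words per minute on the 1858 cable.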
Early cable designs failed to analyze these effects correctly. Famously, E.O.W. Whitehouse had dismissed the problems and insisted that a transatlantic cable was feasible. When he subsequently became electrician of the Atlantic Telegraph Company he became involved in a public dispute with William Thomson. Whitehouse believed that, with enough voltage, any cable could be driven. Because of the excessive voltages recommended by Whitehouse, Cyrus West Field's first transatlantic cable never worked reliably, and eventually short circuited to the ocean when Whitehouse increased the voltage beyond the cable design limit.
Thomson designed a complex electric-field generator that minimized current by resonating the cable, and a sensitive light-beam mirror galvanometer for detecting the faint telegraph signals. Thomson became wealthy on the royalties of these, and several related inventions. Thomson was elevated to Lord Kelvin for his contributions in this area, chiefly an accurate mathematical model of the cable, which permitted design of the equipment for accurate telegraphy. The effects of atmospheric electricity and the geomagnetic field on submarine cables also motivated many of the early polar expeditions.
Thomson had produced a mathematical analysis of propagation of electrical signals into telegraph cables based on their capacitance and resistance, but since long submarine cables operated at slow rates, he did not include the effects of inductance. By the 1890s, Oliver Heaviside had produced the modern general form of the telegrapher's equations which included the effects of inductance and which were essential to extending the theory of transmission lines to higher frequencies required for high-speed data and voice.
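Heaviside's general form referred to above is standard and can be stated compactly. With per-unit-length resistance R, inductance L, capacitance C, and leakage conductance G, the telegrapher's equations relate voltage V and current I along the cable:

```latex
% Telegrapher's equations (per-unit-length R, L, C, G):
\begin{aligned}
\frac{\partial V}{\partial x} &= -R\,I - L\,\frac{\partial I}{\partial t},\\[4pt]
\frac{\partial I}{\partial x} &= -G\,V - C\,\frac{\partial V}{\partial t}.
\end{aligned}
```

Thomson's earlier analysis is the special case L = G = 0, which reduces the pair to a diffusion equation in V; restoring the inductance term is exactly what was needed for the higher-frequency signalling Heaviside had in view.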
Toshiba announced today that it has developed a new SDXC card providing up to 64GB of memory.
According to the Japanese manufacturer, the first samples will be available in November, while the official launch is scheduled for spring 2010. The new memory card, featuring 64GB of storage space with the exFAT file system, will provide a write speed of 35MB/s and a read speed of 60MB/s.
Besides this SDXC card, Toshiba also unveiled new FAT32 32GB and 16GB SDHC cards, both providing high-speed transfer rates (write speed of 35MB/s and read speed of 60MB/s). These two cards will also be available in spring 2010.
TOKYO and IRVINE, Calif., Aug. 3 -- Toshiba Corporation, a leading innovator in NAND flash memory technologies and solutions, and Toshiba America Electronic Components, Inc. (TAEC), a North American subsidiary, today announced the launch of the world's first 64GB(1) SDXC Memory Card(2) capable of operating at the world's fastest data transfer rate(3) for reading and writing to a flash memory card. The new card is compliant with the new SD Memory Standard, Ver. 3.00, UHS104. Toshiba also extended its industry leadership in memory card solutions by unveiling 32GB and 16GB SDHC Memory Cards compliant with the world's fastest data transfer rate. Samples of the new SDXC Memory Cards will be available this November, and samples of the new SDHC Memory Cards will be available in December.
The new SDXC and SDHC Memory Cards are the world's first memory cards compliant with the SD Memory Card Standard Version 3.00, UHS104, which brings a new level of ultra-fast read and write speeds to NAND flash based memory cards: a maximum write speed of 35MB(4) per second, and a read speed of 60MB per second. The combination of large storage capacities and increased data transfer rates will meet the needs of a wide range of consumer electronics applications such as digital still cameras and digital camcorders that require high bandwidth data communication. For example, digital SLR cameras will be able to shoot longer continuous bursts in the highest quality RAW format. Similarly, with these cards, it will be possible to download a 2.4GB video in only 70 seconds.
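The press release's 70-second figure for a 2.4GB video is easy to verify; it matches the 35MB/s write speed (the card is the destination of the download), taking 1GB as 1024MB:

```python
# Verify the quoted example: 2.4GB written to the card at 35MB/s.
def transfer_seconds(size_gb: float, rate_mb_per_s: float) -> float:
    return size_gb * 1024 / rate_mb_per_s

t = transfer_seconds(2.4, 35)
print(round(t), "seconds")  # ~70 seconds, matching the press release
```

At the 60MB/s read speed, pulling the same file off the card would take about 41 seconds.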
The SDXC card is the next-generation standard defined by the SD Association in January 2009. The new standard applies to cards with capacities over 32GB and up to 2TB, compared to the SDHC standard, which applies to cards with capacities over 2GB and up to 32GB.
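The capacity bands just described partition the SD family cleanly, which a small classifier makes explicit (boundaries follow the text: SD up to 2GB, SDHC over 2GB up to 32GB, SDXC over 32GB up to 2TB):

```python
# Map a card capacity (in GB) to its SD Association standard, using
# the capacity bands described in the text.
def sd_standard(capacity_gb: float) -> str:
    if capacity_gb <= 0:
        raise ValueError("capacity must be positive")
    if capacity_gb <= 2:
        return "SD"
    if capacity_gb <= 32:
        return "SDHC"
    if capacity_gb <= 2048:  # 2TB ceiling of the SDXC standard
        return "SDXC"
    raise ValueError("beyond the SDXC standard")

print(sd_standard(64))  # the new Toshiba card falls in the SDXC band
```

This is why Toshiba's 64GB card required the new Ver. 3.00 standard at all: it simply does not fit in the SDHC band.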
The high level specifications and wide range of memory cards announced by Toshiba will further open the way for developers to bring applications to future generations of consumer products. By further enhancing its SD Memory Card lineups with larger capacity and a higher data transfer rate, Toshiba will continue to meet market demand and to lead the NAND flash memory market.
(1) The SDXC Memory Card realizes a 64GB capacity, the largest yet available on the market.
(2) Supports UHS104, a new ultra-high-speed interface in the new SD Memory Card Standard Ver. 3.00, which provides a 104MB per second bus speed on the SD interface and realizes a maximum write speed of 35MB per second with a read speed of 60MB per second.
(3) UHS104 retains the conventional SD interfaces, 3.3V DS (25MHz) and HS (50MHz), and new SDHC Memory Cards with UHS104 are interoperable with existing SDHC host devices.
(4) Integrates highly secure CPRM copy protection technology.