
A submarine communications cable is a cable laid beneath the sea to carry telecommunications traffic between countries, continents, and islands.

The first submarine communications cables carried telegraphy traffic. Subsequent generations of cables carried first telephony traffic, then data communications traffic. All modern cables use optical fiber technology to carry digital payloads, which are then used to carry telephone traffic as well as Internet and private data traffic. They are typically 69 millimetres (2.7 in) in diameter and weigh around 10 kilograms per metre (7 lb/ft), although thinner and lighter cables are used for deep-water sections.

 

As of 2003, submarine cables link all the world's continents except Antarctica.

Trials
After William Cooke and Charles Wheatstone had introduced their working telegraph in 1839, the idea of a submarine line across the Atlantic Ocean began to be thought of as a possible triumph of the future. Samuel Morse proclaimed his faith in it as early as the year 1840, and in 1842 he submerged a wire, insulated with tarred hemp and India rubber,[2][3] in the water of New York harbour, and telegraphed through it. The following autumn Wheatstone performed a similar experiment in Swansea bay. A good insulator to cover the wire and prevent the electric current from leaking into the water was necessary for the success of a long submarine line. India rubber had been tried by Moritz von Jacobi, the Prussian electrical engineer, as far back as the early 1800s.

Submarine cables are laid using special cable layer ships.

Another insulating gum which could be melted by heat and readily applied to wire made its appearance in 1842. Gutta-percha, the adhesive juice of the Palaquium gutta tree, was introduced to Europe by William Montgomerie, a Scottish surgeon in the service of the British East India Company. Twenty years earlier he had seen whips made of it in Singapore, and he believed that it would be useful in the fabrication of surgical apparatuses. Michael Faraday and Wheatstone soon discovered the merits of gutta-percha as an insulator, and in 1845 the latter suggested that it should be employed to cover the wire which was proposed to be laid from Dover to Calais. It was tried on a wire laid across the Rhine between Deutz and Cologne. In 1849 C.V. Walker, electrician to the South Eastern Railway, submerged a wire coated with it, or, as it is technically called, a gutta-percha core, along the coast off Dover.

The first commercial cables
In August 1850, John Watkins Brett's Anglo-French Telegraph Company laid the first line across the English Channel. It was simply a copper wire coated with gutta-percha, without any other protection. The experiment served to keep alive the concession, and the next year, on November 13, 1851, a protected core, or true cable, was laid from a government hulk, the Blazer, which was towed across the Channel. The next year, Great Britain and Ireland were linked together. In 1852, a cable laid by the Submarine Telegraph Company linked London to Paris for the first time. In May, 1853, England was joined to the Netherlands by a cable across the North Sea, from Orford Ness to The Hague. It was laid by the Monarch, a paddle steamer which had been fitted for the work.

Transatlantic telegraph cable
The first attempt at laying a transatlantic telegraph cable was promoted by Cyrus West Field, who persuaded British industrialists to fund and lay one in 1858. However, the technology of the day was not capable of supporting the project; it was plagued with problems from the outset and was in operation for only a month. Subsequent attempts in 1865 and 1866 with the world's largest steamship, the SS Great Eastern, used a more advanced technology and produced the first successful transatlantic cable. The Great Eastern later went on to lay the first cable from Aden, Yemen, to India in 1870.

Submarine cable across the Pacific
This was completed in 1902–03, linking the US mainland to Hawaii in 1902 and Guam to the Philippines in 1903. Canada, Australia, New Zealand and Fiji were also linked in 1902.

The North Pacific Cable system was the first regenerative (repeatered) system to completely cross the Pacific from the US mainland to Japan. The US portion of NPC was manufactured in Portland, Oregon, from 1989–1991 at STC Submarine Systems, and later Alcatel Submarine Networks. The system was laid by Cable & Wireless Marine on the CS Cable Venture in 1991.

Construction
Transatlantic cables of the 19th century consisted of an outer layer of iron and later steel wire, wrapping India rubber, wrapping gutta-percha, which surrounded a multi-stranded copper wire at the core. The portions closest to each shore landing had additional protective armor wires. Gutta-percha, a natural polymer similar to rubber, had nearly ideal properties for insulating submarine cables, with the exception of a rather high dielectric constant which made cable capacitance high. Gutta-percha was not replaced as a cable insulation until polyethylene was introduced in the 1930s. In the 1920s, the American military experimented with rubber-insulated cables as an alternative to gutta-percha, since American interests controlled significant supplies of rubber but no gutta-percha manufacturers.

Bandwidth problems
Early long-distance submarine telegraph cables exhibited formidable electrical problems. Unlike modern cables, the technology of the 19th century did not allow for in-line repeater amplifiers in the cable. Large voltages were used to attempt to overcome the electrical resistance of their tremendous length but the cables' distributed capacitance and inductance combined to distort the telegraph pulses in the line, severely limiting the data rate for telegraph operation. Thus, the cables had very limited bandwidth.

As early as 1823, Francis Ronalds had observed that electric signals were retarded in passing through an insulated wire or core laid underground, and the same effect was noticed by Latimer Clark (1853) on cores immersed in water, and particularly on the lengthy cable between England and The Hague. Michael Faraday showed that the effect was caused by capacitance between the wire and the earth (or water) surrounding it. Faraday had noted that when a wire is charged from a battery (for example when pressing a telegraph key), the electric charge in the wire induces an opposite charge in the water as it travels along. As the two charges attract each other, the exciting charge is retarded. The core acts as a capacitor distributed along the length of the cable which, coupled with the resistance and inductance of the cable, limits the speed at which a signal travels through the conductor of the cable.
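
To see why this retardation grows so quickly with distance, a simplified model (essentially the analysis Thomson later made, neglecting inductance and leakage) treats the line as a series resistance R and shunt capacitance C per unit length; the voltage then obeys a diffusion-type equation, and the arrival time of a pulse scales with the square of the cable length:

\[
\frac{\partial v}{\partial t} = \frac{1}{RC}\,\frac{\partial^{2} v}{\partial x^{2}},
\qquad
t_{\text{retardation}} \propto R\,C\,\ell^{2}
\]

This "law of squares" is why a cable spanning the Atlantic behaved so much worse than the short cross-Channel and North Sea lines.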

Early cable designs failed to analyze these effects correctly. Famously, E.O.W. Whitehouse had dismissed the problems and insisted that a transatlantic cable was feasible. When he subsequently became electrician of the Atlantic Telegraph Company he became involved in a public dispute with William Thomson. Whitehouse believed that, with enough voltage, any cable could be driven. Because of the excessive voltages recommended by Whitehouse, Cyrus West Field's first transatlantic cable never worked reliably, and eventually short circuited to the ocean when Whitehouse increased the voltage beyond the cable design limit.

Thomson designed a complex electric-field generator that minimized current by resonating the cable, and a sensitive light-beam mirror galvanometer for detecting the faint telegraph signals. Thomson became wealthy on the royalties of these, and several related inventions. Thomson was elevated to Lord Kelvin for his contributions in this area, chiefly an accurate mathematical model of the cable, which permitted design of the equipment for accurate telegraphy. The effects of atmospheric electricity and the geomagnetic field on submarine cables also motivated many of the early polar expeditions.

Thomson had produced a mathematical analysis of propagation of electrical signals into telegraph cables based on their capacitance and resistance, but since long submarine cables operated at slow rates, he did not include the effects of inductance. By the 1890s, Oliver Heaviside had produced the modern general form of the telegrapher's equations which included the effects of inductance and which were essential to extending the theory of transmission lines to higher frequencies required for high-speed data and voice.
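
For reference, the general form Heaviside arrived at relates the voltage v and current i along the line through the series resistance R and inductance L, and the shunt conductance G and capacitance C, all per unit length:

\[
\frac{\partial v}{\partial x} = -R\,i - L\,\frac{\partial i}{\partial t},
\qquad
\frac{\partial i}{\partial x} = -G\,v - C\,\frac{\partial v}{\partial t}
\]

Thomson's earlier analysis is the special case L = G = 0, which is adequate at telegraph speeds but not at the higher frequencies needed for voice and fast data.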

Source: Wikipedia

Toshiba announced today that it has developed an all-new SDXC card offering up to 64GB of memory.
According to the Japanese manufacturer, the first samples will be available in November, while the official launch is scheduled for spring 2010. The new memory card, featuring 64GB of storage space and the exFAT file system, will provide a write speed of 35MB/s and a read speed of 60MB/s.

 

Besides this SDXC card, Toshiba also unveiled new FAT32 32GB and 16GB SDHC cards, both providing high-speed transfer rates (a write speed of 35MB/s and a read speed of 60MB/s). These two cards will also be available in spring 2010.

TOKYO and IRVINE, Calif., Aug. 3 -- Toshiba Corporation, a leading innovator in NAND flash memory technologies and solutions, and Toshiba America Electronic Components, Inc. (TAEC), a North American subsidiary, today announced the launch of the world's first 64GB(1) SDXC Memory Card(2) capable of operating at the world's fastest data transfer rate(3) for reading and writing to a flash memory card. The new card is compliant with the new SD Memory Standard, Ver. 3.00, UHS104. Toshiba also extended its industry leadership in memory card solutions by unveiling 32GB and 16GB SDHC Memory Cards compliant with the world's fastest data transfer rate. Samples of the new SDXC Memory Cards will be available this November, and samples of the new SDHC Memory Cards will be available in December.

The new SDXC and SDHC Memory Cards are the world's first memory cards compliant with the SD Memory Card Standard Version 3.00, UHS104, which brings a new level of ultra-fast read and write speeds to NAND flash based memory cards: a maximum write speed of 35MB(4) per second, and a read speed of 60MB per second. The combination of large storage capacities and increased data transfer rates will meet the needs of a wide range of consumer electronics applications such as digital still cameras and digital camcorders that require high bandwidth data communication. For example, digital SLR cameras will be able to shoot longer continuous bursts in the highest quality RAW format. Similarly, with these cards, it will be possible to download a 2.4GB video in only 70 seconds.
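
As a rough sanity check of that last figure, here is a minimal sketch, assuming the quoted 35MB/s sustained write speed and decimal units (1GB = 1000MB); real-world throughput depends on the host device and file system:

```python
# Estimate the time to write a file to the card at the quoted sustained speed.
# Assumes decimal units (1 GB = 1000 MB); actual throughput varies by host.

def transfer_time_seconds(size_gb: float, speed_mb_per_s: float) -> float:
    """Seconds needed to move size_gb gigabytes at speed_mb_per_s megabytes/second."""
    return size_gb * 1000 / speed_mb_per_s

print(transfer_time_seconds(2.4, 35))  # ~68.6 s, consistent with "about 70 seconds"
```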

The SDXC card is the next-generation standard defined by the SD Association in January 2009. The new standard applies to cards with capacities over 32GB and up to 2TB, compared to the SDHC standard, which applies to cards with capacities over 2GB and up to 32GB.
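
A small helper makes those capacity tiers explicit (a sketch only; classify_sd_card is a hypothetical illustration, not part of any SD Association specification):

```python
def classify_sd_card(capacity_gb: float) -> str:
    """Classify a card by its capacity tier as described by the SD standards above."""
    if capacity_gb <= 2:
        return "SD"    # standard capacity, up to 2GB
    if capacity_gb <= 32:
        return "SDHC"  # over 2GB and up to 32GB
    if capacity_gb <= 2048:
        return "SDXC"  # over 32GB and up to 2TB
    raise ValueError("Beyond the 2TB limit of the SDXC standard")

print(classify_sd_card(64))  # SDXC
```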

The high level specifications and wide range of memory cards announced by Toshiba will further open the way for developers to bring applications to future generations of consumer products. By further enhancing its SD Memory Card lineups with larger capacity and a higher data transfer rate, Toshiba will continue to meet market demand and to lead the NAND flash memory market.

(1) SDXC Memory Card realizes the 64GB capacity, the largest capacity yet available in the market.

(2) Supports UHS104, a new ultra high speed interface in the new SD Memory Card Standard Ver. 3.00, which provides 104MB per second bus speed on the SD interface, and realizes maximum write speed of 35MB per sec., with a read speed of 60MB per sec.

(3) UHS104 also provides the conventional SD interfaces (3.3V DS at 25MHz and HS at 50MHz), so new SDHC Memory Cards with UHS104 are interoperable with existing SDHC host devices.

(4) Integrates highly secure CPRM copy protection technology.

Projectors are now considered one of the best media for sharing information through presentations. To cater to people's different needs, various types of projectors with advanced features are available in the market. The main types are:

Desktop Projectors: these projectors are handy and portable. As the name indicates, they fit on a desktop. A good quality desktop projector offers superb features, including interchangeable lenses, zoom capabilities, stereo sound and high-definition pictures.

Home theatre projectors: these projectors come with digital units that can be connected to all the audiovisual equipment at home. They are portable and offer S-video, composite and component connectors, plus a built-in speaker.

Overhead Projectors: these remain among the most popular and effective devices for displaying projected images. Portable models are widely used in schools and for business purposes.

Installation Projectors: these projectors are used to show images in large venues such as conference halls, museums and auditoriums. They are remarkable for their ability to project an image of up to 25 feet. A good installation projector features zoom capabilities, high-resolution reproduction, interchangeable lenses, tilt adjustment, stereo speakers, a quiet cooling fan, an on-screen display, a remote control and a mouse.

Slide projectors: popular for decades and still in use, these devices project photographic slides. The latest slide projectors come with a pop-up screen for instant portable viewing.

Opaque Projectors: the oldest type, originally designed as an artist's enlarging device. They allowed artists to trace projected images and transfer them onto canvas.

Source: http://www.saveonprojectors.com

Until now, most consumer PCs have run on software from one of two companies: Microsoft or Apple.
But on Wednesday, search giant Google (GOOG) shook up the computing world by formally announcing plans to compete head-to-head against those companies on their home turf: PC operating systems.

Google gave notice that it's developing its own PC operating system initially targeted at netbooks, those pint-size, inexpensive PCs currently selling like hot cakes. Google is meeting with hardware manufacturers and hopes to have it on computers by the second half of 2010.


Google's goal is to be the opposite of today's operating systems — especially Windows, which commands 90% of the market. The ubiquitous software has a reputation as virus-prone and complicated. Google says its Google Chrome Operating System will be faster, smoother and lightweight.

An outgrowth of Google's Chrome Internet browser, the OS is designed "to start up and get you onto the Web in a few seconds," Google said in its official blog post announcing the product. Google says it can achieve that by building a system from the ground up, one that isn't constrained by working with a legacy system initially built in the 1980s.

Now, all it must do is execute.

Unlike Windows, Chrome is an open-source project like the Linux operating system that's popular with techies, which means outside software developers are welcome to work on it. And Google believes developers who have a stake in the project will find a way to bring Chrome to a wide variety of PCs quickly, says a person with direct knowledge of Google's intentions, who isn't authorized to speak on behalf of the company.

Love it or hate it, Microsoft (MSFT) sells some 400 million copies of Windows annually. PC manufacturers —Dell (DELL), Hewlett-Packard (HPQ), Lenovo, Acer, Toshiba and more — offer Windows on most PCs. When Microsoft comes out with a new operating system — as it will in the fall with Windows 7 — that's what most consumers get when they purchase a new PC. Microsoft declined to comment on Google's announcement.

But Google's operating system will be free, compared with the average $45 per machine manufacturers pay for Windows.

"Microsoft has a real problem," says Charles Wolf, an analyst at Needham & Co. "HP can now say to Microsoft, 'We've got a great operating system (Google) that doesn't cost us anything — what are you going to do about it?' " Linux, too, is sometimes free, but it can be hard to use.

Still, for consumers, "The learning process of any change is so substantial, most people will resist it, unless Google can really show a compelling reason," says Phil Leigh, an analyst for Inside Digital Media. "Most will stick with what they know."

The battle is on

Google has been locked in a battle with Microsoft for years.
Microsoft urges consumers to use its MSN.com as a home page on the Web, to make its new Bing (formerly Live) their search engine of choice, and to use its Internet Explorer browser, effectively bypassing Google.

Google — the most visited website worldwide — countered last year with Chrome, its own browser. It says some 30 million people are using it now.

Don't like Microsoft's Office software? Google offers online word processing, spreadsheet and presentation programs that are free.

Microsoft, which has been trying to catch up to Google's dominance of search advertising (5.5% vs. 72% market share in April, according to Hitwise), recently launched Bing, a well-received search overhaul that's been advertised heavily.

In reaching for Microsoft's cash cow, the operating system, analysts say, Google is in for a tough haul.

"Google will find that it's much harder than it looks," says Roger Kay, president of Endpoint Technologies Associates. "There're all those drivers and devices that have to be supported."

Microsoft has huge customer service departments. As anyone who's ever tried to contact Google knows, there are no customer service reps to call on the phone.

Microsoft is unaccustomed to having operating system competitors, but Kay says "it will do whatever it can to fight back."

Chrome isn't Google's first operating system. With more consumers conducting searches on mobile devices, Google launched Android, an operating system for phones.

The clash of tech titans has rekindled questions about whether either has what it takes to diversify beyond their respective core businesses. Microsoft, for instance, continues to derive some 80% of its revenue from selling the Windows operating system and Office suite; this despite pouring billions into search advertising, online services, video games and other businesses.

Similarly, Google gets 97% of its revenue from online advertising, despite multiple attempts to diversify with Google Apps, instant messaging, photo-editing software and Android.

Android could get to a netbook before Chrome does: Upstart PC maker Acer announced in June that it would begin selling Google netbooks in October based on Android. Acer declined to comment for this story.

Taking it online

Trip Chowdhry, an analyst at Global Equities Research, says Google will begin getting netbook customers by targeting the 60 million users of its Gmail e-mail service. "The influencing power will be on the company that can provide a branded and exceptional online experience."

Microsoft sells 400 million copies of Windows yearly. "Can that 400 million become 800 million?" asks Chowdhry. "Not likely. Can Google's 60 million grow to 1 billion? Yes."

Matt Rosoff, an analyst at research firm Directions on Microsoft, says Google will need to counter Microsoft's strong marketing and consumer support with efforts of its own.

"It will need to devote serious marketing resources to explain to average consumers, not just tech enthusiasts, why they'd want this new OS," he says.

Analysts see Apple (AAPL) getting hurt by Google's challenge, as well.
A Google netbook at $300 would be $700 less than Apple's current entry-level laptop, the $999 MacBook.

"The growth in the market is coming from netbooks, and Apple's been missing that," says Gene Munster, an analyst at Piper Jaffray. "We believe Apple will respond with a netbook in the first quarter of next year — but it will be more expensive than Google's."

Meanwhile, Microsoft is going to be anything but quiet this year. It can fall back on deep relationships with software developers and retailers. And it will almost certainly tweak pricing and features of Windows 7 to compete, Rosoff says.

From USA TODAY

Dell today (27 July 2009) reinforced its commitment to lead the industry in energy efficiency by announcing the company’s broadest line of U.S. Environmental Protection Agency (EPA) Energy Star 5.0 compliant desktops, workstations and portables. The company also unveiled its Client Energy Savings Calculator, which allows customers to assess and optimize the power consumption and the potential energy savings of their client infrastructures.


  1. Select configurations of the Dell OptiPlex, Dell Precision, and Latitude commercial client product lines offer customers the widest variety of Energy Star 5.0 compliant systems to help save money and reduce CO2 emissions. Almost all of the desktop, workstation and portable products designed by Dell today consume less than 5 watts in a low-power mode, exceeding the current levels set by the EPA for energy efficiency. Systems include:
    *  Dell Latitude 2100, E4200, E4300, E5400, E5500, E6400, E6400 ATG, XT2 and E6400 XFR.
    *  Dell Precision M2400, M4400, T3500, T5500, T7500, R5400.
    *  Dell OptiPlex 760, 960, 360, 160 and FX160.
  2. Dell is also helping customers reduce energy costs through flexible computing, an alternative computing model in which data and computing power are centrally hosted and accessed from multiple client devices. It aims to address the needs of both end users and IT by using client-server (network-based) computing and virtualization technologies to improve manageability, data security, compliance and disaster recovery.
  3. Dell’s new Client Energy Savings Calculator enables businesses to calculate the power consumption and energy savings available to their organizations using Dell OptiPlex, Vostro and Latitude systems. The online tool analyzes configuration options including power supplies, processors, disk drives and storage, memory, graphics adapters and power management. Part of what makes this energy calculator unique is that it enables users to analyze power consumption in different countries and regions; a simplified example of this kind of estimate is sketched after this list.
  4. OptiPlex customers have the opportunity to have their energy management settings enabled in the factory, making it easier to reduce energy costs right out of the box.
  5. Dell’s Energy Star 5.0-compliant OptiPlex 960 enables up to 43 percent less power consumption than the previous generation of OptiPlex desktops, comes in packaging that is up to 89 percent recyclable and contains at least 10 percent post-consumer recycled plastic on the small form factor model.
  6. Recently, Dell also announced that two of its PowerEdge servers meet the EPA’s new Energy Star specifications. These actions are part of Dell’s overall strategy to help customers save money and energy from the desktop to the data center.
  7. Dell also ranked No. 1 in the first TBR Sustainability Index Benchmark Report. Dell led the computing sector and scored especially well in renewable energy use and recycling.
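
As an illustration of the kind of estimate the Client Energy Savings Calculator mentioned in item 3 produces, here is a minimal Python sketch; the wattages, usage hours and electricity price below are assumptions for illustration only, and Dell's actual tool models configuration options, power management policies and regional prices:

```python
# Hypothetical back-of-the-envelope estimate of annual client energy cost.
# All figures below are assumptions, not Dell-published numbers.

def annual_energy_cost(active_watts, idle_watts, active_hours_per_day,
                       idle_hours_per_day, price_per_kwh, days=260):
    """Estimate yearly electricity cost for one client machine."""
    daily_kwh = (active_watts * active_hours_per_day +
                 idle_watts * idle_hours_per_day) / 1000
    return daily_kwh * days * price_per_kwh

# Example: compare a desktop left fully on outside work hours against one
# that drops to a ~5 W low-power mode, at an assumed $0.12 per kWh.
always_on = annual_energy_cost(70, 70, 8, 16, 0.12)
with_sleep = annual_energy_cost(70, 5, 8, 16, 0.12)
print(f"Estimated annual saving per desktop: ${always_on - with_sleep:.2f}")
```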
