Further info is available via the FujiFilm site.
China's Anti-Satellite Weapon Test Bothers U.S. And Allies
What has come as a surprise to Western analysts is the sophistication of the launch and the weapon's ability to track and home in on a tiny satellite in the vastness of space. The US is particularly concerned because the test demonstrates that China has the capability to knock out the military satellite system the Pentagon depends on for navigation and surveillance. Military analysts believe that a successful attack on 40 to 50 satellites in low Earth orbit would seriously compromise US defense capabilities within a few hours.
Many feel that China has a right to challenge the US monopoly of space. They point out that China has repeatedly urged the US to sign agreements outlawing arms in space, which Washington has refused to do. The reason the Bush administration gives for opposing a global ban on such tests is that the US reserves its freedom of action in space. Arms control experts are unsure whether the test was intended to press the Bush administration into a global weapons treaty or whether China was simply asserting its own interests in space. The timing is significant: it came a few months after the Bush administration unveiled a doctrine asserting America's right to take action against any perceived threat in space.
Russia dismissed reports of the test as a rumor, but countries such as Canada, Britain, Japan and Australia have no doubt that it took place and plan to take up the matter with the Chinese government. China, for its part, denied that it had carried out any test and repeated its opposition to weapons in space.
Although the test poses no immediate threat to the US, it has once again demonstrated that China is extending its economic and commercial power into military areas. There is a strong lobby in the Republican administration which perceives China as the main threat to the US and believes that a confrontation could take place over Taiwan in the future. EU nations, for their part, are concerned that trade restrictions may be put in place that would affect exports of hi-tech equipment to China.
The test led Gen. T. Michael 'Buzz' Moseley, the Air Force chief of staff, to order a wide-ranging review of the vulnerability of US military satellites. What can the US do to protect its satellites?
High-orbit satellites are safer, as it takes several hours for a killer weapon to reach them. Low-orbit satellites, on the other hand, are difficult to protect. One approach is to have such satellites follow sufficiently unpredictable orbits so that an enemy cannot launch a weapon on a collision course with them; another is to accompany them with decoys in nearby orbits. Deploying a large number of small satellites is also an option, since it may not be technologically possible to knock out hundreds of satellites without allowing time for retaliation. The ability to launch replacement satellites quickly would also be an important deterrent. Laser weapons able to detect and destroy such threats in time are still the stuff of science fiction.
Love And Sex With Robots Are Inevitable
This is because robots will become so human-like in appearance, function and personality that many people will fall in love with them, have sex with them and even marry them. It may sound a bit weird, but it isn't, Levy says. Love and sex with robots are inevitable. Levy argues that psychologists have identified roughly a dozen basic reasons why people fall in love, and almost all of them would apply to human-robot relationships.
Levy goes on to argue that when it comes to sex and love with robots, the ethical issues of how to treat them are something we'll have to consider very seriously, and this is a complicated question. Levy successfully defended his thesis on October 11.
Microsoft Boards Internet TV Bandwagon
Presently, several broadcasters allow you to download content for a small fee. But this requires that the file be downloaded in full before it can be viewed, which can take time. Streaming, on the other hand, means a continuous flow of data that can be viewed before the whole file has been received. It will therefore open in a couple of seconds, but it is unlikely to remain saved on your computer.
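To make the distinction concrete, here is a minimal Python sketch (the URL and the play() helper are placeholders, not part of any broadcaster's service) contrasting a full download with chunked streaming, where playback can begin as soon as the first chunks arrive:

```python
import requests

VIDEO_URL = "http://example.com/show.mp4"  # placeholder URL

# Download model: the whole file must arrive before anything can be played.
def download_then_play(url):
    data = requests.get(url).content   # blocks until the entire file is received
    play(data)                         # playback starts only now

# Streaming model: data is consumed in chunks as it arrives.
def stream_and_play(url):
    with requests.get(url, stream=True) as response:
        for chunk in response.iter_content(chunk_size=64 * 1024):
            play(chunk)                # playback can begin after the first chunk

def play(buffer):
    """Stand-in for a real decoder/player; here it just reports what arrived."""
    print(f"received {len(buffer)} bytes")
```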
So far, the barriers to wider use of Internet TV have included poor streaming technology. There is nothing more annoying than watching a favorite program and losing the stream midway. Added to this is the problem of poor video quality due to bandwidth limitations; the BBC launched the Dirac project to address this, but it is still under development. The underlying reason is that internet infrastructure was not originally designed to carry video. This problem, however, looks set to be overcome by combining new P2P technology with the latest caching technology, so that the bandwidth of the two can be pooled.
Given its future potential, Internet TV is now attracting the attention of heavyweight corporations, including Microsoft. On 28 September, Microsoft launched the beta version of its Windows Media Center Internet TV feature. According to Microsoft, US-based users of Vista Home Premium and Vista Ultimate will soon be able to download an update that will allow them to enjoy TV and video content on their PCs and TV sets; the PC will not require a TV tuner for this. The streaming content will be supported by an advertising platform provided by YuMe, so it seems it will be free, at least to start with. Available programming includes TV shows such as 'Arrested Development', music from artists such as Chris Cornell and the Pussycat Dolls, news from MSNBC, sports from FOX Sports, and a host of other programs. The service will be compatible with Microsoft and third-party 'media extender' devices for Windows Media Center. These devices wirelessly connect a TV to a PC, delivering TV, music and online services to any TV set in a home and, according to Microsoft, are designed to deliver the ultimate entertainment experience to every TV set in your home. Microsoft is thus set to take on YouTube, Joost and Apple's iTunes in the Internet video segment.
With problems of resolution and reliability likely to be solved, Internet TV is set to take off in a big way.
The $100 Laptop Is Almost Here
What makes these machines so cheap? For one, they use a dual-mode LCD display. These low-cost computers also contain flash memory instead of a hard drive and use Linux as their operating system. Although Steve Jobs, CEO of Apple, offered to supply free copies of the company's operating system, OS X, for the machine, MIT declined because it is not open source; the designers clearly want an operating system they can tinker with. Flash memory is a non-volatile computer memory that can be electronically erased and reprogrammed. Its advantage is that it retains stored information even when the power is cut off. Non-volatile memory is usually used for secondary, long-term persistent storage, as it has limitations that render it unsuitable for use as primary storage; for primary storage we still have to rely on a volatile form of random access memory. The first computers will be powered by an AMD microprocessor. Costs are expected to fall significantly when mass production begins.
The computers, now called XO laptops, will initially cost $188 to produce, but project officials insist that $100 remains the long-term goal. Presently, the aim is to sell them to governments in large volumes. At the same time, there are plans to introduce a commercial version that will sell for around $200, with the profits going to subsidize the educational project.
The big computer companies, whose initial reaction was to laugh at the project, are now watching developments with interest. The machines are scheduled to go into production in October this year, and the first computers will hit US stores in January or February 2008.
Although various governments have shown interest and are even running pilot projects to evaluate the machines and to find ways to distribute them, very few have so far made any financial commitments. Achieving the goal of distributing millions of these machines is still some distance away.
Why Is Apple Slashing Prices Of iPhones?
Ten weeks on, people agree that most of the praise is justified, as is most of the criticism. It does things no phone has ever done before, yet it lacks features found in even the simplest and cheapest phones.
The iPhone costs $499 for the model with 4 gigabytes of memory and $599 for the one with 8 gigabytes. It is thin, sleek and sturdy. If you think it is too expensive, remember that the price includes a cell phone, video player, iPod, e-mail terminal, camera, web browser, Palm-type organizer and an alarm clock. Of all the features built into it, the cell phone is perhaps the least important part.
But the best part is its superb software. It's fast and simple to operate, and you don't get lost. On the flip side, you can't install new programs from anyone but Apple. The camera takes excellent photos but no video. The major problem, however, is that Apple is in an exclusive deal with AT&T, whose network simply isn't good enough and in fact ranks almost at the bottom. This is expected to improve over time. All in all, it rates as a good buy.
And here's the latest piece of good news from Apple. Clearly with an eye on holiday sales, Apple cut the price of its iPhone on Wednesday. The price of the $599 model was slashed to $399, and the $499, 4-gigabyte model has been discontinued.
Apple claims it wants to widen the market for its popular phone, which is widely considered simply too expensive. But one would do well to remember that this perception comes from judging it as a mere cell phone, when the cell phone is in fact the least important part of the gadget.
Steven P. Jobs, the CEO of Apple, claimed that the company would have met its sales target of a million phones in the US by the end of September even without a price cut. But experts are not so sure. They are quick to point out that the Motorola Razr, until now the world's best-selling cell phone, hit the market at $499 but now sells for less than $100; the cell phone market is like that, and steep price cuts are normal. They suspect that Apple was finding its sales beginning to stall.
Many people who bought the iPhone before the price cut was announced feel cheated now that prices have been slashed barely 10 weeks after its launch. Apple has announced that those who bought an iPhone within the last 14 days can claim a full refund if they have not opened it; if they have opened it, they can still get a refund of the price difference.
Whatever the reason for the price cut, it's a Merry Christmas from Apple.
Data Theft and How to Stop It
Most data theft is deliberate and is usually carried out by employees who are leaving their company. Protecting oneself from one's own employees is a major challenge, since employees can use a wide range of sophisticated devices that simply plug into office equipment to do the job; an insignificant-looking device such as a pen drive has the potential to cause large losses to an enterprise. Surprisingly, most of these employees feel no guilt about stealing data; many, in fact, feel they have a right to use it since they helped the company collect it in the first place. Some of the most serious data theft occurs when hackers breach a computer's security systems and gain unauthorized access to records. Data theft can also be accidental: laptops are frequently stolen, usually from parked cars, and often these laptops belong to senior executives of important companies and contain vital information.
The data usually stolen takes the form of databases of names, addresses and contact details, which rivals can then use to steal business from a competitor. Sometimes the stolen data also includes credit card details, bank account information, social security numbers and the like; this is more serious, as it can lead to identity theft and the theft of a person's money. Finally, even trade secrets can be targeted. The most celebrated case came in March 2005, when Lexar was awarded almost $400 million after it was proved that its trade secrets had been stolen and used by no less a company than Toshiba.
Data theft is occurring more frequently now because there is a ready market for such information on the one hand, and because companies have paid insufficient attention to data security on the other. The most dangerous trend is that what was once done just for kicks is now being done for money, and organized crime appears to have entered the scene.
One reason companies have been slow to react is that there are no laws making companies that collect such data accountable for its security. Although customers who are victims of bank or credit card fraud usually don't lose money, the company that collected the data is not held responsible for protecting it. And returning a victim's money is not sufficient in itself: the damage caused, say, to a credit history may take years to repair.
So how do companies protect themselves? The basic measures they need to adopt are multi-layered identification procedures for customers and, more importantly, encryption of stored data.
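As an illustration of that second measure, here is a minimal sketch, assuming Python and the widely used cryptography library, of encrypting a customer record before it is stored; the record contents and file name are hypothetical:

```python
from cryptography.fernet import Fernet

# In practice the key would live in a secrets manager or HSM, never next to the data.
key = Fernet.generate_key()
cipher = Fernet(key)

record = b"name=Jane Doe;card=4111111111111111;ssn=000-00-0000"  # hypothetical record

# Encrypt before writing to disk or to a database column.
with open("customer_record.enc", "wb") as f:
    f.write(cipher.encrypt(record))

# Decrypt only when the data is actually needed, after the user has authenticated.
with open("customer_record.enc", "rb") as f:
    restored = cipher.decrypt(f.read())
assert restored == record
```

A stolen laptop or pen drive then yields only ciphertext unless the attacker also obtains the key.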
But all these things take time and money, which means lower profits. It is therefore time state and federal regulators stepped in and made sensible laws to protect the public.
Jurassic Park - It Is Possible Now!
Now, about a decade later, one suddenly learns that scientists have succeeded in reviving, in the laboratory, microbes recovered from deep below the Antarctic glaciers. These microbes are as much as eight million years old.
This achievement raises all kinds of questions. First and foremost, scientists could by accident bring to life some creature, or even some disease, to which the modern human race has no answer. It is also possible that, as the polar ice gradually melts due to climate change, these organisms could revive on their own. This again highlights the necessity of preserving our environment and preventing climate change.
Finally, the ability of organisms to survive such harsh conditions raises the possibility that similar organisms exist on other planets in our solar system. The scary possibility is that, instead of finding organisms on planets similar to our own, we may end up finding them on far-flung planets with far more extreme characteristics than we had bargained for.
These are not scary thoughts meant merely to panic people, but to emphasize the need for sensible action on the matter.
Apple Releases iTunes 7.3.2
Apple released an update for iTunes on the 2nd of August 2007. According to Apple: "iTunes 7.3.2 provides bug fixes to improve stability and performance." Apple has not provided any other details.
A casual look at the new application does not reveal any major upgrades. However, users say that the music application now remembers the selected artist(s)/album(s) after navigating back to the library from a playlist.
iTunes 7.3.2 can be downloaded from Apple's Web site.
Top Back-to-School PCs
1. Dell Inspiron 531
Cost = $709, or $899 if you take a 19-inch widescreen LCD monitor.
2. Apple MacBook
Cost = $1099
3. Fujitsu Lifebook P7230
Cost = $1,799
4. Velocity Micro Vector GX Campus Edition
Cost = $999
5. HP Pavilion dv2500t
Cost = $1,049
6. Gateway E-100M (Core Duo)
Cost = $1,574
7. eMachines T5226
Cost = $550
Robots in Warfare
It is this importance attached to winning wars that has driven much of the development of modern robots. In ancient times the general led from the front, and history shows that in times of war nations paid a heavy price in generals killed in battle. In modern times battlefield leaders are somewhat safer, advancing behind protective barriers of steel and artillery fire. Still, casualties are high.
The ultimate dream weapon is a metal soldier that can fight the enemy, thus minimizing human casualties. This is where modern robots can step in. Developments in robotics have ensured that robots are no longer confined to performing repetitive mechanical tasks in a workshop or factory. Although Bill Gates has said that personal robotics today is at the stage personal computers were at in the mid-'70s, that comparison raises the distinct possibility of an artificial human being within the next generation.
One can imagine the advantages of an army that is factory-produced in various shapes and sizes according to the mission to be accomplished. There would be virtually no human casualties and the supply would be unlimited. Generals and other leaders would be miles away from the battlefield in safe locations; in fact, the conduct of war would be in the hands of computer geeks rather than army generals. Presently, huge resources are required to train army recruits and to supply food and medical aid on the battlefield. This would become a thing of the past: training would be provided by a silicon chip, and solar power would perhaps take care of the rest.
Even today, robots and mechanical devices perform many functions, notably 'drones', which are even capable of firing missiles. The possibilities for their use in tasks such as clearing minefields, or operating in areas with a high risk of chemical warfare or even more lethal forms of combat, are endless.
Although the advantages are immense, they have to be tempered with a word of caution. Once we have machines that have been given a sense of perception and the power to decide to harm others, what happens if something goes wrong?
Although we have not reached this stage as yet, this is a dilemma we may soon have to face and address.
Nuclear Energy - Its Future in the US
There is no doubt that human progress depends on the availability of cheap and abundant energy. Conventional energy sources will not only run out in the foreseeable future but are also responsible for severe ecological damage, with as yet unknown consequences for the human race. New technologies like solar power and hydrogen fuel are still in a nascent stage and far from perfect; solar energy, moreover, is very costly. Indeed, it is difficult at present to imagine them displacing conventional fuels in the foreseeable future.
Given this scenario, it is only natural that nuclear energy is being given a second look. Not only are the raw materials and technology available, they are available on a large enough scale to make it a sustainable alternative. It is estimated that, using breeder reactors, we could have plenty of energy for billions of years.
So what is holding the US back from pursuing a more aggressive nuclear energy program? Cost, for one: a single nuclear power plant costs several billion dollars to build. Then there is the security angle. A nuclear power plant is very vulnerable to enemy attack and to natural disasters, and given the rise of global terrorism this risk is magnified several times over; the consequences of a "kamikaze"-type attack would be catastrophic. Finally, there is the problem of dealing with spent fuel. Reprocessing of spent fuel is discouraged by many countries in the mistaken belief that this will prevent nuclear proliferation, but it is now very clear that, with the dissemination of technical know-how, many countries are capable of building a bomb. This raises the question of whether advanced nuclear powers such as the US should actively involve themselves in those countries' peaceful nuclear programs. Not only would this ensure the use of safe technology, it might also persuade those nations to abandon their military nuclear programs. Several nations, including the US, are meanwhile considering storing nuclear waste in deep underground repositories.
No matter what the present concerns might be, nuclear energy offers advantages that are difficult to overlook and it is only a matter of time before US lawmakers come out with a program to support its development.
"Stun Gun"- Science Fiction or Reality
Well, is it just pure science fiction, or is there an element of reality in it?
For your information, it is real. There are several kinds of electroshock weapons that temporarily disable a person with an electric shock. Some require physical contact, while others are effective even from a distance.
The principle of operation is a temporary high-voltage, low-current discharge that overrides the body's muscular control. The recipient feels great pain and can be temporarily paralyzed, but since the amount of current is low the weapon is supposed to be safe, that is, it won't kill. The subject will experience pain, muscular contraction, dizziness and collapse if exposed for a long time.
Presently in use by police forces around the world are electric shock prods, which require physical contact with the subject. Tasers are another option, and even long-range wireless electric shock weapons are available.
Although they are safer to use than guns that fire bullets, these weapons can be extremely dangerous if the subject suffers from a medical condition. They are also not very safe during practice sessions, when used on minors, or when there are flammable liquids around. Nor have they proved very effective against physically strong and determined assailants, as they take a moment to act.
Nevertheless, given the rising number of police shootings in the US, the development of this technology raises hopes of an effective enforcement system in which violence and fatalities are minimized.
IBM triples performance of World's Fastest Computer and breaks the "Quadrillion" Barrier
The result is a machine that towers over other systems. It enables science and commercial supercomputing to attack vital problems in ways never before possible -- modeling an entire human organ to determine drug interactions, for example. Drug researchers could run simulated clinical trials on 27 million patients in one afternoon using just a sliver of the machine's full power.
IBM researcher Shawn Hall inspects a new Blue Gene/P supercomputer. The IBM system will be capable of up to three thousand trillion calculations per second.
"Blue Gene/P marks the evolution of the most powerful supercomputing platform the world has ever known," said Dave Turek, vice president of deep computing, IBM. "A new group of commercial users will be able to take advantage of its new, simplified programming environment and unrivaled energy efficiency. We see commercial interest in the Blue Gene supercomputer developing now in energy and finance, for example. This is on course with an adoption cycle -- from government labs to leading enterprises -- that we've seen before in the high-performance computing market."
A Green Design Ahead of its Time
The Blue Gene supercomputer line was born from a visionary IBM initiative to develop a hugely scalable and highly reliable scientific computing platform. With Blue Gene, designers sidestepped two key constraints on state-of-the-art supercomputing -- power usage and space requirements. The Blue Gene supercomputer was purpose-built to fit in smaller spaces and use less electricity compared to other commercially available designs. Today, the Blue Gene/P supercomputer is at least seven times more energy efficient than any other supercomputer.
The influence of the Blue Gene supercomputer's energy-efficient design and computing model -- once considered exotic -- can be seen everywhere in the industry where people have attempted to lower energy use and get performance without traditional reliance on chip frequency. The breakthrough Blue Gene supercomputer design uses many small, low-power embedded chips, each connected through five specialized networks inside the system.
Some of the world's leading research laboratories and universities have already placed orders for Blue Gene/P supercomputers. The U.S. Dept. of Energy's Argonne National Laboratory, Argonne, Ill., will deploy the first Blue Gene/P supercomputer in the U.S. beginning later this year. In Germany, the Max Planck Society and Forschungszentrum Julich also plan to begin installing Blue Gene/P systems in late 2007. Additional Blue Gene/P system rollouts are being planned by Stony Brook University and Brookhaven National Laboratory in Upton, N.Y., and the Science and Technology Facilities Council, Daresbury Laboratory in Cheshire, England.
"We view the installation of the Blue Gene/P system as the next phase of a strategic partnership furthering advances in computation in support of breakthrough science," said Robert Rosner, director, Argonne National Laboratory.
At FZ Julich, where researchers have been using a Blue Gene/L machine for two years, a Blue Gene/P system will allow for more breakthrough science -- in such areas as particle physics and nanotech, for example -- while keeping the research facility within acceptable power budgets. "The big computing power at low electricity rates allows us to boost the performance of very complex and computationally intensive algorithms," said Thomas Lippert, director of the supercomputing center at FZ Julich.
Inside the Fastest Computer Ever Built
Like its predecessor, the Blue Gene/P supercomputer is a modular design, composed of "racks" that can be added as requirements grow.
Four IBM (850 MHz) PowerPC 450 processors are integrated on a single Blue Gene/P chip. Each chip is capable of 13.6 billion operations per second. A two-foot-by-two-foot board containing 32 of these chips churns out 435 billion operations every second, making it more powerful than a typical, 40-node cluster based on two-core commodity processors. Thirty-two of the compact boards comprise the 6-foot-high racks. Each rack runs at 13.9 trillion operations per second, 1,300 times faster than today's fastest home PC.
The one-petaflop Blue Gene/P supercomputer configuration is a 294,912-processor, 72-rack system harnessed to a high-speed, optical network. The Blue Gene/P system can be scaled to an 884,736-processor, 216-rack cluster to achieve three-petaflop performance. A standard Blue Gene/P supercomputer configuration will house 4,096 processors per rack.
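The quoted figures are internally consistent, as a quick back-of-the-envelope check (ours, not IBM's) shows:

```python
# Rough sanity check of the Blue Gene/P figures quoted above.
chip_gflops = 13.6        # billion operations per second per four-core chip
chips_per_board = 32
boards_per_rack = 32

board_gflops = chip_gflops * chips_per_board          # ~435 billion ops/s per board
rack_tflops = board_gflops * boards_per_rack / 1000   # ~13.9 trillion ops/s per rack

print(rack_tflops * 72 / 1000)     # ~1.0 petaflop for the 72-rack configuration
print(rack_tflops * 216 / 1000)    # ~3.0 petaflops for the 216-rack configuration

# Processor counts line up with 4,096 processors per rack.
print(4096 * 72)      # 294,912
print(4096 * 216)     # 884,736
```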
For Programmers, Friendlier Interfaces & Application Compatibility Speed Productivity
There are some key differences between Blue Gene/L and Blue Gene/P supercomputers. In hardware, the Blue Gene/P supercomputer moves to more (four vs two) and speedier (850 MHz vs 700 MHz) processors per chip; more memory and an SMP mode to support multi-threaded applications. This new SMP mode moves the Blue Gene/P system to a programming environment similar to that found in commercial clusters. The Blue Gene/P supercomputer dramatically scales up collective network performance to minimize common bottlenecks inherent in large parallel-computing systems. Software marks the third key upgrade for the Blue Gene/P solution -- system management, programming environment and applications support have all been refined in Blue Gene/P.
In Germany, a Blue Gene/P supercomputer will become the platform for new applications scaled for petaflop-level performance at the Max Planck Society. "The next-generation Blue Gene system will improve our capacity to prepare, develop and optimize applications from the Max Planck Society for future peta-scale computing," said Hermann Lederer, head of application support at Max Planck's RZG/Garching Computing Center.
The Blue Gene supercomputer operating system is based on the open-source Linux operating system. Applications are written in common languages such as Fortran, C and C++ using standards-based MPI communications protocols. The Blue Gene/P supercomputer is compatible with the diverse applications currently running on the Blue Gene/L supercomputer, including leading research in physics, chemistry, biology, aerospace, astrophysics, genetics, materials science, cosmology and seismology.
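For a flavour of what that programming model looks like, here is a minimal MPI sketch. It uses Python and the mpi4py bindings purely for brevity; the press release names Fortran, C and C++, so treat this as an illustrative analogue rather than Blue Gene-specific code:

```python
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()   # this process's index within the job
size = comm.Get_size()   # total number of processes in the job

# Each process computes a partial sum; rank 0 gathers the grand total.
partial = sum(range(rank, 1_000_000, size))
total = comm.reduce(partial, op=MPI.SUM, root=0)

if rank == 0:
    print(f"{size} processes computed total {total}")
```

Launched with a standard MPI runner (for example `mpiexec -n 4 python sum.py`, where sum.py is just an assumed file name), the same script runs unchanged on anything from a laptop to thousands of nodes, which is the point of the standards-based approach.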
A variety of independent software vendors have plans to port existing tools and applications to the Blue Gene/P supercomputer. These include Gene Network Sciences, TotalView Technologies, Inc., Tsunami Development LLC and Visual Numerics, developers of IMSL.
A robot that walks on water
The basilisk is well known for its ability to run across the surface of a body of water at a very fast rate of up to 1.5 metres per second. Check out this amusing video of the "Jesus Lizard" in action to see how it manages this. It's distinguished from other water-riding animals and insects by the fact that it doesn't use surface tension to keep it afloat, instead elevating and propelling itself by the slapping motion of its large, webbed feet.
The Water Runner Robot is designed to operate using the same principles. The CMU NanoRobotics research team spent a lot of time studying the motions of the basilisk to learn to mimic and then optimize the water-running motion to generate enough lift and thrust to sustain and move a robot far heavier than the lizard itself.
See the full research paper here (PDF).
Fujifilm announce 12-megapixel compact digital camera
The compact F50fd features a 3.0x optical zoom, with consumers seeking higher capabilities in this area having the option of another new member of the FinePix stable - the SLR-styled 8-megapixel S8000fd, which features an impressive 18x optical zoom. Among the key enhancements in the new F50fd is improved Face Detection. Like previous models, the camera can detect up to 10 human faces in a scene and automatically identify and optimize focus, exposure and white balance to ensure that the selected face is captured in the shot. This is achieved in just 5/100ths of a second no matter where the subjects are located within the frame, and unlike previous versions Face Detection does not require frontal shots - it can identify profiles at up to 90 degrees and work with angles up to 135 degrees if the subject is leaning forward or lying down.
The F50fd utilizes a 7th-generation FinePix Super CCD-HR chip and the RP Processor II, delivering optimum results in poorly lit situations with ISO settings of up to ISO 1600 at full resolution and ISO 6400 at reduced resolution (3MP or lower). The "Intelligent Flash" takes advantage of the high-speed capabilities by automatically adjusting its intensity to minimize wash-out, and the Dual Shot mode provides further versatility by shooting two images in rapid succession - one with the flash and one without - and saving both for later comparison. Red-Eye Removal functionality has also advanced, with automatic correction of red-eye immediately after taking the shot - in this instance both the original and corrected files are saved so that comparisons can be made later.
Viewing is via a 2.7-inch, high-resolution 230,000-pixel wide-angle LCD, and up to 100 small images can be displayed on the screen at one time to speed up navigation.
Usability is further enhanced by the inclusion of a Dual Image Stabilization mode - this combines a mechanically stabilized CCD sensor with high ISO sensitivities designed to deliver reduced blur from hand shake and subject movement.
The FinePix F50fd will debut in September 2007 at a retail price of US$299.95. Additional models include the FinePix S8000fd, an 8-megapixel, SLR-styled 18x optical zoom camera (US$399.95); the 8-megapixel, 4x optical zoom FinePix F480 (US$179.95); and the 9-megapixel entry-level FinePix A920 with a 4x optical zoom (US$199.95).
Other key specs of the FinePix F50fd include:
Intel accused of breaching European antitrust rules
The Commission sent Intel a Statement of Objections on Thursday, giving it 10 weeks to reply. A Statement of Objections is a formal accusation of antitrust violations.
Intel abused its position in three ways, according to the Commission: by offering rebates to PC manufacturers that buy the majority of their processors from Intel; by making payments to some manufacturers to encourage them to delay or cancel products using AMD processors; and by selling processors below cost when bidding against AMD for contracts with server manufacturers.
After Intel replies to the charges, it can request an oral hearing. If the Commission remains convinced that Intel has abused its market position, it can fine the company and order it to stop the anticompetitive practices, the Commission said in a statement Friday.
Representatives of the Commission, Intel and AMD could not immediately be reached for comment. AMD has already filed antitrust complaints against Intel in Europe and various other parts of the world.
Smart video advertising at the petrol pump – rolling out in 115 countries globally
Fuel sales account for 70 percent of convenience store revenues, yet 68 percent of the industry's gross profit dollars come from in-store sales, according to a recent survey. With only 53 percent of gas customers visiting the convenience store, there’s a clear opportunity to raise profits just by getting people in the door – and a new partnership between digital merchandising specialists EK3, software giant Microsoft and petrol pump manufacturer Dresser Wayne is about to go all-out to convert gasoline buyers into convenience store customers.
Using video screens built into the petrol pump apparatus, the partnership will display targeted video advertising to highlight in-store deals and promotions and draw people into the store. The EK3/Dresser Wayne partnership allows station owners of any size to customize promotions, offer printable coupons, build brand loyalty and engage in "day-parting" - such as promoting coffee and breakfast foods in the morning and soft drinks and snacks in the afternoon.
The media is loaded through a simple Web interface that enables new promotions to be put together and trialed very quickly – opening up opportunities for store owners to sell advertising space to other local businesses as a further revenue stream. Owners of multiple stores and national petroleum networks can “narrowcast” their nationwide or network-wide offers with similar ease.
Under the new partnership, the system will be rolled out in a massive 115 countries around the world through Dresser Wayne’s gas pump distribution networks. It’s already being utilized in some chains, such as BP sites and The Home Depot gasoline stations and the Army Air Force Exchange Services in the USA. It is due to launch in Europe in September.
Research Data Drives Effective Creative Strategy
It's that time again . . . time to get the ball rolling on your new membership recruitment campaign, or your seasonal ad campaign, or your annual meeting promotion. You need an idea, a direction, an inspiration to guide your creative mind to a result that will be executable, will reach and resonate with the intended audience, and come in within budget. Where do you turn? Hopefully, you turn to the potential customer, in the form of primary research.
The more you know about the audience for any marketing effort, the more effective that effort will likely be. You know the challenges they face, you know the mindset they use on a daily basis, you know what they need, and can make your concepts, copy and offers sing to the audience in a way that creates action, but only if you have the information you need. The way to get that information, in a reliable way that you can use to make decisions, is to be in regular contact with the audience. One of the most effective ways to do that is with periodic in-depth phone research.
Get a Reality Check
In-depth phone research, when combined with some written survey work on a periodic basis, can help you get an accurate feel for your members or target audience on an ongoing basis, unfiltered by the "pick the middle choice" phenomenon of printed surveys. Done in a truly blind fashion, where the audience has no idea your organization is behind the questions, customers feel secure enough to answer honestly and directly. Even so, most respondents in a small, highly specific prospect pool, especially in a member-based organization, figure out that the word will filter back to your organization eventually, so they feel that this may be an opportunity to air their gripes and get something done on their behalf without complaining directly to you. You can gather information on the positive side as well, since compliments are far rarer than complaints from customers or members of the organization.
Customer service benefits aside, true primary research generates not only anecdotal information on your current customers or members but also, if you include ex-customers or former members in your scheme, quantitative data that can be projected accurately over the entire audience or prospect pool. And that data is where the creative inspiration hides.
Draw Comparisons
Inspirational data often comes from the most unexpected numerical comparisons. Most marketing data mirrors the expectations that were built into the questions in the phone survey. In the face of that effect, there is often one set of data that stands out as an unexpected result, either very positive, or extremely negative compared to your own "feel" for that issue.
The other comparison that lends itself to driving a creative "hook" is the comparison between the data from your current constituents and your former constituents. Not only will this comparison show you what facets of your organization are working well and retaining customers, but it will also show some of the reasons why the ex-customers left. Those are the things you can address in your creative strategy to shore up those perceptions that could be discouraging potential customers from doing business with you.
Often an issue you feel is of little consequence turns out to mean an awful lot to the constituent audience. If you find that unexpected "key to their heart", that should inspire a creative approach that will yield considerable success. Both in the concept and in the copy, hitting that high note repeatedly based on solid research is usually a home run.
Careful reading and interpretation of that collected data is key to going in the correct direction. Sometimes additional follow-up research with a small but representative audience, drilling down on that unexpected issue, can generate additional, more leading data. That clarification can mean the difference between a home run and a whiff.
Occasionally, the opposite scenario plays out, and something you've been promoting as a benefit all along turns out to have little importance to the audience. That lack of "resonance" is a disconnect that you now know you can avoid in your copy. That frees up some room to play up the positive aspects you've verified with the research data.
Use The Data You Gather
Without the underpinnings of that research, there is little basis for decision-making in the creative process. The data can give you a more sturdy brand profile, it lets you make a persuasive case to senior management, and gives you something to backstop your creative direction. The temptation is often to take the data and twist it to meet the "gut feel" that exists in the collective mind of the organization.
Ignore the data at your own peril. If the study is conducted by professional researchers, and there are no clear flaws in the list of respondents and its reflection of the audience is accurate, then let the data drive your decisions.
The data doesn't lie. It's very easy to discount research data when you compare it to your own perceptions, or the preferred perception of the organization, and it doesn't match. It's tougher to stick to your guns, believe the data and act upon it. Once you see it work predictably and successfully, you learn to trust the numbers.
Prioritize the Issues
Once you have the data collected, and the analysis done, how do you make the leap to a creative direction? The secret is in the numbers. The basic strategy is that you determine the type of approach based on the read of the top 5 factors in the survey in order of importance. If the top three involve emotional issues, rather than the rational, or intellectual, then the creative approach leans toward a more emotional appeal.
For example, if the survey indicates that your organization is not producing results for customers in a particular area, maybe customer service or responsiveness - those are largely emotional issues, as no one likes to feel ignored or inadequately served, but they are not functional or operational issues within the organization's functional mission. The creative approach in that case might involve imagery and copy that plays upon the warm, service-oriented nature of the organization, a one-to-one approach that is more welcoming and almost apologetic. Of course, you can also pass the info on to the customer service department and improve operations there as well.
If you uncover among your top five factors that numerically your satisfaction level among customers is 3 times higher than your ex-customer dissatisfaction ratio, there's a set of numbers to crow about, and you can take a more rational, numerical approach to the concept and the copy - show you're keeping customers happy and keeping them longer than ever before. The data still drives the point home, and works to provide you with a creative direction, a springboard toward a winning concept that resonates with the audience.
Use A Metaphor
One of the simplest ways to make the leap from data to concept is to use a metaphor that explains what the data reveals. If you're trying to illustrate that your company grew its customer base by 200% in the last quarter, or that your customer satisfaction rating improved by 3x over the last year based on some changes you've put in place, show images of outrageous growth - beanstalks, elephants, Cyclops giants, etc. - or images of size disparity - big bones with little dogs, big sandwiches with little kids, an Oreo cookie so large it won't go in the glass of milk. The metaphor gives you a way to explain the concept the data revealed in a way the audience can relate to easily.
Now, on to those meeting ads, or those membership recruitment ads. Let the data be your guide in these cases as well. If your data shows that 80% of your members don't go to your annual meeting because it's too expensive, takes too much time away from the office, and the same people go every year so it's turned into a good ole' boys club, it's time to break out the big guns. They are not finding the value in your meetings. Time to fight the perceptions with your own reality and show the members in your ad or brochure that there are benefits to spending the money, taking the time away and meeting those good ole' boys face to face. Imagery in this case should be very rational, practical and businesslike, and copy should be extremely benefit-laden, addressing those concerns head on in a way the audience can relate to.
In many cases, if you get one good lead, one good tip, or meet one solid, useful connection at a meeting, you've made the trip a worthwhile endeavor. Now multiply that by the "possibilities" of the number of typical attendees (some latitude allowed here, no accountants in the wings), and show how the value multiplies with the number of participants - sort of a "you have to show up to win" type of approach.
Destination "X"
Ads focused on the destination are destined to fail for at least a portion of the audience, yet they persist and even proliferate in the member organization landscape. Everyone knows it's great to go to a meeting in "X" city, if you like that city, and if it has something inherently beneficial or relevant to the meeting's purpose. If not, you'll lose the folks who are farthest away and those that are the most cost conscious, almost automatically. No matter what city you pick, those two audiences are lost if the content isn't up to snuff. You can't have a meeting good enough to get them to go there. For those who are having trouble finding value in the content, the city is irrelevant. If the content is good and the results beneficial, you can have the meeting in a train station and people will attend.
Use Testimonials
For those organizations hunting for new members, there are many approaches where the data can give you some insights to follow. Testimonial approaches are a very strong framework from which to build value for prospective members. They humanize the organization, provide benefits the audience can relate to easily, and put a face to the issue of keeping members involved and active. Your research data sets showing the biggest challenges members or customers face are the key to crafting solid testimonials that answer these challenges.
You can use the top 3-5 problem areas the data reveals and create a series of ads or brochure pages featuring members explaining how their involvement in the organization helped them solve the problem or meet the challenge. They would be highly credible, they would show the organization at work, and they would outline very relevant benefits that would resonate with the audience to a high degree - all driven by a few questions in your phone research survey.
Use Everything Available
There are many creative approaches buried within your primary research, and there are many sources of data that can be used to augment, support and reinforce your primary data and the subsequent analysis. Member application data, tradeshow or annual meeting attendee data, industry atlases or SIC code studies published by the U.S. Department of Labor, can all shed light on your target population. There are other kinds of research as well that will generate data, including focus groups, written or e-mail surveys, web surveys, live interviews at meetings or tradeshows, and live long-form personal interviews at a research facility equipped with one way mirrors and camera equipment.
All these are viable forms of information gathering, and each has their place in providing you data you can use to form a creative approach to your outreach marketing. The key is to believe the numbers and use them in conjunction with your internal organizational knowledge to drive an effective creative strategy.
Video Marketing For Free Traffic
Using video marketing to drive traffic to your web site is an online marketing strategy many small business owners and internet marketers are beginning to embrace, with much success. Having a traditional website only allows you to reach those people who first find your website. However, combining video, social networking and some simple video marketing techniques can drive hordes of qualified visitors to your website.
First, let’s forget the silly videos you’ll find all over YouTube of kids running into fences and demonstrating the newest dance move. While that kind of video can bring in hundreds of thousands of views, it won’t bring the targeted traffic you need.
Instead, consider making a video tightly targeted towards your niche. A real estate agent might make a video introducing herself and showcasing a few of her available homes for sale. A night club might make a video “commercial” with soundbites from partygoers. A software developer might make a video demonstrating his latest application.
Because Google and other search engines are beginning to give videos hosted on sites like YouTube preferential search engine ranking, it’s quite possible your video could end up on the first page of search results for your targeted keyphrase. This is incredibly powerful and not to be overlooked, as this is what will make your video marketing efforts well worth the time you invest in them.
Consider that YouTube itself may not have a huge market of people looking for videos on “Oakland real estate.” But if your video titled “Oakland Real Estate” made the first page of Google search results (again, due to the preferential treatment video is receiving in the search engines), you’d benefit from the hundreds of people who search for that term in Google seeing your video as the #1 result and, in turn, watching your clip.
With the preferential treatment videos are receiving in search results, the question then becomes, “How do I move people from watching my video on YouTube to actually visiting my website?”
This is simple: bribe them at the end of your marketing video. What follows are some ideas:
* Offer them something for free at your site (a consultation, a report, free drink, demo version, MP3 download, etc.)
* Poll them or ask them a question they need to visit your site to answer. People love to give their opinion. You can combine this with the free offer above by giving them the freebie upon completion of the poll or question. This is invaluable for market research.
* Leave ‘em hanging. Don’t tell the whole story on your YouTube hosted video – instead, tell them just enough to incite curiosity. Then, instruct them to visit your site for the complete story or answer.
Each of these ideas is intentionally broad and can be refined and molded to fit your target market.
The key here is catching your audience while they’re hot - directly after they have watched your video - and giving them a reason to continue on to your website. The truth is that if you don’t, most will simply click through to another video or search result. Capitalize on their attention and tell them where to go and what to do next – you’ll be surprised at how many will comply!
Even if you’re no Spielberg (I know I’m not), you can produce traffic-sucking videos with these simple methods.
Linux phone goes on sale
The site warns heavily that these phones are for developers, not the general public.
It's based on an official standard for a Linux mobile called Openmoko, although FIC does appear to be driving this standard.
There are basically two versions of the Neo – a Base model for $300 and an Advanced version for $450. Each comes in either black/silver or white/orange.
They're both very carefully described as 'developer preview' phones. That means that lots of bits haven't quite been integrated yet.
Perhaps the most important missing feature is integrated GPRS data access. The site also says that you shouldn't expect a reliable means of making phone calls, either.
Other fairly vital bits are missing too, including the ability to send or receive texts, proper Bluetooth integration and the ability to set network preferences.
But – hey – this is the open source world and it shouldn't be long before people start to work out how to fix such things.
Slightly more worrying, however, is the fact that integrated GPS (satellite) isn't mentioned although it was a major feature when the phone was first announced.
So only the brave will be logging onto the official Openmoko site to buy one.
Teensurance Tracks Teens on the Road
With the GPS unit installed, parents and teens can set speed, distance, and time limitations and be notified via text message, email, and phone calls if and when any are crossed.
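As a purely hypothetical sketch (not Safeco's actual implementation), the kind of rule check such a service might perform on each GPS report could look like this:

```python
from dataclasses import dataclass
from datetime import time

@dataclass
class Limits:
    max_speed_mph: float        # e.g. 65
    max_miles_from_home: float  # e.g. 25
    curfew: time                # e.g. time(22, 0)

def violations(speed_mph, miles_from_home, clock_time, limits):
    """Return the limit violations for a single GPS report."""
    alerts = []
    if speed_mph > limits.max_speed_mph:
        alerts.append(f"speed {speed_mph} mph exceeds {limits.max_speed_mph} mph")
    if miles_from_home > limits.max_miles_from_home:
        alerts.append(f"{miles_from_home} miles from home exceeds {limits.max_miles_from_home}")
    if clock_time > limits.curfew:
        alerts.append(f"driving after curfew ({limits.curfew})")
    return alerts   # the real service would push these out by text, email or phone call

# Example report: 72 mph, 10 miles out, at 10:30 pm, against a 65 mph / 25 mile / 10 pm policy.
print(violations(72, 10, time(22, 30), Limits(65, 25, time(22, 0))))
```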
Surveys show "every 16 year old and 17 year old thinks they are a better driver than mom and dad, but they get easily distracted," says Jim Havens, Safeco's vice president of customer solutions. With cell phones in the hands of inexperienced drivers, there are even more ways for new drivers to become distracted. A survey of more than 1,000 16- and 17-year-old drivers by AAA finds that 61 percent of teens admit to risky driving habits; of them, 46 percent say they text message while driving and 61 percent say they talk on cell phones.
With my oldest a few years away from driving, suddenly a tool like this makes some sense—if it's used as a way to help teens monitor and adjust their driving behavior as they take to the road. If parents use it to revoke driving privileges at the first sign of a surpassed speed limit, then it's just a Big Mother tool and not very useful.
Of course, it is easy to disable. But parents will get a message telling them when Teensurance is offline, Havens said. He says the company has anecdotal evidence that the system is helping teens become more aware of their driving behavior and adjust it when needed. Still, there is not enough data yet to prove that its effect on teens warrants lower premiums for families who use the $14.99-a-month service. For the extra $15, families with teens also get the ability to unlock a door remotely if keys are locked in the car, and access to roadside assistance in case of an emergency.
Congress to Examine Google-DoubleClick Deal
Within days of the deal’s announcement in April, companies including Microsoft, AT&T and some in the advertising industry, began to complain that the merger of Google and DoubleClick would limit competition in the online advertising market. Privacy groups, meanwhile, voiced concerns about the deal’s impact on consumer privacy. In May, the Federal Trade Commission began an investigation into the proposed merger.
Now, a subcommittee of the Senate Judiciary Committee is planning to call a hearing to explore the antitrust and privacy issues raised not only by the Google deal but also by recent consolidation in the online advertising market, according to a person familiar with the planned hearing.
Bobby L. Rush, the Illinois Congressman who is chairman of the House Energy and Commerce Committee subcommittee on consumer protection, said he had opened an investigation into the privacy and competition issues raised by the Google-DoubleClick deal and also planned to call a hearing.
“There is widespread concern about the proposed merger between Google and DoubleClick that the Federal Trade Commission currently is reviewing,” Mr. Rush wrote in a letter to the commission, which is posted on his Web site. “I share these concerns and am writing to notify you that the subcommittee is considering holding a hearing when an appropriate date becomes available.”
Without addressing the planned hearings directly, Google said in a statement that it believed that the deal would not harm competition and would withstand scrutiny.
No date has been set for either the House or Senate hearings.
The Google-DoubleClick deal precipitated a wave of consolidation in the online advertising industry, including Microsoft’s proposed acquisition of aQuantive, a DoubleClick rival, and Yahoo’s acquisition of Right Media, which runs an online advertising marketplace.
But while those last two deals were quickly cleared by antitrust regulators, the Google-DoubleClick merger has drawn more intense scrutiny.
Google, which dominates the business of placing text ads alongside search results and on sites across the Web, is expected to capture 27.4 percent of the $21.7 billion in United States online advertising in 2007, according to eMarketer, a research firm. The acquisition of DoubleClick would turn Google into a dominant player in the business of serving banners and other graphical ads that appear on Web sites.
Xbox chief defects to games firm
Peter Moore oversaw the launch of the Xbox 360
For the past four years Mr Moore has been the public face of Microsoft's Xbox and PC gaming business, and oversaw the launch of the Xbox 360.
He will join Electronic Arts as the head of its sports games division which makes some of its most popular titles.
He will be replaced by Don Mattrick, a former EA senior executive who has worked as a consultant at Microsoft.
The news about Mr Moore comes only weeks after Microsoft announced it would be spending $1.15bn to fix faulty Xbox 360 consoles.
Microsoft said nothing should be read into the timing of Mr Moore's departure.
On joining the game firm Mr Moore will receive a $1.5m golden handshake to offset future bonuses he was due from Microsoft.
At the EA division he will oversee the development of popular game franchises such as Madden NFL football, NBA Live and Fifa Soccer. About one-third of EA's revenue comes from sales of sports-related games.
Mr Moore, a Liverpudlian, joins EA shortly after a major re-organisation that saw it split into four divisions in a bid to become more competitive. In its last quarter, EA reported losses of $25m.
Before joining Microsoft in 2003, Mr Moore was president of Sega America and prior to that head of marketing at Reebok International.
He is scheduled to join EA Sports in September, whilst Don Mattrick will be on Microsoft's full-time payroll in August.
How to Find A Cheap Digital Camera
Advances in modern science have ushered in a wave of new technology for the world to enjoy. In the old days, photographers used actual bulbs for camera flash, and only photographers carried cameras, because lugging them around wasn’t exactly fashionable at the time. Continued development produced the digital camera, with which taking pictures isn’t so cumbersome anymore. It is less expensive because you can see the images before printing them, so you can choose what to actually print, and the images can also be uploaded to your computer for storage and further manipulation. These days, it’s not unusual to carry a compact digital camera. It’s perfect for capturing those random wacky moments with your friends.
The only problem, it seems, is finding a cheap digital camera. Can you even find one? Because of the features they offer, digital cameras are often costly, which is enough to make anyone have second thoughts about buying one. But for someone who considers photography a passion, affordability is relative. Set a budget before buying a camera; you can find a cheap digital camera that is just right for you if you look hard enough. Don't buy one that costs more than you can afford, even if it has a lot of features, and make sure you will actually use those features so you get your money's worth. Consider your lifestyle and your objectives: do you plan to spend a lot of time taking pictures, or do you just want something small that can fit in your bag? If you are still a beginner, don't buy a high-end professional camera just for the assurance of image quality and zoom performance. Instead, buy a cheap digital camera that is compact, easy to carry around and still has powerful features. Explore the basics before splashing out on an expensive professional camera.
Though these are very important points to ponder when buying a cheap digital camera, you also have to consider the camera's performance and features. Check the megapixels, zoom capability, image quality, type of media and type of battery; these are the features that digital camera advertisements tend to highlight.
· Megapixels – They are not the be-all and end-all of digital cameras. Salespeople like to throw this number at you because it promises clearer images, but it is just one factor among many. It also matters how those pixels are captured: most image sensors detect only one color (red, green or blue) at each photosite rather than all three at once, so the headline megapixel count doesn't tell the whole story.
· Zoom capability – You've seen advertisements touting 10x digital zoom or 5x zoom capability. While those figures may be accurate, advertisers rarely highlight optical zoom, which is actually more important. The difference is that digital zoom simply crops and enlarges part of the frame, so the image breaks into visible pixels when you enlarge it on your computer; with a high optical zoom, the lens does the magnifying and the picture stays sharp when enlarged (a rough arithmetic sketch of the difference follows after this list).
· Image quality – Check the quality of your image after you take a picture. Is it fuzzy or pixelated? Sharpness of colors is very important.
· Type of media – This is the memory of your digital camera. Find a memory card or stick that is compatible with your other equipment so it is easier to upload your images.
· Type of battery – See to it that your cheap digital camera doesn't require expensive proprietary batteries, or that it can at least run on rechargeable ones.
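To make the digital-versus-optical zoom point concrete, here is a rough, purely illustrative Python sketch (the 8-megapixel figure and the zoom factors are made-up examples, not recommendations) of why heavily digitally zoomed shots look pixelated: digital zoom crops the frame, so the pixels covering your subject shrink with the square of the zoom factor.

    def effective_megapixels(sensor_mp, digital_zoom):
        # Digital zoom crops the centre of the frame, so the pixels that
        # actually cover the subject shrink with the square of the zoom factor.
        return sensor_mp / (digital_zoom ** 2)

    for zoom in (1, 2, 4):
        mp = effective_megapixels(8.0, zoom)  # hypothetical 8-megapixel compact
        print(f"{zoom}x digital zoom leaves about {mp:.1f} MP of real detail")
    # Prints 8.0, 2.0 and 0.5 MP. Optical zoom, by contrast, magnifies through
    # the lens and keeps the full sensor resolution.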
These points should help you decide what kind of cheap digital camera to buy. As mentioned earlier, affordability is relative for the passionate photographer. Find your niche by choosing the best camera your budget allows.
Microsoft Photo Technologies Aim Big
Developers who work in the company's research arm showed off the technologies on Tuesday during the Microsoft Research Faculty Summit in Redmond, Washington.
HD View is one photo project that definitely has the "wow" factor.
The technology allows users to combine hundreds of photos to create one massive picture that users can zoom in on to see clear details. In one example, a panoramic photo of the city of Seattle combines 800 images, each 8 megapixels in size, stitched together into a 3.6 billion-pixel image.
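(A quick bit of arithmetic, shown below as a small Python sketch, suggests why the finished file is smaller than the raw pixel count of the source frames; the gap is presumably the overlap needed to stitch neighboring shots together, which is my inference rather than anything stated at the demo.)

    frames = 800
    pixels_per_frame = 8_000_000            # 8 megapixels per source photo
    raw_pixels = frames * pixels_per_frame  # 6.4 billion pixels captured in total
    stitched_pixels = 3_600_000_000         # size of the finished panorama
    overlap = 1 - stitched_pixels / raw_pixels
    print(f"{raw_pixels / 1e9:.1f} bn captured vs {stitched_pixels / 1e9:.1f} bn "
          f"stitched, roughly {overlap:.0%} apparent overlap between frames")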
On a computer screen, it looks just like a panoramic photo. So, what's the point of combining so many photos? The massive file includes incredible detail.
Michael Cohen, a researcher at Microsoft working on the project, zoomed in to the roof of a building where a clay owl peers around a corner. With the picture zoomed out, a viewer doesn't even see a pin prick in the spot where the owl sits.
Another large photo of a mountain in Canada looks like a standard nature snapshot. But Cohen zoomed in to discover that climbers are scaling the rock wall. After finding the first climber, he followed the climbing ropes up to find the second one above him on the wall. When the photo is zoomed out, it's hard to imagine there might be climbers on the wall.
Microsoft offers the tool to build HD View photos for free on its Web site. Creating an HD View panorama image, however, isn't for just anyone. Such images are quite large and may require special cameras.
Another Microsoft project, unveiled last year and built in collaboration with the University of Washington, collects images of a site such as Rome's Trevi fountain from public photo-sharing Web pages such as Flickr. The Photo Tourism technology combines the photos into a 3D image so users can look at the object from any view. The idea was to take advantage of the potentially billions of images that are online, said Noah Snavely, a researcher at the University of Washington who works on the project with Microsoft researchers.
Microsoft also demonstrated at the summit some experiments with Virtual Earth. Eyal Ofek, a Microsoft researcher, demonstrated a 3D map of San Francisco that is made up of 10 million images, including 50,000 aerial photographs as well as pictures taken at street level. All the photos are stitched together so a user can navigate from a bird's-eye view seamlessly down to street level. The view is different from the street view capability in Google Maps, which doesn't combine the street-level pictures with aerial shots.
About 800 workers are developing projects at Microsoft Research. Some technologies they develop may become commercial Microsoft products, and others could be sold to other companies. The summit, which ended Tuesday, was an opportunity for Microsoft and its partners in academia to show off some of their projects.
In Battle of Consoles, Nintendo Gains Allies
Inspired by the early success of the Wii, the companies that create and distribute games are beginning to shift resources and personnel toward building more Wii games, in some cases at the expense of the competing systems: the PlayStation 3 from Sony and Xbox 360 from Microsoft.
The shift is closely watched because consumers tend to favor systems that have many compelling games. More resources diverted to the Wii would mean more games, and that would translate into more consumers buying Wii consoles later.
Jon Goldman, chairman and chief executive of Foundation 9 Entertainment, an independent game development company, said that he was hearing a growing call for Wii games from the publishers and distributors that finance the games that his firm creates. “Publishers are saying: Instead of spending $15 million or $20 million on one PS3 game, come back to me with five or six Wii pitches,” he said.
“We had one meeting two weeks ago with a publisher that was asking for Wii games,” said Mr. Goldman, who declined to identify the video game publisher that he met. “Three or four months ago, they didn’t want to hear Word 1 about the Wii.”
Nintendo said that titles would be coming from several major developers, like Activision and Ubisoft, that are making an enhanced commitment to the platform.
The interest in the Wii follows a period of uncertainty about the console by developers and publishers. They were initially cautious because the Wii was less technologically sophisticated, and they worried that consumers would not take to its unorthodox game play, which uses a motion-controlled wand that players move to direct action on the screen. For example, to serve balls in the tennis game, players circle their arms overhead as they would in real tennis.
History gave developers and publishers reason for caution, too. Nintendo’s last system, the GameCube, was initially a hot seller, but was ultimately outsold — and by a considerable margin — by the PlayStation 2 and Xbox. Also, Nintendo has historically made many of the popular games for its own systems, in a way that has discouraged heavy participation by other developers and publishers.
The shift does not represent any shunning of the Xbox or Sony consoles, but rather an elevation of the Wii’s status — one that was clear in many conversations with developers and publishers at E3, the video game industry’s annual trade show in Santa Monica, Calif.
It is early in the current console product cycle, given that these machines are intended to be on the market for more than five years. Industry analysts say they do not expect to declare a victor anytime soon. Nevertheless, the trend is clear: Nintendo is getting growing support from game developers.
“We’re seeing a big shift at E3,” said John Davison, editorial director of 1UP Network, a network of video game Web sites and magazines, “and we’ll see more later this year.” He said he was seeing some game publishers putting less emphasis on the PlayStation 3. “But they’re not going to talk about that,” he added.
Since its first appearance in stores in November, the Wii has been outselling the Xbox 360 and the PS3 (which came out the same month as the Wii), and it continues to be in short supply. The NPD Group, a market research firm, reported that as of May, Americans had purchased 2.8 million Wii systems, compared with 1.4 million PS3s. About 5.6 million Xbox 360 consoles have been sold, but that console hit the market a year earlier.
The Wii has clearly benefited from a price advantage; it costs $250, compared with $300 for the least-expensive Xbox 360 and $479 for the top-of-the-line machine. The PS3 sells for $500, after a price cut by Sony to clear inventory in advance of the Christmas selling season, when its new $600 device will be offered. Microsoft has been hampered of late by widespread product failures, and the company said it would spend $1.15 billion to repair individual machines.
While the growing size of the Wii’s customer base is attractive, developers are favoring Wii for other reasons. They are able to create games in less time than is needed for rival systems, because Wii’s graphics are less complex.
Colin Sebastian, a video game industry analyst with Lazard Capital Markets, said that in rough terms, it cost around $5 million to develop a game for the Wii compared with $10 million to $20 million to make a game for the Xbox 360 or PS3. Mr. Sebastian said that given the cost differences, a developer would need to sell 300,000 copies of a Wii game to break even, compared with 600,000 of a game for the PS3 or Xbox 360.
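(As a back-of-the-envelope check, those break-even figures imply roughly the same net revenue per copy in each case, which is presumably the assumption behind them; the small Python sketch below simply reuses the numbers quoted above and is not anything Lazard published.)

    # Implied net revenue per copy, using only the figures quoted above.
    wii = 5_000_000 / 300_000        # about $16.7 per Wii copy to break even
    hd_low = 10_000_000 / 600_000    # about $16.7 per PS3/Xbox 360 copy (low end)
    hd_high = 20_000_000 / 600_000   # about $33.3 per copy at the $20 million end
    print(round(wii, 2), round(hd_low, 2), round(hd_high, 2))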
“Wii development costs certainly are cheaper than the other consoles,” said Scott A. Steinberg, a vice president for marketing at the game developer Sega of America. The company has a number of original Wii projects under development and uses 15 to 25 programmers to develop a Wii title, compared with 50 or more for a PS3 or Xbox 360 game.
Because of its simpler graphics, development times for Wii games are also shorter. A Wii game can be created in as little as 12 months, said Kelly Flock, executive vice president for worldwide publishing at THQ, a video game developer based in Agoura Hills, Calif. Games for the two competing consoles typically take two to three years.
He said that the budget for a Wii game ranges from $1.5 million to $4 million, compared with the $10 million to $12 million the company spends on a PS3 or Xbox 360 game.
“The Wii is a godsend,” Mr. Flock said. “We are aggressively looking for more Wii titles.”
By this holiday season, Nintendo will have added 100 games to its existing 60 titles. Sony has said that it will double the number of titles for the PS3 to 120 by the end of March, while Microsoft said it would have 300 titles for the Xbox 360 by the Christmas selling season. “I don’t think you’ll see any big shifts to one platform because you’re supporting so many,” said Kathy Vrabeck, president of the casual entertainment division of Electronic Arts. That said, she added that there had been a clear shift in mood at the company toward the Wii.
“There is a clear sense of excitement about the Wii at E.A.,” she said.
George Harrison, Nintendo’s senior vice president for marketing, said, “Electronic Arts is doing much more for us than they have in the past.”
Sony counters that, to some extent, Wii developers, publishers and game players will get what they pay for: games with less-complex graphics.
“There is some truth to the fact that you can make games for Wii for less than the PS3,” said Peter Dille, senior vice president for marketing at Sony. “But we still believe that our job is to develop big-budget games.”
5 Ways To Improve Your Adsense Earnings
Those who have been there and done it have some useful tips to help anyone who wants to venture into this field. Some of these tips have boosted a lot of earnings in the past and continue to do so.
Here are five proven ways to improve your Adsense earnings.
1. Concentrate on one format of Adsense ad. The one format that has worked well for the majority is the Large Rectangle (336x280). This format tends to produce a higher CTR, or click-through rate. Why choose this format out of the many available? Basically because the ads will look like normal web links, and people, being used to clicking on such links, will click on these too. They may or may not realize they are clicking on your Adsense ads, but as long as there are clicks, it all works to your advantage.
2. Create a custom palette for your ads. Choose colors that go well with the background of your site. If your site has a white background, use white for the ad border and background as well. The idea behind matching the colors is to make the Adsense unit look like part of the web page. Again, this will result in more clicks from people visiting your site.
3. Move the Adsense units from the bottom of your pages to the top. Do not try to hide your Adsense; put it where people will see it quickly. You will be amazed at the difference ad placement can make when you see your earnings.
4. Maintain links to relevant websites. If you think some sites perform better than others, put your ads there and keep maintaining and managing them. If a site already carries a lot of Adsense units, place yours above all of them, so visitors see your ads first when they browse that site.
5. Automate the insertion of your Adsense code into your web pages using SSI (server side includes). Ask your web administrator whether your server supports SSI. How do you do it? Save your Adsense code in a text file, upload it to the root directory of the web server, and then use SSI to call the code on other pages. This tip is a time saver, especially for those who use automatic page generators to build their websites.
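If your pages come from a static page generator and SSI is not available, the same idea can be approximated with a short script. The Python sketch below is only an illustration: the file name adsense.txt, the site/ folder and the AD PLACEHOLDER marker are hypothetical names you would adapt to your own setup.

    # Inject a saved ad snippet into every generated page at a marker.
    from pathlib import Path

    AD_SNIPPET = Path("adsense.txt").read_text(encoding="utf-8")
    PLACEHOLDER = "<!-- AD PLACEHOLDER -->"

    for page in Path("site").glob("**/*.html"):
        html = page.read_text(encoding="utf-8")
        if PLACEHOLDER in html:
            page.write_text(html.replace(PLACEHOLDER, AD_SNIPPET), encoding="utf-8")

With real SSI you would instead reference the snippet once from each template and let the web server do the insertion, which saves you from re-running a script after every change.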
These are some of the tips that have worked well for people who want to generate hundreds or even thousands of dollars from their websites. It is important to know, though, that ads are displayed because they fit the interests of the people viewing them. Focusing on a specific topic should therefore be your primary aim, because the ads displayed will be targeted to the topic your visitors are already reading about.
Note also that there are many other Adsense publishers covering the same topic as you. It is best to make your pages somewhat different from, and more distinctive than, the ones already out there. Every click-through that visitors make is a point for you, so make every click count by making your Adsense placement something that people will actually want to click on.
Microsoft Research Explores Location Technologies
In one project, the researchers lent out cheap GPS (Global Positioning System) devices to drivers and asked them to leave the devices on the dashboards of their cars for a couple of weeks, said John Krumm, a researcher at Microsoft Research. He discussed the results of his work at the Microsoft Research Faculty Summit in Redmond, Washington, on Monday.
Krumm's group examined the data they collected from the GPS units for a number of different factors, including what time of day people were most often in their cars and where they most commonly were going at what times, such as to commercial or residential areas.
That data was perhaps most relevant to the group's efforts to create a model to predict where and when users would stop and get out of their cars. Krumm imagined a number of reasons why that information might be useful. For example, the provider of a navigation system might be able to predict that because a user is near the airport, the user is likely to go there, and so offer the user a coupon for airport parking.
An intern on Krumm's team is working on determining whether hybrid cars could use such predictive modeling, which could predict the length of a trip as well as hills and the speed of the vehicle during the trip, in order to efficiently allocate the car's resources.
Scott Counts, another researcher, is working on SlamXR, a community application that lets fitness enthusiasts share exercise routes. Users carry a small device that includes a range of sensors, such as a heart rate monitor, temperature sensor, altimeter, GPS receiver and Bluetooth radio. They can collect data along their favorite bike route, for example, and upload it to the SlamXR Web site, which shows the route on a map along with data such as speed and altitude. Other users can search for routes by difficulty, distance, target heart rate, elevation change and activity, and routes can be tagged for easier searching.
Technologies developed at Microsoft Research could become commercial Microsoft products, or the company may sell them to outside parties. More than 700 researchers work in the group in five labs around the world.
The event in Redmond is an opportunity for Microsoft Research workers to spend time with members of academia, in part to discuss issues in computer science research.
Google cookies will 'auto delete'
They will be deleted unless the user returns to a Google site within the two-year period, prompting a re-setting of the file's lifespan.
The company's cookies are used to store preference data for its sites, such as default language, and to track searches.
All search engines and most websites store cookies on a computer.
Currently, Google's cookies are set to expire in 2039.
Peter Fleischer, Google's global privacy counsel, said in a statement: "After listening to feedback from our users and from privacy advocates, we've concluded that it would be a good thing for privacy to significantly shorten the lifetime of our cookies."
He said the company had to "find a way to do so without artificially forcing users to re-enter their basic preferences at arbitrary points in time."
So if a user visits a Google website, a cookie will be stored on their computer and will auto-delete after two years. But each time the user returns to a Google service, the cookie will be re-set for a further two years.
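The behaviour described amounts to a sliding two-year expiry: every visit pushes the deletion date two years further out. The Python sketch below illustrates that policy only; the cookie name PREFS and its value are made up for the example and are not Google's actual implementation.

    from http.cookies import SimpleCookie

    TWO_YEARS = 2 * 365 * 24 * 60 * 60   # seconds

    def refresh_prefs_cookie(language):
        # Re-issue the (hypothetical) preference cookie with a fresh two-year
        # Max-Age, so the expiry date slides forward on every visit.
        cookie = SimpleCookie()
        cookie["PREFS"] = language
        cookie["PREFS"]["max-age"] = TWO_YEARS
        cookie["PREFS"]["path"] = "/"
        return cookie.output()

    # Prints a Set-Cookie header carrying the two-year Max-Age.
    print(refresh_prefs_cookie("en"))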
Privacy campaigners
Privacy campaigners want to give users more control over what the search giant holds on to and for how long.
Google has pointed out that all users can delete all or some cookies from their web browser manually at any time and control which cookies from which websites are stored on a computer.
There are also tools online which can prevent the company and other firms leaving cookies on a computer.
In recent months, it has introduced several steps to reassure its users over the use of personal information.
In March the search giant said it would anonymise personal data it receives from users' web searches after 18 months.
The firm previously held information about searches for an indefinite period but will now anonymise it after 18 to 24 months.
None of the other leading search engines have made any statements over anonymising IP addresses or shortening cookie lifespan.
Google Engineer Reveals New Tag & Best Strategies for Getting Indexed
We love it when Google engineers spill the beans... on just about anything Google, especially when it comes to revealing juicy secrets about improving your ranking position in Google. Or, better yet, the ways they recommend for getting your site indexed that go beyond the typical suggestions.
A recent thread on SEW discusses a post on High Rankings that covers what Dan Crow, Google's Director of Crawl Systems, has to say about getting your site lovingly indexed by the most popular search engine.
Some juicy tidbits from Dan:
New "unavailable_after" Tags - This little gem will allow webmasters the ability to tell Google when to stop indexing a page at a certain time. For example this might be useful for people with ecommerce sites with lots of coupons that have expired, older news items, and just about anything that is temporary and not permanent in nature on a site. While I really like this planning type tag, I don't see how much better it would be than just disallowing the page, or using a META robots tag to tell Google. Or better yet, using Webmaster Central URL removal feature. I see this tag probably having wide usage on news sites where there is a large number of pages that would need to expire at certain times.
Nosnippet & Noarchive tags - He details that these tags are not generally recommended, because he says "snippets are extremely helpful to visitors, as is showing the cache". Essentially these tags eliminate some problems associated with Google caching and improperly displaying the snippets below the titles in the search results. Google is fine with their use but would rather you not. This we know but its good to hear it again.
Avoid Walled Gardens - Dan used this term from the HR article, and I thought it a nice way to describe a group of pages that link only to each other and not to anywhere else. He said you could put one of the links from that group in a sitemap and Google would index it and follow the other links; I think pointing an external link at those pages would be a much better idea. He says "those pages would be likely to be indexed via the sitemap...but considered low quality since they wouldn't have any PageRank. Google is working on a way to change this in the future." Interesting.
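For readers who want to see what the tags themselves might look like, here is a small illustrative Python helper that emits them; the exact date format Google expects for unavailable_after should be checked against its own documentation, and the RFC-850-style date below is only an assumption for the example.

    from datetime import datetime

    def robots_meta_tags(expires, hide_snippet=False):
        # Hypothetical helper that builds the META tags discussed above.
        tags = ['<meta name="googlebot" content="unavailable_after: '
                f'{expires.strftime("%d-%b-%Y %H:%M:%S GMT")}">']
        if hide_snippet:
            # Generally discouraged, per the notes above, but supported.
            tags.append('<meta name="robots" content="nosnippet, noarchive">')
        return "\n".join(tags)

    print(robots_meta_tags(datetime(2007, 12, 31, 23, 59, 59), hide_snippet=True))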
How to Remove Spyware From Your PC
How do you know whether your PC has an active spyware infestation? Slower-than-normal performance is the most common symptom people report, but such behavior can also be due to any number of factors unrelated to spyware, such as running too many applications with too little system memory, having a full or very fragmented hard drive, or running buggy software that fails to free up the memory it uses after you close the application. Your first task is to determine whether you have a spyware-related problem or just a slow machine.
Start by scanning your system with one (or more) of these free tools:
- Microsoft's Malicious Software Removal Tool. This program is updated monthly, so always download the latest version before you use it.
- Microsoft's Windows Defender. Windows Vista has Defender built-in, but if you suspect that you have spyware on your PC, update the program so it can find the newest bad stuff.
- Avira Antivir PersonalEdition Classic, a free antivirus program--if you don't already have up-to-date antivirus software.
One of these three programs should detect and remove any spyware on your PC. In the unlikely event that you have picked up a brand-new specimen that isn't yet included in the antispyware databases, you'll have to do some cyber-investigating to find and eject the interloper.
If the program you want to remove from your PC doesn't have an entry in Windows' Add/Remove Programs applet in Control Panel, it has probably changed your Registry to make itself difficult to find and eradicate.
The nastiest spyware specimens--the worst of the worst--are rootkits. These programs hide themselves from Windows, from antispyware tools, and from utilities such as Process Explorer and Security Task Manager. If you suspect that a rootkit has invaded your PC, you still may triumph. A free utility called IceSword can find and remove many kinds of rootkits. The only downside (for all but about 1 billion of us)? The tool's instructions are in Chinese.