Nuclear Energy - Its Future in the US

Advances in nuclear technology, the rising prices of oil and gas, and increasing dependence on foreign oil have set US lawmakers thinking about whether the country's atomic energy program needs a fresh look. Concerns about global warming and the resulting climate change caused by conventional fuels have added urgency to the matter.

There is no doubt that human progress depends on the availability of cheap and abundant energy. Conventional energy sources will not only run out in the foreseeable future but are also responsible for severe ecological damage, with as yet unknown consequences for the human race. New technologies such as solar power and hydrogen fuel are still in a nascent stage and far from perfect; solar energy, moreover, remains very costly. Indeed, it is presently difficult to imagine these sources displacing conventional fuels in the foreseeable future.

Given this scenario, it is only natural that nuclear energy is getting a second look. Not only are the raw materials and technology available, they are available on a large enough scale to make nuclear power a sustainable alternative. It is estimated that with breeder reactors we could have plenty of energy for billions of years.

So what is holding the US back from pursuing a more aggressive nuclear energy program? Cost is one factor: a single nuclear power plant costs several billion dollars to build. Then there is the security angle. A nuclear power plant is vulnerable to enemy attack and to natural disasters, and given the rise of global terrorism this risk is magnified several times over; the consequences of a "kamikaze"-style attack would be catastrophic. Finally, there is the problem of dealing with "spent fuel." Reprocessing of spent fuel is discouraged by many countries in the mistaken belief that this will prevent nuclear proliferation, yet it is now very clear that, with the dissemination of technical know-how, many countries are capable of building a bomb. This raises the question of whether advanced nuclear powers such as the US should actively involve themselves in those countries' peaceful nuclear programs. Not only would this ensure the use of "safe technology," it might also persuade those nations to abandon their military nuclear programs. Several nations, including the US, are meanwhile considering storing nuclear waste in deep underground repositories.

No matter what the present concerns might be, nuclear energy offers advantages that are difficult to overlook and it is only a matter of time before US lawmakers come out with a program to support its development.

"Stun Gun"- Science Fiction or Reality

Anyone who has watched Star Trek or any other science fiction movie cannot help but be impressed by the small, highly effective stun guns used by the characters. It appears that the shooter doesn't even have to take aim: just point the weapon in the general direction of the enemy, fire, and that is the end of them.

Well, is this pure science fiction, or is there an element of reality to it?

For your information, it is real. There are several kinds of electroshock weapons that temporarily disable a person with an electric shock. Some require physical contact, while others are effective even from a distance.

The principle of operation is a temporary high-voltage, low-current discharge that overrides the body's muscular control mechanisms. The recipient feels great pain and can be temporarily paralyzed, but because the current is low the weapon is supposed to be safe, that is, it won't kill. The subject will experience pain, muscular contraction, dizziness and, if exposed for a long time, collapse.
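As a rough illustration of why a high-voltage, low-current discharge hurts without being lethal, the sketch below applies Ohm's law to purely hypothetical figures; the voltage, resistances and thresholds are illustrative assumptions, not the specifications of any real weapon.

```python
# Rough Ohm's-law illustration with hypothetical numbers; not the
# specifications of any real electroshock weapon.

pulse_voltage = 50_000.0         # volts, assumed open-circuit pulse voltage
body_resistance = 1_000.0        # ohms, assumed resistance across the contact points
source_impedance = 20_000_000.0  # ohms, assumed current-limiting impedance in the device

# Ohm's law: I = V / R_total
current_ma = pulse_voltage / (body_resistance + source_impedance) * 1000

print(f"delivered current: about {current_ma:.1f} mA")
# A few milliamps is enough to force painful muscle contraction, while
# sustained currents an order of magnitude higher are the range generally
# considered dangerous to the heart -- hence "high voltage, low current".
```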

Electric shock prods, which require physical contact with the subject, are presently in use by police forces around the world. Other weapons include Tasers, and even long-range wireless electroshock weapons are available.

Although they are safer than firearms, these weapons can be extremely dangerous if the subject suffers from certain medical conditions. They are also risky during practice sessions, when used on minors, or when flammable liquids are nearby. Nor have they proved very effective against physically strong and determined assailants, as they take a moment to act.

Nevertheless, given the rising number of police shootings in the US, the development of this technology raises hopes of an effective enforcement system in which violence and fatalities are minimized.

IBM triples performance of World's Fastest Computer and breaks the "Quadrillion" Barrier

The world of computing continually throws up feats that are difficult to comprehend. If the world’s fastest car or world’s tallest building were suddenly to be outperformed by a factor of three, we’d be incredulous, yet such quantum leaps have become routine in the world of computing. IBM’s new Blue Gene/P is the second generation of the world's most powerful supercomputer. It triples the performance of its predecessor, Blue Gene/L, while remaining the most energy-efficient and space-saving computing package ever built. Blue Gene/P scales to operate continuously at speeds exceeding one petaflop (one quadrillion operations per second) and can be configured to reach speeds in excess of three petaflops. The system is 100,000 times more powerful than a home PC and can process more operations in one second than a stack of laptop computers 1.5 miles high (don’t try this at home, folks).

The result is a machine that towers over other systems. It enables science and commercial supercomputing to attack vital problems in ways never before possible -- modeling an entire human organ to determine drug interactions, for example. Drug researchers could run simulated clinical trials on 27 million patients in one afternoon using just a sliver of the machine's full power.

IBM researcher Shawn Hall inspects a new Blue Gene/P supercomputer. The IBM system will be capable of up to three thousand trillion calculations per second.

"Blue Gene/P marks the evolution of the most powerful supercomputing platform the world has ever known," said Dave Turek, vice president of deep computing, IBM. "A new group of commercial users will be able to take advantage of its new, simplified programming environment and unrivaled energy efficiency. We see commercial interest in the Blue Gene supercomputer developing now in energy and finance, for example. This is on course with an adoption cycle -- from government labs to leading enterprises -- that we've seen before in the high-performance computing market."

A Green Design Ahead of its Time

The Blue Gene supercomputer line was born from a visionary IBM initiative to develop a hugely scalable and highly reliable scientific computing platform. With Blue Gene, designers sidestepped two key constraints on state-of-the-art supercomputing -- power usage and space requirements. The Blue Gene supercomputer was purpose-built to fit in smaller spaces and use less electricity compared to other commercially available designs. Today, the Blue Gene/P supercomputer is at least seven times more energy efficient than any other supercomputer.

The influence of the Blue Gene supercomputer's energy-efficient design and computing model -- once considered exotic -- can be seen everywhere in the industry where people have attempted to lower energy use and get performance without traditional reliance on chip frequency. The breakthrough BlueGene supercomputer design uses many small, low-power embedded chips each connected through five specialized networks inside the system.

Some of the world's leading research laboratories and universities have already placed orders for Blue Gene/P supercomputers. The U.S. Dept. of Energy's Argonne National Laboratory, Argonne, Ill., will deploy the first Blue Gene/P supercomputer in the U.S. beginning later this year. In Germany, the Max Planck Society and Forschungszentrum Julich also plan to begin installing Blue Gene/P systems in late 2007. Additional Blue Gene/P system rollouts are being planned by Stony Brook University and Brookhaven National Laboratory in Upton, N.Y., and the Science and Technology Facilities Council, Daresbury Laboratory in Cheshire, England.

"We view the installation of the Blue Gene/P system as the next phase of a strategic partnership furthering advances in computation in support of breakthrough science," said Robert Rosner, director, Argonne National Laboratory.

At FZ Julich, where researchers have been using a Blue Gene/L machine for two years, a Blue Gene/P system will allow for more breakthrough science -- in such areas as particle physics and nanotech, for example -- while keeping the research facility within acceptable power budgets. "The big computing power at low electricity rates allows us to boost the performance of very complex and computationally intensive algorithms," said Thomas Lippert, director of the supercomputing center at FZ Julich.

Inside the Fastest Computer Ever Built

Like its predecessor, the Blue Gene/P supercomputer is a modular design, composed of "racks" that can be added as requirements grow.

Four IBM (850 MHz) PowerPC 450 processors are integrated on a single Blue Gene/P chip. Each chip is capable of 13.6 billion operations per second. A two-foot-by-two-foot board containing 32 of these chips churns out 435 billion operations every second, making it more powerful than a typical, 40-node cluster based on two-core commodity processors. Thirty-two of the compact boards comprise the 6-foot-high racks. Each rack runs at 13.9 trillion operations per second, 1,300 times faster than today's fastest home PC.
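The per-chip, per-board and per-rack figures quoted above hang together if each 850 MHz core retires four floating-point operations per clock cycle; that per-cycle figure is an inference from the quoted numbers, not something stated in the article. A quick check:

```python
# Reproduce the Blue Gene/P arithmetic quoted above.
# Assumption (inferred, not stated in the article): each 850 MHz PowerPC 450
# core retires 4 floating-point operations per clock cycle.

cores_per_chip = 4
clock_hz = 850e6
flops_per_cycle = 4            # inferred: 13.6 GFLOPS / (4 cores * 0.85 GHz)

chip_flops = cores_per_chip * clock_hz * flops_per_cycle
board_flops = 32 * chip_flops  # 32 chips per two-foot-by-two-foot board
rack_flops = 32 * board_flops  # 32 boards per 6-foot rack

print(f"per chip : {chip_flops / 1e9:.1f} GFLOPS")    # ~13.6
print(f"per board: {board_flops / 1e9:.1f} GFLOPS")   # ~435
print(f"per rack : {rack_flops / 1e12:.1f} TFLOPS")   # ~13.9
```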

The one-petaflop Blue Gene/P supercomputer configuration is a 294,912-processor, 72-rack system harnessed to a high-speed, optical network. The Blue Gene/P system can be scaled to an 884,736-processor, 216-rack cluster to achieve three-petaflop performance. A standard Blue Gene/P supercomputer configuration will house 4,096 processors per rack.
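The system-level configurations follow directly from the per-rack numbers; a short sanity check, using only the figures quoted above:

```python
# Check the system-level configurations against the per-rack figures.
rack_tflops = 13.9        # trillion operations per second per rack
procs_per_rack = 4096     # processors per rack, as quoted above

for racks in (72, 216):
    pflops = racks * rack_tflops / 1000
    procs = racks * procs_per_rack
    print(f"{racks:3d} racks: ~{pflops:.1f} petaflops, {procs:,} processors")
# 72 racks  -> ~1.0 petaflops, 294,912 processors
# 216 racks -> ~3.0 petaflops, 884,736 processors
```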

For Programmers, Friendlier Interfaces & Application Compatibility Speed Productivity

There are some key differences between Blue Gene/L and Blue Gene/P supercomputers. In hardware, the Blue Gene/P supercomputer moves to more (four vs two) and speedier (850 MHz vs 700 MHz) processors per chip; more memory and an SMP mode to support multi-threaded applications. This new SMP mode moves the Blue Gene/P system to a programming environment similar to that found in commercial clusters. The Blue Gene/P supercomputer dramatically scales up collective network performance to minimize common bottlenecks inherent in large parallel-computing systems. Software marks the third key upgrade for the Blue Gene/P solution -- system management, programming environment and applications support have all been refined in Blue Gene/P.

In Germany, a Blue Gene/P supercomputer will become the platform for new applications scaled for petaflop-level performance at the Max Planck society. "The next-generation Blue Gene system will improve our capacity to prepare, develop and optimize applications from the Max Planck Society for future peta-scale computing," said Hermann Lederer, head of application support at Max Planck's RZG/Garching Computing Center.

The Blue Gene supercomputer operating system is based on the open-source Linux operating system. Applications are written in common languages such as Fortran, C and C++ using standards-based MPI communications protocols. The Blue Gene/P supercomputer is compatible with the diverse applications currently running on the Blue Gene/L supercomputer, including leading research in physics, chemistry, biology, aerospace, astrophysics, genetics, materials science, cosmology and seismology.
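The article names Fortran, C and C++ with MPI as the programming model; purely as an illustration of what an MPI program looks like, here is a minimal sketch using Python's mpi4py binding (the binding and the file name are my choices for illustration, not part of the Blue Gene software stack described above).

```python
# Minimal illustration of the MPI programming model mentioned above, using
# Python's mpi4py binding (the article itself cites Fortran, C and C++).
# Run with, e.g.:  mpirun -n 4 python mpi_hello.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()    # this process's id within the communicator
size = comm.Get_size()    # total number of processes launched

# Each rank contributes one value; the sum is collected on rank 0.
total = comm.reduce(rank, op=MPI.SUM, root=0)

print(f"hello from rank {rank} of {size}")
if rank == 0:
    print(f"sum of all ranks = {total}")
```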

A variety of independent software vendors have plans to port existing tools and applications to the Blue Gene/P supercomputer. These include Gene Network Sciences, TotalView Technologies, Inc., Tsunami Development LLC and Visual Numerics, developers of IMSL.

A robot that walks on water

The NanoRobotics team at Carnegie Mellon University (CMU) is working on a robot that walks on water, mimicking the Basilisk, or "Jesus Lizard", famous for its ability to dash across a water surface on its hind legs. Researchers see amphibious potential in the water-walking robot, as well as a possible efficiency boost in comparison to a boat, because a vehicle that runs across the surface of water experiences very little viscous drag. Computer simulations have been encouraging, demonstrating a few possible efficiency gains in the design and motion over the evolutionary model provided by the Basilisk, particularly with the option of using two or more sets of running legs. Several leg designs have been tested (see one in action in this video (MP4)) but the researchers are still working on an operating prototype.

The basilisk is well known for its ability to run across the surface of a body of water at a very fast rate of up to 1.5 metres per second. Check out this amusing video of the "Jesus Lizard" in action to see how it manages this. It's distinguished from other water-riding animals and insects by the fact that it doesn't use surface tension to keep it afloat, instead elevating and propelling itself by the slapping motion of its large, webbed feet.

The Water Runner Robot is designed to operate using the same principles. The CMU NanoRobotics research team spent a lot of time studying the motions of the basilisk to learn to mimic and then optimize the water-running motion to generate enough lift and thrust to sustain and move a robot far heavier than the lizard itself.
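As a back-of-the-envelope illustration of the lift problem the researchers are solving, the sketch below balances weight against the vertical impulse delivered by each foot slap; the robot mass and slap rate are hypothetical values, not figures from the CMU paper.

```python
# Back-of-the-envelope lift requirement for a water-running robot.
# All numbers are hypothetical illustrations, not values from the CMU paper.

g = 9.81            # m/s^2, gravitational acceleration
robot_mass = 0.10   # kg, assumed robot mass
slap_rate = 10.0    # foot slaps per second, assumed, all legs combined

# To stay on the surface, the time-averaged vertical force from the foot
# slaps must at least balance the robot's weight.
weight = robot_mass * g                # newtons
impulse_per_slap = weight / slap_rate  # newton-seconds each slap must deliver

print(f"weight to support     : {weight:.2f} N")
print(f"impulse needed / slap : {impulse_per_slap:.3f} N*s")
# A faster slap rate or extra sets of running legs spreads the load over
# more slaps, reducing the impulse each individual slap must provide --
# one motivation for trying two or more sets of legs.
```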

See the full research paper here (PDF).

Fujifilm announce 12-megapixel compact digital camera

Just when it looked like the headlong rush toward more megapixels had slowed in favor of greater optical zoom capabilities in the compact camera arena, Fujifilm has announced the upcoming release of the F50fd, a compact unit boasting a staggering 12-megapixel resolution. It crams in a raft of cutting-edge features, including Dual Image Stabilization, enhanced Face Detection and improved automated flash capabilities, among them a dual-shot flash mode that takes two shots in quick succession in low light, one with flash and one without, to ensure optimum results with minimum effort. The camera also takes full advantage of extremely sensitive ISO levels of up to 6400 (1600 at full resolution). The F50fd is among four new additions to the FinePix F series due for release in September and will retail for around US$300. Compare this to the more than $10K outlay required to purchase a 1.3-megapixel camera like the Kodak DCS-100 in the early 1990s and it's clear just how far the industry has come in less than two decades.

The compact F50fd features a 3.0x optical zoom; consumers seeking higher capabilities in this area have the option of another new member of the FinePix stable, the SLR-styled 8-megapixel S8000fd, which features an impressive 18x optical zoom. Among the key enhancements in the new F50fd is improved Face Detection. Like previous models, the camera can detect up to 10 human faces in a scene and automatically identify and optimize focus, exposure and white balance to ensure that the selected face is captured in the shot. This is achieved in just 5/100ths of a second no matter where the subjects are located within the frame, and unlike previous versions, Face Detection does not require frontal shots: it can identify profiles at up to 90 degrees and work with angles up to 135 degrees if the subject is leaning forward or lying down.

The F50fd utilizes a 7th-generation FinePix Super CCD-HR chip and the RP Processor II, delivering optimum results in poorly lit situations with ISO settings of up to ISO 1600 at full resolution and ISO 6400 at reduced resolution (3MP or lower). The "Intelligent Flash" takes advantage of the high-speed capabilities by automatically adjusting its intensity to minimize wash-out, and the Dual Shot mode provides further versatility by shooting two images in rapid succession, one with the flash and one without, and saving both for later comparison. Red-eye removal functionality has also advanced, with automatic correction of red-eye immediately after taking the shot; in this instance both the original and corrected files are saved so that comparisons can be made later.

Viewing is via a 2.7" high-resolution 230,000-pixel wide-angle LCD, and up to 100 small images can be displayed on the screen at one time to speed up navigation.

Usability is further enhanced by the inclusion of a Dual Image Stabilization mode - this combines a mechanically stabilized CCD sensor with high ISO sensitivities designed to deliver reduced blur from hand shake and subject movement.

The FinePix F50fd will debut in September 2007 at a retail price of US$299.95. Additional models include the FinePix S8000fd, an 8-megapixel, SLR-styled 18x optical zoom camera (US$399.95); the 8-megapixel, 4x optical zoom FinePix F480 (US$179.95); and the 9-megapixel entry-level FinePix A920 with a 4x optical zoom (US$199.95).

Other key specs of the FinePix F50fd include:

  • CCD Sensor: 1/1.6-inch Super CCD HR
  • Movie Format: AVI (Motion JPEG) with sound
  • Lens: Fujinon 3x optical zoom lens
  • Lens Focal Length: f=8.0mm - 24.0mm (equivalent to 35mm - 105mm on a 35mm camera; see the quick check after this list)
  • Aperture Range: F2.8 - F8
  • Digital Zoom: Approx. 8.2x (max)
  • Focus: Auto focus (Area, Multi, Center)
  • Color Control Modes: B/W, Standard, Chrome
  • Shutter Speed: 8 sec. to 1/2000 sec. (depending on shooting mode)
  • Power Source: NP-50 Li-ion battery (included) / CP-50 with AC power adapter AC-5VX (sold separately)
  • Camera Dimensions: 3.6 (W) × 2.3 (H) × 0.9 (D) in. / 92.5 (W) × 59.2 (H) × 22.9 (D) mm
  • Weight: Approx. 5.4 oz. / 155 g (not including accessories, batteries & xD card)
  • Shutter Lag Time: Approx. 1/100th sec.
  • Movie Recording: 640 x 480 pixels, 320 x 240 pixels, 25 frames/sec. with monaural sound
  • Memory Type: Internal memory (approx. 25MB) / xD-Picture Card (16MB - 2GB) / SD memory card (512MB - 2GB) / SDHC memory card (4GB - 8GB)
  • Further info is available via the FujiFilm site.
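As a quick check of the focal-length entry in the list above, the equivalent figures imply a consistent crop factor between the F50fd's sensor and 35mm film; the crop factor itself is an inference from the quoted numbers, not a figure published by Fujifilm.

```python
# Quick check of the focal-length figures from the spec list above.
# The crop factor is an inference from the quoted numbers, not a figure
# published by Fujifilm.

actual = (8.0, 24.0)        # mm, rated focal-length range
equivalent = (35.0, 105.0)  # mm, 35mm-equivalent range

for a, e in zip(actual, equivalent):
    print(f"{a:5.1f} mm -> {e:6.1f} mm equivalent (crop factor {e / a:.3f})")
# Both ends give the same ~4.4x crop factor, as expected for a fixed sensor,
# and 24/8 = 105/35 = 3x matches the quoted 3x optical zoom.
```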

    Intel accused of breaching European antitrust rules

    The European Commission has accused Intel Corp. of abusing its dominant position in the microprocessor market to exclude its rival Advanced Micro Devices Inc., in breach of European antitrust rules.

    The Commission sent Intel a Statement of Objections on Thursday, giving it 10 weeks to reply. A Statement of Objections is a formal accusation of antitrust violations.

    Intel abused its position in three ways, according to the Commission: by offering rebates to PC manufacturers that buy the majority of their processors from Intel; by making payments to some manufacturers to encourage them to delay or cancel products using AMD processors; and by selling processors below cost when bidding against AMD for contracts with server manufacturers.

    After Intel replies to the charges, it can request an oral hearing. If the Commission remains convinced that Intel has abused its market position, it can fine the company and order it to stop the anticompetitive practices, the Commission said in a statement Friday.

    Representatives of the Commission, Intel and AMD could not immediately be reached for comment. AMD has already filed antitrust complaints against Intel in Europe and in various other parts of the world.

    Smart video advertising at the petrol pump – rolling out in 115 countries globally

    Pay-at-the-pump petrol is a great convenience for drivers, but it sucks profit away from the retailer, who makes the majority of his profit from convenience store sales. Petrol buyers are the perfect target for point-of-sale advertising – they’re already out of their car, and what’s another five dollars of snacks or coffee on top of a $50 fill-up? To catch more of these disposable dollars as they come through, a global partnership between three market giants is about to target every petrol customer in 115 countries with a sophisticated video advertising campaign right at the pump, and it promises to change the gas station experience for good.

    Fuel sales account for 70 percent of convenience store revenues, yet 68 percent of the industry's gross profit dollars come from in-store sales, according to a recent survey. With only 53 percent of gas customers visiting the convenience store, there’s a clear opportunity to raise profits just by getting people in the door – and a new partnership between digital merchandising specialists EK3, software giant Microsoft and petrol pump manufacturer Dresser Wayne is about to go all-out to convert gasoline buyers into convenience store customers.
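    As a rough illustration of why that gap matters financially, the sketch below plugs purely hypothetical traffic, spend and margin figures around the survey's 53 percent visit rate; none of these other numbers come from the article.

```python
# Rough illustration with hypothetical figures of why pulling more fuel
# customers into the store matters. Only the 53% visit rate comes from the
# survey cited above; everything else is assumed.

fuel_customers_per_day = 1000   # assumed forecourt traffic
avg_store_spend = 5.00          # dollars per store visit, assumed
store_gross_margin = 0.30       # assumed in-store gross margin

def daily_store_profit(visit_rate):
    visits = fuel_customers_per_day * visit_rate
    return visits * avg_store_spend * store_gross_margin

base = daily_store_profit(0.53)      # survey figure
improved = daily_store_profit(0.60)  # e.g. pump advertising lifts visits to 60%

print(f"baseline in-store gross profit : ${base:,.0f}/day")
print(f"at a 60% visit rate            : ${improved:,.0f}/day "
      f"(+{(improved - base) / base:.0%})")
```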

    Using video screens built into the petrol pump apparatus, the partnership will display targeted video advertising to highlight in-store deals and promotions and draw people into the store. The EK3/Dresser Wayne partnership allows station owners of any size to customize promotions, offer printable coupons, build brand loyalty and engage in "day-parting" - such as promoting coffee and breakfast foods in the morning and soft drinks and snacks in the afternoon.

    The media is loaded through a simple Web interface that enables new promotions to be put together and trialed very quickly – opening up opportunities for store owners to sell advertising space to other local businesses as a further revenue stream. Owners of multiple stores and national petroleum networks can “narrowcast” their nationwide or network-wide offers with similar ease.

    Under the new partnership, the system will be rolled out in a massive 115 countries around the world through Dresser Wayne’s gas pump distribution networks. It’s already being utilized in some chains, such as BP sites and The Home Depot gasoline stations and the Army Air Force Exchange Services in the USA. It is due to launch in Europe in September.

    Research Data Drives Effective Creative Strategy

    It's that time again . . . time to get the ball rolling on your new membership recruitment campaign, or your seasonal ad campaign, or your annual meeting promotion. You need an idea, a direction, an inspiration to guide your creative mind to a result that will be executable, will reach and resonate with the intended audience, and come in within budget. Where do you turn? Hopefully, you turn to the potential customer, in the form of primary research.

    The more you know about the audience for any marketing effort, the more effective that effort will likely be. You know the challenges they face, you know the mindset they use on a daily basis, you know what they need, and can make your concepts, copy and offers sing to the audience in a way that creates action, but only if you have the information you need. The way to get that information, in a reliable way that you can use to make decisions, is to be in regular contact with the audience. One of the most effective ways to do that is with periodic in-depth phone research.

    Get a Reality Check
    In-depth phone research, when combined with some written survey work on a periodic basis, can help you get an accurate feel for your members or target audience on an ongoing basis, unfiltered by the "pick the middle choice" phenomenon of printed surveys. Done in a truly blind fashion, where the audience has no idea your organization is behind the questions, customers feel secure enough to answer honestly and directly. Even so, most respondents in a small, highly specific prospect pool, especially in a member-based organization, figure out that the word will filter back to your organization eventually, so they feel that this may be an opportunity to air their gripes and get something done on their behalf without complaining directly to you. You can gather information on the positive side as well, though compliments are far rarer than complaints from customers or members of the organization.

    Customer service benefits aside, true primary research generates not only anecdotal information on your current customers or members; if you include ex-customers or former members in your scheme, it also produces quantitative data that can be projected accurately over the entire audience or prospect pool. And that data is where the creative inspiration hides.

    Draw Comparisons
    Inspirational data often comes from the most unexpected numerical comparisons. Most marketing data mirrors the expectations that were built into the questions in the phone survey. In the face of that effect, there is often one set of data that stands out as an unexpected result, either very positive, or extremely negative compared to your own "feel" for that issue.

    The other comparison that lends itself to driving a creative "hook" is the comparison between the data from your current constituents and your former constituents. Not only will this comparison show you what facets of your organization are working well and retaining customers, but it will also show some of the reasons why the ex-customers left. Those are the things you can address in your creative strategy to shore up those perceptions that could be discouraging potential customers from doing business with you.

    Often an issue you feel is of little consequence turns out to mean an awful lot to the constituent audience. If you find that unexpected "key to their heart", that should inspire a creative approach that will yield considerable success. Both in the concept and in the copy, hitting that high note repeatedly based on solid research is usually a home run.

    Careful reading and interpretation of the collected data is key to heading in the correct direction. Sometimes additional follow-up research with a small but representative audience, drilling down on that unexpected issue, can generate further, more pointed data. That clarification can mean the difference between a home run and a whiff.

    Occasionally, the opposite scenario plays out, and something you've been promoting as a benefit all along turns out to have little importance to the audience. That lack of "resonance" is a disconnect that you now know you can avoid in your copy. That frees up some room to play up the positive aspects you've verified with the research data.

    Use The Data You Gather
    Without the underpinnings of that research, there is little basis for decision-making in the creative process. The data can give you a more sturdy brand profile, it lets you make a persuasive case to senior management, and gives you something to backstop your creative direction. The temptation is often to take the data and twist it to meet the "gut feel" that exists in the collective mind of the organization.

    Ignore the data at your own peril. If the study is conducted by professional researchers, and there are no clear flaws in the list of respondents and its reflection of the audience is accurate, then let the data drive your decisions.

    The data doesn't lie. It's very easy to discount research data when you compare it to your own perceptions, or the preferred perception of the organization, and it doesn't match. It's tougher to stick to your guns, believe the data and act upon it. Once you see it work predictably and successfully, you learn to trust the numbers.

    Prioritize the Issues
    Once you have the data collected, and the analysis done, how do you make the leap to a creative direction? The secret is in the numbers. The basic strategy is that you determine the type of approach based on the read of the top 5 factors in the survey in order of importance. If the top three involve emotional issues, rather than the rational, or intellectual, then the creative approach leans toward a more emotional appeal.
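    As a simple illustration of that decision rule, the sketch below scores a handful of made-up survey factors and picks an approach; the factor names, scores and emotional/rational labels are all hypothetical.

```python
# Hypothetical illustration of the "top factors" rule described above.
# The factor names, scores and emotional/rational labels are all made up.

survey_factors = [
    # (factor, importance score from the survey, type of issue)
    ("responsiveness to requests",  8.7, "emotional"),
    ("feeling valued as a member",  8.2, "emotional"),
    ("quality of customer service", 7.9, "emotional"),
    ("price / dues level",          7.1, "rational"),
    ("breadth of product line",     6.4, "rational"),
]

top_five = sorted(survey_factors, key=lambda f: f[1], reverse=True)[:5]
emotional_in_top_three = sum(1 for _, _, kind in top_five[:3] if kind == "emotional")

if emotional_in_top_three >= 2:
    approach = "lean toward an emotional appeal"
else:
    approach = "lean toward a rational, numbers-driven appeal"

print(f"top factors: {[name for name, _, _ in top_five[:3]]}")
print(f"creative direction: {approach}")
```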

    For example, if the survey indicates that your organization is not producing results for customers in a particular area, perhaps customer service or responsiveness, those are largely emotional issues: no one likes to feel ignored or inadequately served, even though they are not functional or operational issues within the organization's core mission. The creative approach in that case might involve imagery and copy that plays upon the warm, service-oriented nature of the organization, a one-to-one approach that is more welcoming and almost apologetic. Of course, you can also pass the information on to the customer service department and make operational improvements there as well.

    If you uncover among your top five factors that numerically your satisfaction level among customers is 3 times higher than your ex-customer dissatisfaction ratio, there's a set of numbers to crow about, and you can take a more rational, numerical approach to the concept and the copy - show you're keeping customers happy and keeping them longer than ever before. The data still drives the point home, and works to provide you with a creative direction, a springboard toward a winning concept that resonates with the audience.

    Use A Metaphor
    One of the simplest ways to make the leap from data to concept is to use a metaphor that explains what the data reveals. If you're trying to illustrate that your company grew its customer base by 200% in the last quarter, or that your customer satisfaction rating improved threefold over the last year based on changes you've put in place, show images of outrageous growth - beanstalks, elephants, Cyclops giants - or images of size disparity - big bones with little dogs, big sandwiches with little kids, an Oreo cookie so large it won't fit in the glass of milk. The metaphor gives you a way to explain the concept the data revealed in a way the audience can relate to easily.

    Now, on to those meeting ads, or those membership recruitment ads. Let the data be your guide in these cases as well. If your data shows that 80% of your members don't go to your annual meeting because it's too expensive, takes too much time away from the office, and the same people go every year so it has turned into a good ole' boys club, it's time to break out the big guns. They are not finding the value in your meetings. Time to fight the perceptions with your own reality and show the members in your ad or brochure that there are benefits to spending the money, taking time away and meeting those good ole' boys face to face. Imagery in this case should be very rational, practical and businesslike, and copy should be extremely benefit-laden, addressing those concerns head on in a way the audience can relate to.

    In many cases, if you get one good lead, one good tip, or meet one solid useful connection at a meeting, you've made the trip a worthwhile endeavor. Now multiply that by the "possibilities" of the number of typical attendees (some latitude allowed here, no accountants in the wings), and show how the value multiplies with the number of participants - sort of a "you have to show up to win" type of approach.

    Destination "X"
    Ads focused on the destination are destined to fail for at least a portion of the audience, yet they persist and even proliferate in the member organization landscape. Everyone knows it's great to go to a meeting in "X" city, if you like that city, and if it has something inherently beneficial or relevant to the meeting's purpose. If not, you'll lose the folks who are farthest away and those that are the most cost conscious, almost automatically. No matter what city you pick, those two audiences are lost if the content isn't up to snuff. You can't have a meeting good enough to get them to go there. For those who are having trouble finding value in the content, the city is irrelevant. If the content is good and the results beneficial, you can have the meeting in a train station and people will attend.

    Use Testimonials
    For those organizations hunting for new members, there are many approaches where the data can give you some insights to follow. Testimonial approaches are a very strong framework from which to build value for prospective members. They humanize the organization, provide benefits the audience can relate to easily, and put a face to the issue of keeping members involved and active. Your research data sets showing the biggest challenges members or customers face are the key to crafting solid testimonials that answer these challenges.

    You can use the top 3-5 problem areas the data reveals and create a series of ads or brochure pages featuring members explaining how their involvement in the organization helped them solve the problem or meet the challenge. They would be highly credible, they would show the organization at work, and they would outline very relevant benefits that would resonate with the audience to a high degree - all driven by a few questions in your phone research survey.

    Use Everything Available
    There are many creative approaches buried within your primary research, and there are many sources of data that can be used to augment, support and reinforce your primary data and the subsequent analysis. Member application data, tradeshow or annual meeting attendee data, industry atlases or SIC code studies published by the U.S. Department of Labor, can all shed light on your target population. There are other kinds of research as well that will generate data, including focus groups, written or e-mail surveys, web surveys, live interviews at meetings or tradeshows, and live long-form personal interviews at a research facility equipped with one way mirrors and camera equipment.

    All these are viable forms of information gathering, and each has its place in providing data you can use to form a creative approach to your outreach marketing. The key is to believe the numbers and use them in conjunction with your internal organizational knowledge to drive an effective creative strategy.

    Video Marketing For Free Traffic

    Using video marketing to drive traffic to your web site is an online marketing strategy many small business owners and internet marketers are beginning to embrace, with much success. Having a traditional website only allows you to reach those people who first find your website. However, combining video, social networking and some simple video marketing techniques can drive hordes of qualified visitors to your website.

    First, let’s forget the silly videos you’ll find all over YouTube of kids running into fences and demonstrating the newest dance move. While that kind of video can bring in hundreds of thousands of views, it won’t bring the targeted traffic you need.

    Instead, consider making a video tightly targeted towards your niche. A real estate agent might make a video introducing herself and showcasing a few of her available homes for sale. A night club might make a video “commercial” with soundbites from partygoers. A software developer might make a video demonstrating his latest application.

    Because Google and other search engines are beginning to give videos hosted on sites like YouTube preferential search engine ranking, it’s quite possible your video could end up on the first page of search results for your targeted keyphrase. This is incredibly powerful and not to be overlooked, as this is what will make your video marketing efforts well worth the time you invest in them.

    Consider that YouTube itself may not have a huge market of people looking for videos on “Oakland real estate.” But if your video titled “Oakland Real Estate” made the first page of Google search results (again, due to the preferential treatment video is receiving in the search engines), you’d benefit from the hundreds of people who search for that term in Google seeing your video as the #1 result and, in turn, watching your clip.

    With the preferential treatment videos are receiving in search results, the question then becomes, “How do I move people from watching my video on YouTube to actually visiting my website?”

    This is simple: bribe them at the end of your marketing video. What follows are some ideas:

    * Offer them something for free at your site (a consultation, a report, free drink, demo version, MP3 download, etc.)
    * Poll them or ask them a question they need to visit your site to answer. People love to give their opinion. You can combine this with the free offer above, giving them the freebie upon completion of the poll or question. This is invaluable for market research.
    * Leave ‘em hanging. Don’t tell the whole story on your YouTube hosted video – instead, tell them just enough to incite curiosity. Then, instruct them to visit your site for the complete story or answer.

    Each of these ideas is intentionally broad and can be focused and molded to fit your target market.

    The key here is catching your audience while they’re hot, directly after they have watched your video, and giving them a reason to continue on to your website. The truth is that if you don’t, most will simply click through to another video or search result. Capitalize on their attention and tell them where to go and what to do next – you’ll be surprised at how many will comply!

    Even if you’re no Spielberg (I know I’m not), you can produce traffic-sucking videos with these simple methods.

    Linux phone goes on sale

    TWO VERSIONS of a truly open Linux mobile phone, the Neo, have gone on web sale in the US, made by FIC of Taiwan.

    The site warns heavily that these phones are for developers not the general public.

    It's based on an official standard for a Linux mobile called Openmoko, although FIC does appear to be driving this standard.

    There are basically two versions of the Neo – a Base model for $300 and an Advanced version for $450. Each model comes in either black/silver or white/orange.

    They're both very carefully described as 'developer preview' phones. That means that lots of bits haven't quite been integrated yet.

    Perhaps the most important missing feature is integrated GPRS data access. The site also says that you shouldn't expect a reliable means of making phone calls, either.

    Other fairly vital bits are missing too, including the ability to send or receive texts, proper Bluetooth integration and the ability to set network preferences.

    But – hey – this is the open source world and it shouldn't be long before people start to work out how to fix such things.

    Slightly more worrying, however, is the fact that integrated GPS (satellite) isn't mentioned although it was a major feature when the phone was first announced.

    So only the brave will be logging onto the official Openmoko site to buy one.

    Teensurance Tracks Teens on the Road

    The technology to track where a car is and how fast it is going is available, but parents of new teen drivers have not been quick to adopt it. One insurer may change that. Safeco is offering Teensurance, a GPS tracking and reporting system in the 44 states where it provides car insurance.

    With the GPS unit installed, parents and teens can set speed, distance, and time limitations and be notified via text message, email, and phone calls if and when any are crossed.
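    As a hypothetical sketch of the kind of limit-checking such a service might perform (the limit names, thresholds and alert logic below are illustrative assumptions, not Safeco's actual implementation):

```python
# Hypothetical sketch of the kind of limit-checking a GPS tracking service
# like this might perform. Not Safeco's actual implementation.
from datetime import datetime, time

limits = {
    "max_speed_mph": 65,
    "max_miles_from_home": 25,
    "curfew": (time(23, 0), time(5, 0)),  # no driving between 11pm and 5am
}

def check_reading(speed_mph, miles_from_home, timestamp):
    """Return alert messages for a single GPS reading."""
    alerts = []
    if speed_mph > limits["max_speed_mph"]:
        alerts.append(f"speed {speed_mph} mph exceeds the {limits['max_speed_mph']} mph limit")
    if miles_from_home > limits["max_miles_from_home"]:
        alerts.append(f"vehicle is {miles_from_home} miles from home "
                      f"(limit {limits['max_miles_from_home']})")
    curfew_start, curfew_end = limits["curfew"]
    t = timestamp.time()
    if t >= curfew_start or t < curfew_end:
        alerts.append("vehicle in use during curfew hours")
    return alerts

print(check_reading(72, 10, datetime(2007, 8, 1, 23, 30)))
# -> speed and curfew alerts; the real service relays such alerts to parents
#    by text message, email or phone call.
```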

    Surveys show "every 16 year old and 17 year old thinks they are a better driver than mom and dad, but they get easily distracted," says Jim Havens, Safeco's vice president of customer solutions. With cell phones in the hands of inexperienced drivers, there are even more ways for new drivers to become distracted. A survey of more than 1,000 16- and 17-year-old drivers by AAA finds that 61 percent of teens admit to risky driving habits; of them, 46 percent say they text message while driving and 61 percent say they talk on cell phones.

    With my oldest a few years away from driving, suddenly a tool like this makes some sense—if it's used as a way to help teens monitor and adjust their driving behavior as they take to the road. If parents use it to revoke driving privileges at the first sign of a surpassed speed limit, then it's just a Big Mother tool and not very useful.

    Of course, it is easy to disable. But parents will get a message telling them when Teensurance is offline, Havens said. He says the company has anecdotal evidence that the system is helping teens become more aware of their driving behavior and adjust it when needed. Still, there is not enough data yet to prove that its effect on teens warrants lower premiums for families who use the $14.99-a-month service. For the extra $15, families with teens also get the ability to unlock a door remotely if keys are locked in the car, and access to roadside assistance in case of an emergency.

    Congress to Examine Google-DoubleClick Deal

    Google executives are expected to be called to testify before House and Senate subcommittees about the company’s planned $3.1 billion acquisition of DoubleClick, a deal that is already facing close scrutiny from federal antitrust regulators.

    Within days of the deal’s announcement in April, companies including Microsoft, AT&T and some in the advertising industry, began to complain that the merger of Google and DoubleClick would limit competition in the online advertising market. Privacy groups, meanwhile, voiced concerns about the deal’s impact on consumer privacy. In May, the Federal Trade Commission began an investigation into the proposed merger.

    Now, a subcommittee of the Senate Judiciary Committee is planning to call a hearing to explore the antitrust and privacy issues raised not only by the Google deal but also by recent consolidation in the online advertising market, according to a person familiar with the planned hearing.

    Bobby L. Rush, the Illinois Congressman who is chairman of the House Energy and Commerce Committee subcommittee on consumer protection, said he had opened an investigation into the privacy and competition issues raised by the Google-DoubleClick deal and also planned to call a hearing.

    “There is widespread concern about the proposed merger between Google and DoubleClick that the Federal Trade Commission currently is reviewing,” Mr. Rush wrote in a letter to the commission, which is posted on his Web site. “I share these concerns and am writing to notify you that the subcommittee is considering holding a hearing when an appropriate date becomes available.”

    Without addressing the planned hearings directly, Google said in a statement that it believed that the deal would not harm competition and would withstand scrutiny.

    No date has been set for either the House or Senate hearings.

    The Google-DoubleClick deal precipitated a wave of consolidation in the online advertising industry, including Microsoft’s proposed acquisition of aQuantive, a DoubleClick rival, and Yahoo’s acquisition of Right Media, which runs an online advertising marketplace.

    But while those last two deals were quickly cleared by antitrust regulators, the Google-DoubleClick merger has drawn more intense scrutiny.

    Google, which dominates the business of placing text ads alongside search results and on sites across the Web, is expected to capture 27.4 percent of the $21.7 billion in United States online advertising in 2007, according to eMarketer, a research firm. The acquisition of DoubleClick would turn Google into a dominant player in the business of serving banners and other graphical ads that appear on Web sites.

    Xbox chief defects to games firm

    Peter Moore oversaw the launch of the Xbox 360
    Peter Moore, the head of Microsoft's gaming business, is leaving to join game maker Electronic Arts.

    For the past four years Mr Moore has been the public face of Microsoft's Xbox and PC gaming business, and oversaw the launch of the Xbox 360.

    He will join Electronic Arts as the head of its sports games division which makes some of its most popular titles.

    He will be replaced by Don Mattrick, a former EA senior executive who has worked as a consultant at Microsoft.

    The news about Mr Moore comes only weeks after Microsoft announced it would be spending $1.15bn to fix faulty Xbox 360 consoles.

    Microsoft said nothing should be read into the timing of Mr Moore's departure.

    On joining the game firm Mr Moore will receive a $1.5m golden handshake to offset future bonuses he was due from Microsoft.

    At the EA division he will oversee the development of popular game franchises such as Madden NFL football, NBA Live and Fifa Soccer. About one-third of EA's revenue comes from sales of sports-related games.

    Mr Moore, a Liverpudlian, joins EA shortly after a major re-organisation that saw it split into four divisions in a bid to become more competitive. In its last quarter, EA reported losses of $25m.

    Before joining Microsoft in 2003, Mr Moore was president of Sega America and prior to that head of marketing at Reebok International.

    He is scheduled to join EA Sports in September whilst Don Mattrick will be on Microsoft's fulltime payroll in August.

    How to Find A Cheap Digital Camera

    Advances in modern science ushered in a wave of new technology for the world to enjoy. In the old days, photographers used actual bulbs for camera flash, and only photographers carried cameras, because lugging them around wasn't exactly convenient. Continued development produced the digital camera, with which taking pictures isn't so cumbersome anymore. It is less expensive because you can see the images before printing them, so you can choose what to actually print. The images can also be uploaded to your computer for storage and further editing. These days, it is not unusual to carry a compact digital camera. It's perfect for capturing those random wacky moments with your friends.

    The only problem, it seems, is finding a cheap digital camera. Can you even find one? Because of the features they offer, digital cameras are often costly, which is enough to make anyone have second thoughts about buying one. But for someone who considers photography a passion, affordability is relative. You just have to set a budget before buying a camera. You can find a cheap digital camera that is just right for you if you look hard enough. Don't buy one that costs more than you can afford, even if it has a lot of features; make sure that you can actually use those features so that you get your money's worth. Consider your lifestyle and your objectives. Do you plan to spend a lot of time taking pictures, or do you just want something small that can fit in your bag? If you are still a beginner, don't buy a high-end professional camera just for the assurance of image quality and zoom performance. Instead, buy a cheap digital camera that is compact, has powerful features and is easy to carry around. Explore the basics before splashing out on expensive professional cameras.

    Though these are very important points to ponder when buying a cheap digital camera, you also have to consider the performance and features of your camera. Check the megapixels, zoom capability, image quality, type of media and battery. These are actually features that digital cameras highlight in advertisements.

    · Megapixels – They are not the be-all and end-all of digital cameras. Salespeople like to throw this number at you because it promises clearer images, but it is just one factor in your digital camera. You also have to check the quality of those megapixels: each photosite on most image sensors detects only one hue (red, green or blue), not all three at the same time, so sensor quality matters as much as the raw pixel count.

    · Zoom capability – You've seen advertisements touting 10x digital zoom or 5x zoom capability. While these claims are true, advertisers often neglect to highlight optical zoom, which is actually more important. The difference is that with digital zoom, your image breaks into visible pixels when you enlarge it on your computer; if your camera has high optical zoom, you will not see pixelated images when you enlarge them. (See the quick illustration after this list.)

    · Image quality – Check the quality of your image after you take a picture. Is it fuzzy or pixelated? Sharpness and accurate color are very important.

    · Type of media – This is the memory of your digital camera. Find a memory card or stick that is compatible with your other equipment so it is easier to upload your images.

    · Type of battery – See to it that your cheap digital camera doesn't require expensive batteries, or that it accepts rechargeable ones.
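    To make the digital-versus-optical zoom point above concrete, here is a small sketch with hypothetical numbers showing how digital zoom eats into effective resolution while optical zoom does not.

```python
# Illustration with hypothetical numbers of why optical zoom preserves
# detail while digital zoom does not.

sensor_megapixels = 12.0
digital_zoom = 4.0   # assumed 4x digital zoom

# Optical zoom magnifies the scene before it reaches the sensor, so every
# sensor pixel still records real detail.
optical_effective_mp = sensor_megapixels

# Digital zoom simply crops the centre of the frame and enlarges it, so the
# effective resolution falls by the square of the zoom factor.
digital_effective_mp = sensor_megapixels / digital_zoom ** 2

print(f"optical zoom : {optical_effective_mp:.1f} MP of real detail")
print(f"digital zoom : {digital_effective_mp:.2f} MP of real detail")  # 0.75 MP
```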

    These points should help you decide what kind of cheap digital camera to buy. As mentioned earlier, affordability is relative for the passionate photographer. Find your niche by choosing the best cheap digital camera for your needs.

    Microsoft Photo Technologies Aim Big

    Microsoft Corp. is working on a variety of innovative photo projects, ranging from experiments with its 3D maps offering to massive panoramic photos that users can zoom into for details.

    Developers who work in the company's research arm showed off the technologies on Tuesday during the Microsoft Research Faculty Summit in Redmond, Washington.

    HD View is one photo project that definitely has the "wow" factor.

    The technology allows users to combine hundreds of photos to create one massive picture that users can zoom in on to see clear details. In one example, a panoramic photo of the city of Seattle includes 800 images, each 8 megapixels in size, stitched together to create a 3.6-billion-pixel image.
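    The panorama numbers above imply a substantial overlap between neighbouring shots; the overlap estimate in the sketch below is an inference from the quoted figures, not something stated by Microsoft.

```python
# Quick check of the Seattle panorama figures quoted above: 800 photos at
# 8 megapixels each, stitched into a 3.6-billion-pixel image.

source_photos = 800
pixels_per_photo = 8e6
final_pixels = 3.6e9

raw_pixels = source_photos * pixels_per_photo     # 6.4 billion captured
overlap_fraction = 1 - final_pixels / raw_pixels  # share absorbed by overlap

print(f"raw captured pixels : {raw_pixels / 1e9:.1f} billion")
print(f"final image pixels  : {final_pixels / 1e9:.1f} billion")
print(f"overlap between shots absorbs roughly {overlap_fraction:.0%} of the raw pixels")
```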

    On a computer screen, it looks just like a panoramic photo. So, what's the point of combining so many photos? The massive file includes incredible detail.

    Michael Cohen, a researcher at Microsoft working on the project, zoomed in to the roof of a building where a clay owl peers around a corner. With the picture zoomed out, a viewer doesn't even see a pin prick in the spot where the owl sits.

    Another large photo of a mountain in Canada looks like a standard nature snapshot. But Cohen zoomed in to discover that climbers are scaling the rock wall. After finding the first climber, he followed the climbing ropes up to find the second one above him on the wall. When the photo is zoomed out, it's hard to imagine there might be climbers on the wall.

    Microsoft offers the tool to build HD View photos for free on its Web site. Creating an HD View panorama image, however, isn't for just anyone. Such images are quite large and may require special cameras.

    Another Microsoft project, unveiled last year and built in collaboration with the University of Washington, collects images of a site such as Rome's Trevi fountain from public photo-sharing Web pages such as Flickr. The Photo Tourism technology combines the photos into a 3D image so users can look at the object from any view. The idea was to take advantage of the potentially billions of images that are online, said Noah Snavely, a researcher at the University of Washington who works on the project with Microsoft researchers.

    Microsoft also demonstrated at the summit some experiments with Virtual Earth. Eyal Ofek, a Microsoft researcher, demonstrated a 3D map of San Francisco that is made up of 10 million images, including 50,000 aerial photographs as well as pictures taken at street level. All the photos are stitched together so a user can navigate from a bird's-eye view seamlessly down to street level. The view is different from the street view capability in Google Maps, which doesn't combine the street-level pictures with aerial shots.

    About 800 workers are developing projects at Microsoft Research. Some technologies they develop may become commercial Microsoft products, and others could be sold to other companies. The summit, which ended Tuesday, was an opportunity for Microsoft and its partners in academia to show off some of their projects.

    In Battle of Consoles, Nintendo Gains Allies

    In the competition among the makers of video game consoles, momentum is building for the Wii from Nintendo among its crucial allies: game developers and publishers.

    Inspired by the early success of the Wii, the companies that create and distribute games are beginning to shift resources and personnel toward building more Wii games, in some cases at the expense of the competing systems: the PlayStation 3 from Sony and Xbox 360 from Microsoft.

    The shift is closely watched because consumers tend to favor systems that have many compelling games. More resources diverted to the Wii would mean more games, and that would translate into more consumers buying Wii consoles later.

    Jon Goldman, chairman and chief executive of Foundation 9 Entertainment, an independent game development company, said that he was hearing a growing call for Wii games from the publishers and distributors that finance the games that his firm creates. “Publishers are saying: Instead of spending $15 million or $20 million on one PS3 game, come back to me with five or six Wii pitches,” he said.

    “We had one meeting two weeks ago with a publisher that was asking for Wii games,” said Mr. Goldman, who declined to identify the video game publisher that he met. “Three or four months ago, they didn’t want to hear Word 1 about the Wii.”

    Nintendo said that titles would be coming from several major developers, like Activision and Ubisoft, that are making an enhanced commitment to the platform.

    The interest in the Wii follows a period of uncertainty about the console by developers and publishers. They were initially cautious because the Wii was less technologically sophisticated, and they worried that consumers would not take to its unorthodox game play, which uses a motion-controlled wand that players move to direct action on the screen. For example, to serve balls in the tennis game, players circle their arms overhead as they would in real tennis.

    History gave developers and publishers reason for caution, too. Nintendo’s last system, the GameCube, was initially a hot seller, but was ultimately outsold — and by a considerable margin — by the PlayStation 2 and Xbox. Also, Nintendo has historically made many of the popular games for its own systems, in a way that has discouraged heavy participation by other developers and publishers.

    The shift does not represent any shunning of the Xbox or Sony consoles, but rather an elevation of the Wii’s status — one that was clear in many conversations with developers and publishers at E3, the video game industry’s annual trade show in Santa Monica, Calif.

    It is early in the current console product cycle, given that these machines are intended to be on the market for more than five years. Industry analysts say they do not expect to declare a victor anytime soon. Nevertheless, the trend is clear: Nintendo is getting growing support from game developers.

    “We’re seeing a big shift at E3,” said John Davison, editorial director of 1UP Network, a network of video game Web sites and magazines, “and we’ll see more later this year.” He said he was seeing some game publishers putting less emphasis on the PlayStation 3. “But they’re not going to talk about that,” he added.

    Since its first appearance in stores in November, the Wii has been outselling the Xbox 360 and PS3, which came out the same month, and it continues to be in short supply. The NPD Group, a market research firm, reported that as of May, Americans had purchased 2.8 million Wii systems, compared with 1.4 million PS3s. About 5.6 million Xbox 360 consoles have been sold, but that console hit the market a year earlier.

    The Wii has clearly benefited from a price advantage; it costs $250, compared with $300 for the least-expensive Xbox 360 and $479 for the top-of-the-line machine. The PS3 sells for $500, after a price cut by Sony to clear inventory in advance of the Christmas selling season, when its new $600 device will be offered. Microsoft has been hampered of late by widespread product failures, and the company said it would spend $1.15 billion to repair individual machines.

    While the growing size of the Wii’s customer base is attractive, developers are favoring Wii for other reasons. They are able to create games in less time than is needed for rival systems, because Wii’s graphics are less complex.

    Colin Sebastian, a video game industry analyst with Lazard Capital Markets, said that in rough terms, it cost around $5 million to develop a game for the Wii compared with $10 million to $20 million to make a game for the Xbox 360 or PS3. Mr. Sebastian said that given the cost differences, a developer would need to sell 300,000 copies of a Wii game to break even, compared with 600,000 of a game for the PS3 or Xbox 360.
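    Those break-even figures imply roughly the same contribution per copy sold on each platform; the per-copy number below is an inference from the analyst's figures, not something stated in the article.

```python
# Reproduce the break-even comparison cited by the analyst above. The
# contribution per copy is an inference from the quoted figures, not a
# number stated in the article.

cases = {
    "Wii":                       (5e6,  300_000),  # dev cost, break-even copies
    "Xbox 360 / PS3 (low end)":  (10e6, 600_000),
    "Xbox 360 / PS3 (high end)": (20e6, 600_000),
}

for platform, (cost, copies) in cases.items():
    print(f"{platform:26s}: ~${cost / copies:.2f} of contribution per copy to break even")
# The Wii and the low-end Xbox/PS3 figures both work out to about $16.67 per
# copy, which is why halving the budget roughly halves the break-even volume.
```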

    “Wii development costs certainly are cheaper than the other consoles,” said Scott A. Steinberg, a vice president for marketing at the game developer Sega of America. The company has a number of original Wii projects under development and uses 15 to 25 programmers to develop a Wii title, compared with 50 or more for a PS3 or Xbox 360 game.

    Because of its simpler graphics, development times for Wii games are also shorter. A Wii game can be created in as little as 12 months, said Kelly Flock, executive vice president for worldwide publishing at THQ, a video game developer based in Agoura Hills, Calif. Games for the two competing consoles typically take two to three years.

    He said that the budget for a Wii game ranges from $1.5 million to $4 million, compared with the $10 million to $12 million the company spends on a PS3 or Xbox 360 game.

    “The Wii is a godsend,” Mr. Flock said. “We are aggressively looking for more Wii titles.”

    By this holiday season, Nintendo will have added 100 games to its existing 60 titles. Sony has said that it will double the number of titles for the PS3 to 120 by the end of March, while Microsoft said it would have 300 titles for the Xbox 360 by the Christmas selling season. “I don’t think you’ll see any big shifts to one platform because you’re supporting so many,” said Kathy Vrabeck, president of the casual entertainment division of Electronic Arts. That said, she added that there had been a clear shift in mood at the company toward the Wii.

    “There is a clear sense of excitement about the Wii at E.A.,” she said.

    George Harrison, Nintendo’s senior vice president for marketing, said, “Electronic Arts is doing much more for us than they have in the past.”

    Sony counters that, to some extent, Wii developers, publishers and game players will get what they pay for: games with less-complex graphics.

    “There is some truth to the fact that you can make games for Wii for less than the PS3,” said Peter Dille, senior vice president for marketing at Sony. “But we still believe that our job is to develop big-budget games.”

    5 Ways To Improve Your Adsense Earnings

    If webmasters want to monetize their websites, one of the best ways to do it is through Adsense. Plenty of webmasters struggle to earn even a little money a day from their sites, while a few “geniuses” quietly enjoy hundreds of dollars a day from the Adsense ads on theirs. What sets these webmasters apart is that they think outside the box.
    The ones who have been there and done it have some useful tips for anyone who wants to venture into this field, tips that have boosted earnings in the past and continue to do so.

    Here are 5 proven ways to improve your Adsense earnings.

    1. Concentrate on one Adsense ad format. The format that has worked well for the majority is the Large Rectangle (336x280), which tends to produce a higher CTR, or click-through rate. Why choose this format out of the many available? Because the ads look like normal web links, and people are used to clicking on links like that. Visitors may or may not realize they are clicking on your Adsense, but as long as there are clicks, it works to your advantage.

    2. Create a custom palette for your ads. Choose colors that go well with the background of your site. If your site has a white background, use white for the ad border and background as well. The idea behind matching the colors is to make the Adsense blocks look like part of the web page. Again, this will result in more clicks from the people visiting your site.

    3. Move your Adsense from the bottom of your pages to the top. Do not try to hide your Adsense; put it where people will see it right away. You will be amazed at the difference ad placement makes when you see your earnings.

    4. Maintain links on relevant websites. If you think certain sites perform better than others, place your ads there and keep maintaining and managing them. If a site already carries a lot of Adsense, put yours above the rest, so visitors see your ads first when they browse that site.

    5. Automate the insertion of your Adsense code into your webpages using SSI (server side includes). Ask your web administrator whether your server supports SSI. How do you do it? Save your Adsense code in a text file such as “adsense.txt”, upload it to the root directory of the web server, and then use an SSI directive to pull the code into other pages (a small sketch of the idea follows this list). This tip is a real time saver, especially for those who use automatic page generators to build their sites.
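
    For the curious, the SSI directive itself is a single line in the page’s HTML, along the lines of <!--#include virtual="/adsense.txt" --> (the file name is only an example). If your pages come from a page generator or script rather than a server with SSI enabled, the same idea can be sketched in a few lines of Python; treat this as an illustration of the technique, not a finished tool, and note that the file names and placeholder marker here are invented:

        # Illustrative sketch only: inject a saved ad snippet into generated pages,
        # mimicking what an SSI include does on the server. File names and the
        # placeholder marker are made up for this example.
        from pathlib import Path

        AD_SNIPPET = Path("adsense.txt").read_text()   # the saved Adsense code
        PLACEHOLDER = "<!-- ADSENSE -->"                # marker used in page templates

        def render_page(template_path: str, output_path: str) -> None:
            """Replace the placeholder with the ad code and write the finished page."""
            html = Path(template_path).read_text()
            Path(output_path).write_text(html.replace(PLACEHOLDER, AD_SNIPPET))

        # Example use: render_page("templates/article.html", "site/article.html")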

    These are some of the tips that have worked well for people who want to generate hundreds or even thousands of dollars from their websites. Keep in mind, though, that ads are displayed because they match the interests of the people viewing them. Focusing on a specific topic should therefore be your primary goal, because the ads shown will be targeted to the topic your visitors are already reading about.

    Note also that many other Adsense sites share your topic. It is best to make your pages somewhat different from, and more distinctive than, the ones already out there. Every click-through a visitor makes is a point for you, so make every click count by making your Adsense something people will actually want to click on.

    Microsoft Research Explores Location Technologies

    Microsoft Corp. researchers are working on a variety of location-based tools, some of which could turn into interesting commercial applications.

    In one project, the researchers lent out cheap GPS (Global Positioning System) devices to drivers and asked them to leave the devices on the dashboards of their cars for a couple of weeks, said John Krumm, a researcher at Microsoft Research. He discussed the results of his work at the Microsoft Research Faculty Summit in Redmond, Washington, on Monday.

    Krumm's group examined the data they collected from the GPS units for a number of different factors, including what time of day people were most often in their cars and where they most commonly were going at what times, such as to commercial or residential areas.

    That data was perhaps most relevant to the group's efforts to create a model to predict where and when users would stop and get out of their cars. Krumm imagined a number of reasons why that information might be useful. For example, the provider of a navigation system might be able to predict that because a user is near the airport, the user is likely to go there, and so offer the user a coupon for airport parking.
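
    The article does not describe the model itself, so the following is only a toy illustration of the general idea: given a log of past trips, the most frequent destination from a given area at a given hour is a reasonable first guess.

        # Toy destination predictor, assuming a list of past trips of the form
        # (start_zone, hour_of_day, destination). This is not Microsoft's model,
        # just a frequency-based sketch of the kind of prediction described above.
        from collections import Counter, defaultdict

        def build_model(trips):
            counts = defaultdict(Counter)
            for start_zone, hour, destination in trips:
                counts[(start_zone, hour)][destination] += 1
            return counts

        def predict(model, start_zone, hour):
            options = model.get((start_zone, hour))
            return options.most_common(1)[0][0] if options else None

        trips = [("downtown", 8, "office"), ("downtown", 8, "office"), ("downtown", 17, "airport")]
        model = build_model(trips)
        print(predict(model, "downtown", 8))   # -> "office"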

    An intern on Krumm's team is working on determining whether hybrid cars could use such predictive modeling, forecasting the length of a trip as well as the hills and vehicle speeds along the way, in order to allocate the car's resources efficiently.

    Scott Counts, another researcher, is working on a community application, called SlamXR, that would let fitness enthusiasts share exercise routes. Users carry a small device that includes a range of sensors such as a heart rate monitor, temperature sensor, altimeter, GPS receiver and Bluetooth radio. They can collect data along a favorite bike route, for example, and upload that data to the Web site. The site shows the route on a map and includes data such as speed and altitude along the route. Other users can search for routes based on difficulty, distance, target heart rate, elevation change and activity, and can tag routes for easier searching.

    Technologies developed at Microsoft Research could become commercial products from Microsoft, or the company may sell them to external sources. More than 700 researchers work in the group in five labs around the world.

    The event in Redmond is an opportunity for Microsoft Research workers to spend time with members of academia, in part to discuss issues in computer science research.

    Google cookies will 'auto delete'

    Google has said that its cookies, tiny files stored on a computer when a user visits a website, will auto delete after two years.

    They will be deleted unless the user returns to a Google site within the two-year period, prompting a re-setting of the file's lifespan.

    The company's cookies are used to store preference data for sites, such as default language and to track searches.

    All search engines and most websites store cookies on a computer.

    Currently, Google's cookies are set to expire in 2039.

    Peter Fleischer, Google's global privacy counsel, said in a statement: "After listening to feedback from our users and from privacy advocates, we've concluded that it would be a good thing for privacy to significantly shorten the lifetime of our cookies."

    He said the company had to "find a way to do so without artificially forcing users to re-enter their basic preferences at arbitrary points in time."

    So if a user visits a Google website, a cookie will be stored on their computer and will auto-delete after two years. But each time the user returns to a Google service, the cookie will be re-set for a further two years.
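
    Mechanically, a "two years from your last visit" policy just means re-issuing the cookie with a fresh expiry on every response. Here is a minimal sketch using Python's standard http.cookies module; the cookie name and value are invented for illustration and are not Google's actual cookie:

        # Sketch of a cookie that expires two years after the most recent visit:
        # the server simply re-sends it with a new Max-Age on every response.
        from http.cookies import SimpleCookie

        TWO_YEARS = 2 * 365 * 24 * 60 * 60   # lifetime in seconds

        def refreshed_cookie_header(value: str) -> str:
            cookie = SimpleCookie()
            cookie["prefs"] = value                 # hypothetical preference cookie
            cookie["prefs"]["max-age"] = TWO_YEARS  # reset to two years on each visit
            cookie["prefs"]["path"] = "/"
            return cookie.output(header="Set-Cookie:")

        print(refreshed_cookie_header("en"))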

    Privacy campaigners

    Privacy campaigners want to give users more control over what the search giant holds on to and for how long.

    Google has pointed out that all users can delete all or some cookies from their web browser manually at any time and control which cookies from which websites are stored on a computer.

    There are also tools online which can prevent the company and other firms leaving cookies on a computer.

    In recent months, Google has introduced several steps to reassure users about its use of personal information.

    In March the search giant said it would anonymise personal data it receives from users' web searches after 18 months.

    The firm previously held information about searches for an indefinite period but will now anonymise it after 18 to 24 months.

    None of the other leading search engines have made any statements over anonymising IP addresses or shortening cookie lifespan.

    Google Engineer Reveals New Tag & Best Strategies for Getting Indexed

    We love it when Google engineers spill the beans... on just about anything Google, especially when it comes to revealing juicy secrets about improving your ranking position in Google. Or, better yet, ways they recommend for getting your site indexed that are not your typical suggestions.

    A recent thread on SEW discusses a post on High Rankings that covers what Dan Crow, Google's Director of Crawl Systems, has to say about getting your site lovingly indexed by the most popular search engine.

    Some juicy tidbits from Dan:

    New "unavailable_after" Tags - This little gem will allow webmasters the ability to tell Google when to stop indexing a page at a certain time. For example this might be useful for people with ecommerce sites with lots of coupons that have expired, older news items, and just about anything that is temporary and not permanent in nature on a site. While I really like this planning type tag, I don't see how much better it would be than just disallowing the page, or using a META robots tag to tell Google. Or better yet, using Webmaster Central URL removal feature. I see this tag probably having wide usage on news sites where there is a large number of pages that would need to expire at certain times.

    Nosnippet & Noarchive tags - He details that these tags are not generally recommended, because he says "snippets are extremely helpful to visitors, as is showing the cache". Essentially these tags eliminate some problems associated with Google caching and improperly displaying the snippets below the titles in the search results. Google is fine with their use but would rather you not. This we know but its good to hear it again.

    Avoid Walled Gardens - Dan used this term from the HR article and I thought it a nice way to explain how a group of pages are linked only to each other and not to anywhere else. He said you could put one of the links from that group in a sitemap and Google would index it and follow the other links. I think pointing an external link to those pages would be a much better idea. He says "those pages would be likely to be indexed via the sitemap...but considered low quality since they wouldn’t have any PageRank. Google is working on a way to change this in the future." Interesting.
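
    Returning to the unavailable_after tag mentioned above: as described, it is a meta tag aimed at Googlebot whose content gives the date after which the page should drop out of results. The exact date format Google expects should be checked against its own documentation; the sketch below assumes a day-month-year format and simply builds the tag for a page with a known expiry, such as a coupon page:

        # Sketch: emit an unavailable_after meta tag for a page with a known expiry.
        # The tag name follows Google's announcement; the date format here is an
        # assumption (check Google's documentation for the exact expected format).
        from datetime import datetime

        def unavailable_after_tag(expiry: datetime, tz_label: str = "GMT") -> str:
            date_str = expiry.strftime("%d-%b-%Y %H:%M:%S") + " " + tz_label
            return '<meta name="googlebot" content="unavailable_after: {}">'.format(date_str)

        print(unavailable_after_tag(datetime(2007, 12, 31, 23, 59, 0)))
        # -> <meta name="googlebot" content="unavailable_after: 31-Dec-2007 23:59:00 GMT">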

    How to Remove Spyware From Your PC

    These days it may seem as though the short list of unavoidable perils ought to be expanded to include death, taxes, and spyware. But if you ever do get infected with some nasty piece of malware, all you need to get rid of it are the right free tools, some time, and a little know-how.
    A couple of warnings first: Removing spyware is as much art as it is science. The rogues who create spyware make removing their malicious programs as difficult as they can. In addition, some types of spyware download and install additional components, often hiding pieces of code from Windows to make removal even harder. The instructions below will wipe out most forms of spyware, but your machine's infestation may resist these measures. If so, you may have to consult a professional PC repair person. Or you can start afresh by reformatting your hard drive and then reloading Windows, your apps, and your data files (browse to our article "Windows Rejuvenated" for instructions).

    Note too that if you perform certain removal steps improperly, your PC could become inoperable. Our instructions call out these danger spots, but if you don't feel confident about performing them, ask for help from a knowledgeable friend or from the experts on a spyware-removal Web forum such as TomCoyote, Geeks to Go, or SpywareInfo.
    Make Sure It's an Infection
    How do you know whether your PC has an active spyware infestation? Slower-than-normal performance is the most common symptom people report, but such behavior can also be due to any number of factors unrelated to spyware, such as running too many applications with too little system memory, having a full or very fragmented hard drive, or running buggy software that fails to free up the memory it uses after you close the application. Your first task is to determine whether you have a spyware-related problem or just a slow machine.
    Download the latest versions of these tools:

    • Microsoft's Malicious Software Removal Tool, which targets a small number of the most widespread and serious malware families.

    • Microsoft's Windows Defender. Windows Vista has Defender built in, but if you suspect that you have spyware on your PC, update the program so it can find the newest bad stuff.

    • A free antivirus scanner such as Avira's AntiVir, which the scanning steps below refer to.

    Since some spyware applications prevent you from downloading these tools, or from visiting the Web sites that host them, download the programs to another PC that you know is free of spyware. Then copy the installers to a portable USB drive, and plug that drive into the machine you suspect is infected.
    Start by running the Malicious Software Removal Tool. This program is designed to search for and destroy only a small fraction of malware, but the ones it finds are the most serious strains of spyware and virus you can get.
    If that program doesn't find anything, run the installer for Windows Defender (if it isn't already installed on your PC) and make sure that the program downloads its updates. Then click the downward-pointing arrow to the right of the word 'Scan' at the top of the Defender window and choose Full Scan. If Defender finds malware, follow the on-screen instructions to delete the harmful files. This may require one or more reboots, because some spyware won't let you uninstall it while Windows is running.
    If Defender fails to find anything, or if it finds spyware that it can't delete, it's time for a full antivirus scan. If you're using an antivirus program that is already loaded on your system, make sure that it's updated. If you're using AntiVir, run the installer, and then reboot. When AntiVir is running, you'll see an icon in your system tray showing an open umbrella inside a red square. Right-click the icon and choose Start AntiVir. Click the Start Update link in AntiVir's program window, and when the update is complete, click the Scanner tab, choose the Local Drives option in the lower pane, and start the scan of your hard drive. If it finds anything, AntiVir will pop up a dialog box. Select either Quarantine or Delete to remove the suspect files that it identifies.

    Manual Analysis
    One of these three programs should detect and remove any spyware on your PC. In the unlikely event that you have picked up a brand-new specimen that isn't yet included in the antispyware databases, you'll have to do some cyber-investigating to find and eject the interloper.
    First, examine every process running on your machine to determine whether any of them is a piece of spyware. Windows' Task Manager isn't up to this job because many spyware apps specifically hide themselves from it. Fortunately, they are less skillful at hiding from the many Task Manager alternatives. Two of my favorites are Process Explorer (which is free) and Security Task Manager (which comes in free and paid versions). Currently, only Process Explorer, which is now owned by Microsoft, is compatible with Windows Vista. A Vista-compatible version of Security Task Manager is coming, according to its producer, A&M Neuber Software. Either of these programs will show you everything that's running on your PC, and will help you determine whether a particular application should be there.
    Warning: Stopping system processes and applications in this manner is risky. In some cases, if you kill the wrong program, Windows will shut down and reboot as a safety measure. While you probably won't render your system unworkable, you should back up all important documents and set a System Restore point (click Start, All Programs, Accessories, System Tools, System Restore, and follow the on-screen instructions).
    Start one of the alternative Task Managers mentioned above, and closely examine the list of running applications on your PC. You're looking for something that's either out of place or behaving oddly. If you're using Process Explorer, unzip the archive you downloaded and double-click the ProcExp.exe program. Click OK after you read the initial dialog, and you'll be presented with a color-coded list of everything that's running: Programs highlighted in pink are Windows services; those in gray-blue are applications. Right-click the bar with the column names (it's just above the list of programs), and choose Select Columns. Check the Command Line box and click OK. A new column will appear, showing you the full path to each running app.
    If you're using Security Task Manager, double-click the installer and step through the dialog boxes to complete the installation. The first time you run the program, it will take a moment to scan your PC. Unlike Process Explorer, Security Task Manager doesn't list Windows' own system processes (other than Explorer.exe) on this initial page. If you want to see those, click the Windows Processes button on the toolbar. The higher the utility's rating for a program, the more suspect it is. As you click the entries, the program tells you why it rated the selected application as it did. However, many legitimate programs engage in activities that Security Task Manager views suspiciously, so don't just assume that anything with a rating above 50 is dangerous; instead, use the rating as an indicator of what to look at first.
    Here's where it gets tedious: If you don't know what a particular program is, what it does, or where it's supposed to live on your hard drive, you'll have to do some research. Check out the list of processes that are known to be either benign or malevolent at Uniblue Systems' WinTasks Process Library. Alternatively, you can enter the filename in a search engine and look through the results for a description of the process. Some legitimate processes get a bad rap as spyware, so it's important to corroborate any negative reports you discover.
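
    If you would rather work from a scriptable list than click through a GUI, the third-party psutil package for Python can print each running process with its executable path so you can research the unfamiliar ones. This is only a convenience for the lookup step described above; it does not judge whether anything is malicious:

        # Minimal sketch: list running processes with their executable paths so the
        # unfamiliar ones can be looked up. Requires the third-party psutil package
        # (pip install psutil); it identifies nothing as malicious by itself.
        import psutil

        for proc in psutil.process_iter(attrs=["pid", "name", "exe"]):
            info = proc.info
            pid, name, exe = info["pid"], info["name"], info["exe"]
            print(f"{pid:>6}  {name:<30}  {exe or '(path unavailable)'}")
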
    Remove the Reprobates
    If the program you want to remove from your PC doesn't have an entry in Windows' Add/Remove Programs applet in Control Panel, it has probably changed your Registry to make itself difficult to find and eradicate.
    Enter HijackThis, a free program designed to remove Registry entries and other settings that spyware uses to take over your PC. Rather than removing the programs, HijackThis deletes the Registry entries that prevent you from deleting the software yourself. To familiarize yourself with how HijackThis works, read the Quick Start guide, but beware: HijackThis, if misused, can render your system unbootable. Be sure to proceed deliberately, and keep those essential backups close by.
    It's a good idea to consult experts before making any changes with HijackThis. To do so, run the program by double-clicking HijackThis.exe, and then click Do a system scan and save a logfile. HijackThis will make a record of everything it finds and--in a few seconds--will create a text-file report that you can post online or send to your expert. Volunteers who use the message boards at TomCoyote, Geeks to Go, and SpywareInfo will help you sort through the log if you post it to the Malware Removal message board on any of those sites.
    If you want HijackThis to dislodge a program, fill in the check box next to it and click Fix Checked at the bottom of the program window to delete the appropriate Registry entries. Then manually delete the related file. Reboot your PC into Safe Mode (press F8 at the beginning of the reboot cycle, before the Windows logo appears), navigate to the unwanted file on your hard drive, right-click it, and select Delete. Easy as pie.
    Rid Yourself of Rootkits
    The nastiest spyware specimens--the worst of the worst--are rootkits. These programs hide themselves from Windows, from antispyware tools, and from utilities such as Process Explorer and Security Task Manager. If you suspect that a rootkit has invaded your PC, you still may triumph. A free utility called IceSword can find and remove many kinds of rootkits. The only downside (for all but about 1 billion of us)? The tool's instructions are in Chinese.
    Fortunately, some smart people have created an illustrated guide in English for using IceSword. If you're considering using the program, read this guide carefully before you begin. As with HijackThis, a wrong move can cause serious problems.

    Intel and $100 laptop join forces

    Chip-maker Intel has joined forces with the makers of the $100 laptop.

    The agreement marks a huge turnaround for both the not-for-profit One Laptop per Child (OLPC) foundation and Intel.

    In May this year, Nicholas Negroponte, the founder of OLPC, said the silicon giant "should be ashamed of itself" for efforts to undermine his initiative.

    He accused Intel of selling its own cut-price laptop - the Classmate PC - below cost to drive him out of markets in the developing world.

    "What happened in the past has happened," Will Swope of Intel told the BBC News website. "But going forward, this allows the two organisations to go do a better job and have a better impact for what we are both very eager to do, which is help kids around the world."

    Nicholas Negroponte, founder of One Laptop per Child, said: "Intel joins the OLPC board as a world leader in technology, helping reach the world's children. Collaboration with Intel means that the maximum number of laptops will reach children."

    Intel inside

    The new agreement means that Intel will sit alongside the 11 companies, including Google and Red Hat, which are partners in the OLPC scheme.

    It will also join rival chip-maker AMD, which supplies the processor at the heart of the $100 laptop.

    "Intel's apparent change of heart is welcome, and we're sure they can make a positive contribution to this very worthy project for the benefit of children all over the world," read a statement from AMD.

    Initially there are no plans to switch the processor to one designed by Intel. However, the servers used to back up the XO laptops, as the machines are known, will have Intel technology at their core.

    Decisions about the hardware inside the XO laptop would be made by OLPC, said Mr Swope.

    "OLPC will decide about which products they choose to offer or not offer," he said.

    OLPC, however, indicated that it would consider using Intel chips in its machines in the future.

    Walter Bender, head of software development at OLPC, told the BBC News website that he believed OLPC would eventually offer different computers with different hardware.

    "I think we will end up with a family of products that run across a wide variety of needs," he said. "Intel will be part of that mix."

    Price test

    In addition, the partnership will have a practical pay off for software developers.

    "Any software you build is going to run at least on our two platforms," said Mr Swope.

    An application developed for the XO laptop should work on the Classmate and vice versa.

    "That's the exciting thing for me," said Mr Bender.

    Currently both laptops are being tested in schools around the world. In parallel, OLPC is finalising orders for the first batch of computers.

    Participating countries are able to purchase the XO in lots of 250,000. They will initially cost $176 (£90) but the eventual aim is to sell the machine to governments of developing countries for $100 (£50).

    Intel says it already has orders for "thousands" of Classmates, which currently cost over $200 (£100).

    As with the OLPC machine, Intel expects the Classmate's price to fall over time.

    Google Wants Testers for Mobile AdSense

    Google Inc. has begun inviting mobile Web site developers to display Google ads on their sites as part of a limited beta test.

    The offer extends Google's AdSense program, which lets Web site developers earn revenue by placing advertisements on their sites, to the mobile environment. Google runs the back-end network that places ads relevant to each site's content. Site owners earn revenue when visitors click on the ads.

    Sites must be written in one of three mobile markup languages, WML (Wireless Markup Language), XHTML (Extensible Hypertext Markup Language) or CHTML (Compact HTML), in order to use AdSense for Mobile, according to a Google AdSense for Mobile help page. That's because Google's crawlers must be able to read the page in those languages to determine page content and serve up relevant ads.

    A blogger at Self Made Minds said that he received an e-mail invitation from Google on Thursday night to test AdSense for Mobile.

    In a statement, Google confirmed that it is conducting a limited beta to test AdSense for mobile. The company plans to evaluate the beta and refine the product based on feedback from users, it said.

    Late last year, Google began delivering advertisements along with its mobile search results.

    Online giants like Google are increasingly interested in the potential revenue stream from mobile users. "Mobile advertising is a huge opportunity for us starting with the basic premise that there are something like 3 billion or so handsets in the world," said Dilip Venkatachari, director of product management responsible for mobile monetization efforts at Google, in a recent interview. That compares with only around 1 billion PC users on the planet, making mobile the larger potential market.

    No price cut for Euro PS3 console

    Sony is not cutting the price of the PlayStation 3 in Europe, but will offer free games and accessories instead.

    There is also no word of plans for an 80GB version of the console in Europe.

    Last week Sony said it would drop the US price of the 60GB PS3 by $100 (£50), and introduced a new 80GB version at the original price of $599 (£300).

    In Europe gamers will get a "starter pack" at an unchanged price of £425 for the 60GB machine, with two games and two controllers included.

    The new pack was announced at the E3 games conference in Santa Monica, in California. Sony says it is offering gamers £115 worth of added games and controller, for no extra charge.

    But the firm could face criticism from gamers for deciding not to reduce the price of the 60GB PS3 and not announcing the 80GB machine.

    Earlier this week, Jack Tretton, chief executive of Sony Computer Entertainment America, said he thought the US price cut would at least "double" the sales of PS3 in the country.

    In the UK, the PlayStation 3 remains £125 more expensive than the equivalent Xbox 360 bundle of console and games and £225 more expensive than the Nintendo Wii with two games.

    Griffin debuts Streamline armband for iPod

    Griffin Technology today announced yet another iPod case in its Case Collection: Streamline for iPod is described as "the ultimate sport armband for iPod" and earns its name from its low profile and clean lines that "look great wherever and however you wear it. Elegantly minimal design keeps your iPod handy and protected, without adding bulk. Streamline manages its sleek look without sacrificing functionality." It includes a clear, full-face screen protector that safeguards the face of the iPod from scratches and smudges without restricting visual access, and reflective trim that makes the armband more visible to traffic in low-light environments. The washable, two-way adjustable band fits any arm with a "breathable, comfortable" fit for either the iPod or iPod nano. Streamline for full-size iPod and iPod nano joins the previously announced Streamline for iPhone. The company said a choice of colors is planned for a follow-up release, while the armband is immediately available in basic black for $30.

    Sony, Microsoft see games key to console race

    After years of arguing over whose video game machine has the best bells and whistles, Sony Corp. and Microsoft Corp. agree the battle boils down to which one has the best games.

    "The (game) lineup is critical this holiday because it's the first time all three consoles will be in unconstrained supply at retail," Peter Moore, Microsoft's vice president of interactive entertainment, said in an interview at the E3 video game show, the industry's most important event, which officially kicked off on Wednesday.

    About a year and a half after Microsoft introduced its Xbox 360 and eight months after Sony and Nintendo introduced their next-generation consoles, the PlayStation 3 and Wii, developers are only just starting to regularly roll out the franchise titles that can make a game system a must-have. Some games are exclusive to one machine, while others are cross platform.

    Sony cut the price tag for its PS3 by 17 per cent on Monday, leaving it comparably priced to the Xbox 360. Many expected Microsoft to follow suit with a price cut on the 360, but the company on Tuesday held firm on the console's pricing for now.

    The new PS3 price tag of $500 (U.S.) is just $20 more than the priciest Xbox, which has a 120-gigabyte hard drive. When Sony announced a year ago the PS3's $600 price tag, gamers howled.

    The cost and lack of must-have games put many consumers off the PS3 despite Sony's insistence that the price was fair, considering it has a supercomputer processor, a 60-gigabyte hard drive and a Blu-ray high-definition DVD player.

    With one of the fastest price cuts in video game history, Sony admitted its bet on the extra features had not paid off.

    Still, great games could make it a formidable antagonist for Microsoft.

    "If all we had was a price move, then we should have held that until our E3 press conference," Jack Tretton, head of Sony Computer Entertainment America, told Reuters last week. "But really, we've gotten that out front, and while we think it's substantial, we think the real news is the games."

    Yet while this holiday season will be the first with all three consoles fully rolled out, some supply problems remain. A senior Nintendo executive said on Wednesday that Wii consoles, as victims of their own success, could still be scarce at the end of the year.

    SONY TRAILS

    Although Sony was the dominant player in the previous generation console battle, it has trailed Microsoft coming to market and in sales this time round.

    Microsoft said it had shipped 11.6 million consoles worldwide by the end of June, missing its target of 12 million.

    In the United States, Microsoft has sold about 5.6 million consoles, compared with 2.8 million for Nintendo Co. Ltd.'s Wii, and 1.4 million for the PS3, according to data from NPD, a market data firm.

    Microsoft's highly anticipated games — "Halo 3," "Grand Theft Auto IV" from Take-Two Interactive Software Inc. and "Madden 08" football from Electronic Arts Inc. — are all due out this year for the 360.

    Games cost as much as $20 million to develop, and many publishers wait for machine sales to ramp up in order to ensure they have a critical mass of buyers.

    All the positioning over price, however, may be moot until the consoles fall to $200 or so. That's the magic point at which past console sales have exploded beyond the hard-core gamers into the mass market.

    "Where these boxes get interesting is, 80 per cent of all consoles sold are $199 or cheaper. Consumers aren't going get interested until they get to $199," Wedbush Morgan analyst Michael Pachter told Reuters last week.

    Microsoft's Moore told Reuters on Wednesday that this generation of consoles could have the longest lifecycle ever as price cuts bring in more and more customers.

    "The indications we're seeing right now with the strength of sales of the PS2 shows the classic long tail of consumers coming in at the right price point," Moore said. "Just to throw a price out there — $149 — many years from now seems like a great price point to sell millions of units to consumers coming in."