Monday, 21 November 2011

Divas Photography

Friday, 2 July 2010

Rainbows End

This is a free sci-fi ebook that predicts what the near future could be. It is a 400-page novel and a great pleasure to read. Enjoy.

Rainbows End

Tuesday, 16 June 2009

Forward with no fear

SEVENTY-FOUR THOUSAND YEARS ago, humanity nearly went extinct. A super-volcano at what's now Lake Toba, in Sumatra, erupted with a strength more than a thousand times that of Mount St. Helens in 1980. Some 800 cubic kilometers of ash filled the skies of the Northern Hemisphere, lowering global temperatures and pushing a climate already on the verge of an ice age over the edge. Some scientists speculate that as the Earth went into a deep freeze, the population of Homo sapiens may have dropped to as low as a few thousand families.

The Mount Toba incident, although unprecedented in magnitude, was part of a broad pattern. For a period of 2 million years, ending with the last ice age around 10,000 B.C., the Earth experienced a series of convulsive glacial events. This rapid-fire climate change meant that humans couldn't rely on consistent patterns to know which animals to hunt, which plants to gather, or even which predators might be waiting around the corner.

How did we cope? By getting smarter. The neurophysiologist William Calvin argues persuasively that modern human cognition, including sophisticated language and the capacity to plan ahead, evolved in response to the demands of this long age of turbulence. According to Calvin, the reason we survived is that our brains changed to meet the challenge: we transformed the ability to target a moving animal with a thrown rock into a capability for foresight and long-term planning. In the process, we may have developed syntax and formal structure from our simple language.

Our present century may not be quite as perilous for the human race as an ice age in the aftermath of a super-volcano eruption, but the next few decades will pose enormous hurdles that go beyond the climate crisis. The end of the fossil-fuel era, the fragility of the global food web, growing population density, and the spread of pandemics, as well as the emergence of radically transformative bio- and nanotechnologies: each of these threatens us with broad disruption or even devastation. And as good as our brains have become at planning ahead, we're still biased toward looking for near-term, simple threats. Subtle, long-term risks, particularly those involving complex, global processes, remain devilishly hard for us to manage.

But here's an optimistic scenario for you: if the next several decades are as bad as some of us fear they could be, we can respond, and survive, the way our species has done time and again: by getting smarter. But this time, we don't have to rely solely on natural evolutionary processes to boost our intelligence. We can do it ourselves.

Most people don't realize that this process is already under way. In fact, it's happening all around us, across the full spectrum of how we understand intelligence. It's visible in the hive mind of the Internet, in the powerful tools for simulation and visualization that are jump-starting new scientific disciplines, and in the development of drugs that some people (myself included) have discovered let them study harder, focus better, and stay awake longer with full clarity. So far, these augmentations have largely been outside of our bodies, but they're very much part of who we are today: they're physically separate from us, but we and they are becoming cognitively inseparable. And advances over the next few decades, driven by breakthroughs in genetic engineering and artificial intelligence, will make today's technologies seem primitive. The nascent jargon of the field describes this as "intelligence augmentation." I prefer to think of it as "You+."

Scientists refer to the 12,000 years or so since the last ice age as the Holocene epoch. It encompasses the rise of human civilization and our co-evolution with tools and technologies that allow us to grapple with our physical environment. But if intelligence augmentation has the kind of impact I expect, we may soon have to start thinking of ourselves as living in an entirely new era. The focus of our technological evolution would be less on how we manage and adapt to our physical world, and more on how we manage and adapt to the immense amount of knowledge we've created. We can call it the Nöocene epoch, from Pierre Teilhard de Chardin's concept of the Nöosphere, a collective consciousness created by the deepening interaction of human minds. As that epoch draws closer, the world is becoming a very different place.

OF COURSE, WE'VE been augmenting our ability to think for millennia. When we developed written language, we significantly increased our functional memory and our ability to share insights and knowledge across time and space. The same thing happened with the invention of the printing press, the telegraph, and the radio. The rise of urbanization allowed a fraction of the populace to focus on more-cerebral tasks, a fraction that grew inexorably as more-complex economic and social practices demanded more knowledge work, and industrial technology reduced the demand for manual labor. And caffeine and nicotine, of course, are both classic cognitive-enhancement drugs, primitive though they may be.

With every technological step forward, though, has come anxiety about the possibility that technology harms our natural ability to think. These anxieties were given eloquent expression in these pages by Nicholas Carr, whose essay "Is Google Making Us Stupid?" (July/August 2008 Atlantic) argued that the information-dense, hyperlink-rich, spastically churning Internet medium is effectively rewiring our brains, making it harder for us to engage in deep, relaxed contemplation.

Carr's fears about the impact of wall-to-wall connectivity on the human intellect echo cyber-theorist Linda Stone's description of "continuous partial attention," the modern phenomenon of having multiple activities and connections under way simultaneously. We're becoming so accustomed to interruption that we're starting to find focusing difficult, even when we've achieved a bit of quiet. It's an induced form of ADD, a "continuous partial attention-deficit disorder," if you will.

There's also just more information out there, because unlike with previous information media, with the Internet, creating material is nearly as easy as consuming it. And it's easy to mistake more voices for more noise. In reality, though, the proliferation of diverse voices may actually improve our overall ability to think. In Everything Bad Is Good for You, Steven Johnson argues that the increasing complexity and range of media we engage with have, over the past century, made us smarter, rather than dumber, by providing a form of cognitive calisthenics. Even pulp-television shows and video games have become extraordinarily dense with detail, filled with subtle references to broader subjects, and more open to interactive engagement. They reward the capacity to make connections and to see patterns, precisely the kinds of skills we need for managing an information glut.

Scientists describe these skills as our "fluid intelligence": the ability to find meaning in confusion and to solve new problems, independent of acquired knowledge. Fluid intelligence doesn't look much like the capacity to memorize and recite facts, the skills that people have traditionally associated with brainpower. But building it up may improve the capacity to think deeply that Carr and others fear we're losing for good. And we shouldn't let the stresses associated with a transition to a new era blind us to that era's astonishing potential. We swim in an ocean of data, accessible from nearly anywhere, generated by billions of devices. We're only beginning to explore what we can do with this knowledge-at-a-touch.

Moreover, the technology-induced ADD that's associated with this new world may be a short-term problem. The trouble isn't that we have too much information at our fingertips, but that our tools for managing it are still in their infancy. Worries about "information overload" predate the rise of the Web (Alvin Toffler coined the phrase in 1970), and many of the technologies that Carr worries about were developed precisely to help us get some control over a flood of data and ideas. Google isn't the problem; it's the beginning of a solution.

In any case, there's no going back. The information sea isn't going to dry up, and relying on cognitive habits evolved and perfected in an era of limited information flow, and limited information access, is futile. Strengthening our fluid intelligence is the only viable approach to navigating the age of constant connectivity.

Tuesday, 9 June 2009

Tech News

New antibiotics could come from a DNA binding compound that kills bacteria in 2 minutes (June 9, 2009)
A synthetic DNA binding compound has proved surprisingly effective at binding to the DNA of bacteria and killing all the bacteria it touched within two minutes, University of Warwick researchers have found. The compound is cylindrical in shape and neatly fits within the major groove of a DNA...

In Worms, Genetic Clues to Extending Longevity (New York Times, June 8, 2009)
A little piece of the germline's immortality can be acquired by the ordinary cells of the body, and used to give the organism extra longevity, Massachusetts General Hospital researchers have found. The insulin-signaling pathway activates a powerful gene regulator that controls many genetic pathways, including some that govern metabolism. The...

Opening Doors on the Way to a Personal Robot (New York Times, June 8, 2009)
PR2, the first robot able to navigate in a building reliably and repeatedly recharge itself, has been developed by Willow Garage. It is powered by several Intel microprocessor chips and "sees" with a combination of sensors including scanning lasers and video cameras....

Oily fish 'can halt eye disease' (BBC News, June 8, 2009)
Omega-3 fatty acids (found in fish like mackerel and salmon) appear to slow or even halt the progress of age-related macular degeneration (AMD), Tufts University researchers have found....


Wednesday, 3 June 2009

Instant sex change served up by video software

Changing someone's gender or race on screen traditionally requires lengthy hours in front of a make-up mirror. But new software that can take a live video feed of a person talking and make them look and sound like somebody else could change that.

Psychologists rather than moviegoers are the first to see the benefits of the new technology: putting it to use in experiments that test how a person's gender affects the body language of others.

The software was developed by computer scientist Barry-John Theobald at the University of East Anglia in the UK and Iain Matthews, formerly at Carnegie Mellon University and now at Weta Digital in Wellington, New Zealand. They were approached by psychologists at three US universities searching for a way to switch the apparent gender of volunteers talking to each other through video conference software.

At that point not even Hollywood studios had access to such technology. For example, after Oliver Reed died during filming of the 2000 movie Gladiator, the studio had to rewrite his character's part and use existing footage of Reed to "act out" his character's demise. But even creating that 2-minute snippet took an estimated $3.2 million and five man-years to stitch footage over the face of another actor, frame by frame.

Face off

So, having worked on processing human faces in video for many years, Theobald and Matthews set about creating software to speed up the process.

They recorded video of volunteers performing 30 different facial expressions such as frowning, smiling and looking surprised. For each expression, the positions of key facial features, such as the eyes, nose and corners of the lips, were manually labelled.

That annotated footage was used to "train" software to recognise the face of each individual featured in the set. Once trained on a person in this way, it can closely track every move of their face in video footage.

Those movements can then be transferred onto the face of another "known" person by calculating how the recipient's features need to change to take on each new expression.

Doing that and displaying the transformed face takes just 150 milliseconds, fast enough to allow a conversation over video link to continue in real time. To complete the effect, a person's voice can be manipulated to match their new face.
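The re-targeting step described above can be sketched in a few lines: once the tracker has landmark positions for the source face, the displacement of each landmark from its neutral position encodes the expression, and adding those displacements to the other person's neutral landmarks maps the expression onto the new face. This is a deliberately minimal sketch with made-up 2D coordinates, not the researchers' actual system, which also fits a trained appearance model and warps the image texture.

```python
# Minimal sketch of landmark-based expression transfer. All coordinates
# below are hypothetical illustration data, not real tracker output.

def expression_delta(neutral, expressive):
    """Per-landmark (dx, dy) displacement that encodes the expression."""
    return [(ex - nx, ey - ny)
            for (nx, ny), (ex, ey) in zip(neutral, expressive)]

def transfer(delta, target_neutral):
    """Apply the source's displacements to the target's neutral landmarks."""
    return [(tx + dx, ty + dy)
            for (tx, ty), (dx, dy) in zip(target_neutral, delta)]

# Tiny worked example: a three-landmark "mouth" that widens into a smile.
src_neutral = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
src_smiling = [(-0.2, 0.1), (1.0, -0.1), (2.2, 0.1)]
tgt_neutral = [(0.0, 1.0), (1.0, 1.0), (2.0, 1.0)]

delta = expression_delta(src_neutral, src_smiling)
tgt_smiling = transfer(delta, tgt_neutral)
```

In practice this assumes the two faces share the same set of corresponding landmarks, which is exactly what the manual labelling stage of the training footage provides.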

Gender confusion

Psychologists at the universities of Notre Dame, Indiana; Pittsburgh, Pennsylvania; and Virginia have already used the new software to test ideas about body language and gender.

Volunteers were asked to chat to one another in a video conference, but did not know if the face they saw was really that of the person they were talking with – or indeed if the other volunteer was seeing their own true face.

The results suggest that our body language during conversation is more reactive to that of others than it is to their physical appearance, says Theobald. "We've shown you can present a female as herself or as a male, and the other participant's behaviour doesn't change," he says. The results will soon be published in the Journal of Experimental Psychology.

Next, the team plans to use the system to test the effects of changing someone's race instead of their gender.

In the long term, Theobald also thinks film studios could benefit from the technique. "What we're doing is the same effect [as used in Gladiator], but in real time with no manual input." So far the software can't deal with complexities like variable lighting, he adds, but that ability can be programmed in.

Pig stem cells could make 'humanised' organs

The world's first pig stem cells have been created from porcine ear and bone-marrow cells.

Researchers at the Institute of Biochemistry and Cell Biology in Shanghai, China, say they are the first to achieve this in hoofed animals.

Induced pluripotent stem (iPS) cells have the potential to turn into all types of body tissue. The big advantage, though, is that they can be genetically manipulated in the lab, and then cloned to create animals with new traits.

By adding or deleting certain genes, for example, researchers could produce pigs whose organs can be transplanted into patients without them being recognised and rejected. Efforts to do such xenotransplants have already been under way for at least a decade, but iPS cells are easier to genetically engineer and grow in the lab than pig embryos, opening up new possibilities for xenotransplantation.

Similar species

"The pig species is significantly similar to humans in its form and function, and the organ dimensions are largely similar to human organs," says head of the research team, Lei Xiao.

"We could use these cells to modify the immune-related genes in the pig to make the pig organ compatible to the human immune system," says Xiao. "Then we could use these pigs as [sources of organs] that won't trigger an adverse reaction from the patient's own immune system."

Xiao and his colleagues say that they made the iPS cells by using a virus to load ear or bone marrow cells with special reprogramming factors. These "rewound" the cells to the embryonic-like state of iPS cells.

Swine flu protection

As well as working towards improved organs for xenotransplantation, Xiao and his colleagues intend to produce pigs that are resistant to diseases, including swine flu. "We could do this by finding and manipulating a gene that has anti-swine flu activity, or which inhibits growth of the swine flu virus," says Xiao.

And in agriculture, pigs could be engineered to produce better, healthier meat – with less fat, for example.

Chris Mason, an expert in regenerative medicine at University College London, said the breakthrough will boost the quest for "humanised" pig organs.

"While [using pig organs] may not necessarily be the long-term solution, it may represent a major step change in the treatment of organ failure, which potentially could deliver real benefit to millions of patients within a decade."