Tuesday, December 31, 2013

Significant Science of 2013: Brain Mapping Gets a Big Boost

One terabyte of data. That’s what it took for scientists to make a comprehensive 3D map of the post-mortem human brain.

That, plus ten years of research, 7,000 slices of brain tissue from a healthy 65-year-old woman, and 1,000 hours of digitization. Sound difficult? For neuroscientists, this is only the beginning of a long journey toward mapping the cogs and gears of the mind.
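
Those figures are worth a moment of back-of-envelope arithmetic. Here is a quick sketch in Python; it is our own illustrative calculation, not one published by the BigBrain team:

```python
# Back-of-envelope arithmetic on the figures quoted above; this is our
# own illustrative calculation, not one from the BigBrain team.
total_bytes = 1e12          # "one terabyte of data"
num_slices = 7_000          # slices of brain tissue
digitization_hours = 1_000  # hours of digitization

print(f"Data per slice: {total_bytes / num_slices / 1e6:.0f} MB")                 # ~143 MB
print(f"Digitization per slice: {digitization_hours * 60 / num_slices:.1f} min")  # ~8.6 min
```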

The project—dubbed BigBrain—is a joint effort by Canadian and German neuroscientists carried out under the European Human Brain Project. While it bears no relation to President Obama’s BRAIN Initiative, BigBrain is certainly the kind of work that could propel the audacious federal project forward. For one, scientists can use this generic model to see how a normal brain compares with ones afflicted by neurological conditions like Alzheimer’s or Parkinson’s. BigBrain also captured the brain in incredible detail, revealing structures once invisible even to the most advanced technologies.
Scientists hope to use BigBrain along with other data to help map connections between different regions of the brain.

Back in June, this is how NOVA Next contributor Teal Burrell described the significance of BigBrain:

Prior to this study, MRI provided the most detailed 3D peek into a human brain. If you think of the brain as a map of a country, the resolution of MRI—about 1 millimeter—would make towns visible, but nothing smaller than that would be. BigBrain, on the other hand, “does 50 times better in each dimension than the typical 1-millimeter resolution of MRI,” says Katrin Amunts, a neuroscientist at the Institute of Neuroscience and Medicine in Jülich, Germany, and lead author of the paper. Specifically, BigBrain’s 20-micron resolution is fine enough to pick out individual cells of certain types, but not all; the smallest neurons in the brain are only about 10 microns across. Still, if this were a map, the level of detail provided by BigBrain greatly exceeds that of MRI, allowing us to see not just towns, but the houses within them.
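
A 50-fold gain per dimension compounds quickly in three dimensions. Here is a minimal sketch of the arithmetic (ours, for illustration):

```python
# The resolution gain quoted above compounds in three dimensions.
mri_res_um = 1000      # ~1 mm MRI voxel edge, in microns
bigbrain_res_um = 20   # BigBrain's 20-micron resolution

linear_gain = mri_res_um / bigbrain_res_um
print(f"Gain per dimension: {linear_gain:.0f}x")                 # 50x
print(f"BigBrain voxels per MRI voxel: {linear_gain**3:,.0f}")   # 125,000

# The smallest neurons are ~10 microns across, which is why a 20-micron
# grid resolves some cell types but not all.
```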

What’s still missing from BigBrain are the connections between neurons—the techniques used for this project weren’t suitable for developing a connectome. But it can help by serving as a scaffold onto which connectivity data can be overlaid.

It’s likely, too, that BigBrain will help contribute to discoveries made in the BRAIN Initiative. Like the Human Genome Project, the BRAIN Initiative will stand on the shoulders of smaller projects. It will be a while before BRAIN ramps up—President Obama requested funds starting in 2014—but in the meantime, BigBrain is certain to give neuroscientists a more intimate picture of the human mind.
Navigating through the right hemisphere of BigBrain. Explore the brain with other types of imagery in NOVA’s "Mapping the Brain" interactive.


ORIGINAL: PBS
27 Dec 2013

Dragonfly-Like Lenses Grown With Liquid Crystals

Image: A magnification showing the liquid crystal “flower” with a silica bead at the center that generated the pattern. Credit: University of Pennsylvania

Move over, cultured pearls: scientists have successfully grown liquid crystal flowers with grains of sand. These structures resemble insect eyes and could be used as complex lenses.

Researchers working on new nanotech dream of a day when all the complex, tiny parts can just manufacture themselves. Making that actually happen is called directed assembly, and a team from the University of Pennsylvania recently took a sweet step forward.

In the past they’d tried creating nanoscale structures using microposts that acted like a trellis to direct growth, according to a university press release. This time they used silica beads, which are basically polished grains of sand, planted in a pool of transparent liquid crystal. The beads generated patterns of petal-shaped bumps that look like flowers. Each transparent petal can function as a lens.

Physics and astronomy professor Randall Kamien, who worked on the flowers, told Gizmag’s Lakshmi Sandhana that the process was similar to making rock candy, where a stick or string acts as a seed for sugar crystals to form naturally. “We have just done this on a smaller scale,” Kamien said, “making smaller bits of ordered material cued by smaller elements, like our silica beads.”

The research was led by a team that included Kamien; Kathleen Stebe, a professor of chemical and biomolecular engineering; Shu Yang, a professor of materials science and engineering and of chemical and biomolecular engineering; and lead author Daniel Beller, a graduate student. They published their work in the journal Physical Review X (abstract).

You might be wondering what the big deal is about growing a bunch of tiny lenses. The result might not be as wearable as cultured pearls or as edible as rock candy, but Gizmag’s Sandhana pointed out that the technique could make complex dragonfly-like eyes containing millions of spherical lenses easier, faster and cheaper to produce.

Picture being able to grow compound lenses that could cover a whole surface, lenses that can heal themselves, or even biosensors that could use the lenses to collect information. All that is a long way off, but the scientists did tell Gizmag they think their lenses will go into liquid crystal displays within the next decade.


Professor Shu Yang also suggested that their lens construction could be incorporated into futuristic metamaterials, such as an acoustic invisibility cloak. Given how far we are from a real invisibility cloak, I think we’re more likely to see a prosthetic eye with nearly X-ray capabilities first, similar to Mad-Eye Moody’s in the Harry Potter series. Heck, we’ve already got Google Glass.


ORIGINAL: Discovery
by Alyssa Danigelis
Dec 27, 2013

Monday, December 30, 2013

Ocean Sampling Day!



I've just learned that June 21st, 2014 is going to be the first-ever Ocean Sampling Day!

The Ocean Sampling Day (OSD) is a simultaneous sampling campaign of the world’s oceans that will take place on the summer solstice (June 21st) in 2014. These cumulative samples, related in time, space and environmental parameters, will provide insights into fundamental rules describing microbial diversity and function and will contribute to the blue economy through the identification of novel, ocean-derived biotechnologies. We expect that these data will provide a reference data set for generations of experiments to follow in the coming decade. It could also function as a starting point for regularly coordinated future OSDs.

This is exactly the type of work we hoped to prototype on our trip to the Sea of Cortez next week. It's also exactly the type of work we found out would have us breaking the law. I think this is such a cool idea and project, and a great way to harness the power of the growing number of citizen explorers and scientists.



The Ocean Sampling Day will take place on the 21st of June 2014 and will be the biggest global effort in marine science carried out in a single day.
More information at: http://www.my-osd.org

ORIGINAL: Open ROV
by David Lang
December 30, 2013

Sunday, December 29, 2013

Gorgeous Computer-Generated Flowers Bloom: Photos


British philosopher and mathematician Bertrand Russell once said, "Mathematics, rightly viewed, possesses not only truth, but supreme beauty." One look at these computer-generated images from Daniel Brown and Russell's words come to life.

Brown, a London-based designer, programmer and artist who specializes in digital technology and interactive design, uses custom algorithms to "grow" gorgeous floral artwork that will blow your mind. Here are 11 of our favorites.
Courtesy Daniel Brown


It all started in 1999, when Brown demonstrated a computer program that used mathematical models to produce fractals. The resulting animations were almost hypnotic. "It was the first time I realized that non-technical people could aesthetically appreciate mathematical formulas if they saw them 'come alive,'" he said.
Courtesy Daniel Brown
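
For readers curious what "code that produces fractals" can look like, here is a minimal, generic Mandelbrot-set sketch in Python. It is our own toy example, not Brown's 1999 program:

```python
# A minimal fractal renderer in the spirit of early generative demos.
# This is a generic Mandelbrot-set sketch, not Daniel Brown's code.

def mandelbrot_rows(width=72, height=28, max_iter=40):
    for row in range(height):
        line = []
        for col in range(width):
            # Map the character grid onto a window of the complex plane.
            c = complex(-2.2 + 3.0 * col / width, -1.2 + 2.4 * row / height)
            z = 0j
            n = 0
            while abs(z) <= 2 and n < max_iter:
                z = z * z + c   # the entire fractal comes from this one formula
                n += 1
            # Shade by how quickly the point escapes.
            line.append(" .:-=+*#%@"[min(n * 10 // max_iter, 9)])
        yield "".join(line)

for line in mandelbrot_rows():
    print(line)
```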


Brown created the pieces in this slideshow for the Victoria and Albert Museum and the D'Arcy Thompson Zoology Museum, as well as projects for corporate clients. A swimming accident in 2003 injured Brown's spinal cord, causing paralysis. As a result, he uses a finger-splint device and a large track pad to operate a computer. Even setting this added challenge aside, his flowers are uniquely beautiful; no two look exactly the same.
Courtesy Daniel Brown


Several years ago Brown produced a three-story-high projection of flowers for the Victoria and Albert Museum. Each generated petal contained combinations of images from the museum's textile collection. The work was named in honor of D'Arcy Wentworth Thompson, a pioneering bio-mathematician known for his 1917 book On Growth and Form.
Courtesy Daniel Brown


Last year, the D'Arcy Thompson Zoology Museum at the University of Dundee in Scotland contacted Brown after seeing his Victoria and Albert Museum work and asked him to create a piece for them. Brown said he used generative design to create the realistic flowers for this newer exhibition, which went up last spring. Each flower shape is determined by an algorithm that is then altered to take into account natural variation.
Courtesy Daniel Brown


Another mathematical formula is used to generate the color and texture applied to the shapes. Each arrangement is grown over about 50 seconds, resembling sped-up time-lapse photography. "After this, they fade out and another arrangement is created," he said.
Courtesy Daniel Brown
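
As a rough illustration of the generative-design recipe described above (one formula fixes the shape, random perturbation supplies natural variation, and a second formula assigns color), here is a toy sketch. It is our own example; Brown has not published his algorithms, and every name in it is hypothetical:

```python
# Toy generative-flower sketch: shape from one formula, natural variation
# from random jitter, color from a second formula. Illustrative only; this
# is not Daniel Brown's code.
import math
import random

def petal_outline(num_petals, points=360):
    """Polar 'rose' curve r = |cos(k*theta)|, randomly perturbed per flower."""
    k = num_petals / 2                    # |cos(k*theta)| traces 2k petals
    jitter = random.uniform(0.9, 1.1)     # natural variation in petal size
    phase = random.uniform(0, 2 * math.pi)
    outline = []
    for i in range(points):
        theta = 2 * math.pi * i / points
        r = abs(math.cos(k * theta + phase)) * jitter
        outline.append((r * math.cos(theta), r * math.sin(theta)))
    return outline

def petal_color(theta):
    """A second formula maps angle to an RGB tint (values stay in 1..255)."""
    return (
        int(128 + 127 * math.sin(theta)),
        int(128 + 127 * math.sin(theta + 2.1)),
        int(128 + 127 * math.sin(theta + 4.2)),
    )

flower = petal_outline(num_petals=6)
print(f"{len(flower)} outline points; first point: {flower[0]}")
print("Color at theta=0:", petal_color(0.0))
```

Because the jitter and phase are re-randomized on every call, no two generated flowers are exactly alike, which mirrors the fleeting, one-off quality Brown describes below.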


Brown's original pieces used only two-dimensional computer graphics that mimicked a 3-D look. However, in the past few years, computer technology has evolved to the point where he can simulate surfaces, behaviors and lighting in real time.

Sometimes Brown produces a flower that even amazes him. "I can't work out the particular parameters that would have gone into it, and am left scratching my head," he said. "Because the flowers regenerate every minute or so, it's a fleeting moment, and there is something almost poetic knowing that no one will ever see that one flower again."
Courtesy Daniel Brown


D'Arcy Wentworth Thompson was a Scottish scientist and scholar who took various natural processes, such as evolution, and tried to examine them mathematically. He sought to discover how differences in shape and form between two genetically related species could be mathematically modeled, Brown explained.

He also wondered about physical processes like weather, and how they could change one shape into another. Being contacted by the D'Arcy Thompson Zoology Museum was the ultimate honor, Brown said. "I couldn't think of a more fitting thing to do for one of my scientific heroes."
Courtesy Daniel Brown


Brown's flowers are so realistic that occasionally museum visitors won't realize they're computer graphics and will insist on asking him what kind of flowers they are. Other reactions are more visceral.

"When my work was on show in the Victoria and Albert Museum, young children -- toddlers rather -- would run up to the wall it was being projected on and try and hug it," he said. "At that moment people stop seeing technology, and just see beauty."
Courtesy Daniel Brown


While he's staying quiet about plans for future art projects, Brown said he looks forward to a future when 3-D printing is refined enough to print realistic versions of his computer flowers.

Courtesy Daniel Brown


He imagines he'll be able to make ever more intricate and extraordinary flowers. "Although I was both an artist and programmer before my injury, I have switched to creating art purely with code," Brown said. "In that way I consider myself incredibly lucky. I think I had one of the only jobs in the world that could 'survive' such a life changing event as that."

To see more images, visit Daniel Brown's Flickr page.
Courtesy Daniel Brown


ORIGINAL: Discovery
by Alyssa Danigelis
Nov 21, 2013

Saturday, December 28, 2013

Organovo Announces Plans to Create World's First 3D-Printed Human Liver Tissue in 2014


2014 could be a landmark year for medical technology, as researchers just announced that they are close to creating the world’s first 3D printed organ tissue. San Diego-based biotech firm Organovo plans to use its bioprinting technology to successfully 3D print a liver by the end of 2014. In an interview with ComputerWorld, the bioprinting company said it has overcome a big obstacle to creating the vascular system needed to provide man-made organs with life-sustaining oxygen and nutrients.


Just like your everyday MakerBot, Organovo’s organic 3D printer lays down layers of material to form a solid entity. The major issue with fabricating human tissue thus far has been that cells would die before the tissue made it off the printer table.

Organovo’s researchers were able to overcome this obstacle by bringing together fibroblasts and endothelial cells, which create a tiny vascular network of blood vessels. This microscopic addition allowed Organovo to build up tissue thicker than 500 microns (about 0.02 inches). That might seem insignificantly small, but the man-made tissue remained fully functional for at least 40 days while it sat in a petri dish.
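
A quick unit check on that thickness threshold (our own arithmetic, for scale; the paper-thickness comparison is an assumption based on typical ~70-micron office paper):

```python
# Unit check on the tissue-thickness threshold quoted above.
# Our own arithmetic, for scale; not a figure from Organovo.
thickness_um = 500
inches = thickness_um * 1e-6 / 0.0254
print(f"{thickness_um} microns = {inches:.4f} inches")     # ~0.0197 in

# For scale: roughly seven sheets of ~70-micron office paper.
print(f"About {thickness_um / 70:.0f} sheets of 70-micron paper")
```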

Although the liver won’t be suitable for transplants in human beings, it could be extremely effective for scientific research and drug testing. The drug testing field still uses controversial animal testing, and fabricating 3D-printed organs could be a much more humane alternative. Organovo also hopes its 3D printing technology will help reduce the exorbitant costs of drug testing.

+ Organovo
Via ComputerWorld
Images © Organovo

ORIGINAL: Inhabitat
by Kevin Lee,
12/27/13

IBM's Watson Gets Its First Piece Of Business In Healthcare

The old Watson that beat Ken Jennings. Now it can fit into a desk drawer. (Credit: Getty Images via @daylife)

IBM‘s Watson, the Jeopardy!-playing supercomputer that scored one for Team Robot Overlord two years ago, just put out its shingle as a doctor or, more specifically, as a combination lung cancer specialist and expert in the arcane branch of health insurance known as utilization management. 
Thanks to a business partnership among IBM, Memorial Sloan-Kettering and WellPoint, health care providers will now be able to tap Watson’s expertise in deciding how to treat patients.

Pricing was not disclosed, but hospitals and health care networks that sign up will be able to buy or rent Watson’s advice from the cloud or their own server. Over the past two years, IBM’s researchers have shrunk Watson from the size of a master bedroom to a pizza-box-sized server that can fit in any data center, and they have improved its processing speed by 240%. What was once a fun computer-science experiment in natural language processing is becoming a real business for IBM and WellPoint, which is the exclusive reseller of the technology for now. Initial customers include WestMed Practice Partners and the Maine Center for Cancer Medicine & Blood Disorders.

Even before the Jeopardy! success, IBM began to hatch bigger plans for Watson, and there are few areas more in need of supercharged decision-support than health care. Doctors and nurses are drowning in information with new research, genetic data, treatments and procedures popping up daily. They often don’t know what to do, and are guessing as well as they can. WellPoint’s chief medical officer Samuel Nussbaum said at the press event today that health care pros make accurate treatment decisions in lung cancer cases only 50% of the time (a shocker to me). Watson has shown the capability (on the utilization management side) of being accurate in its decisions 90% of the time, but it is not near that level yet with cancer diagnoses. Patients, of course, need 100% accuracy, but making the leap from being right half the time to being right nine out of ten times will be a huge boon for patient care. The best part is the potential for distributing the intelligence anywhere via the cloud, right at the point of care. This could be the most powerful tool we’ve seen to date for improving care and lowering everyone’s costs via standardization and reduced error. Chris Coburn, the Cleveland Clinic’s executive director for innovations, said at the event that he fully expects Watson to be widely deployed wherever the Clinic does business by 2020.

Watson has made huge strides in its medical prowess in two short years. In May 2011 IBM had already trained Watson to have the knowledge of a second-year medical student. In March 2012 IBM struck a deal with Memorial Sloan Kettering to ingest and analyze tens of thousands of the renowned cancer center’s patient records and histories, as well as all the publicly available clinical research it can get its hard drives on. Today Watson has analyzed 605,000 pieces of medical evidence, 2 million pages of text, 25,000 training cases and had the assist of 14,700 clinician hours fine-tuning its decision accuracy. Six “instances” of Watson have already been installed in the last 12 months.

Watson doesn’t tell a doctor what to do; it provides several options with degrees of confidence for each, along with the supporting evidence it used to arrive at the optimal treatment. Doctors can enter a new bit of information in plain text on an iPad, such as “my patient has blood in her phlegm,” and within half a minute Watson will come back with an entirely different drug regimen that suits the individual. IBM Watson’s business chief Manoj Saxena says that 90% of nurses in the field who use Watson now follow its guidance.
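
To make that decision-support pattern concrete, here is a hypothetical sketch of what "several options with degrees of confidence and supporting evidence" could look like as a data structure. None of these names come from IBM, and this is not Watson's actual API; it only illustrates the idea:

```python
# Hypothetical sketch of the decision-support pattern described above:
# ranked treatment options, each with a confidence score and the evidence
# behind it. Illustrative only; not IBM's API.
from dataclasses import dataclass

@dataclass
class TreatmentOption:
    regimen: str
    confidence: float      # 0.0 - 1.0
    evidence: list[str]    # citations the ranking was based on

def rank_options(options: list[TreatmentOption]) -> list[TreatmentOption]:
    """Return several options ordered by confidence, not a single answer."""
    return sorted(options, key=lambda o: o.confidence, reverse=True)

options = [
    TreatmentOption("Regimen A", 0.81, ["clinical trial report", "case history"]),
    TreatmentOption("Regimen B", 0.64, ["2012 treatment guideline"]),
]
for opt in rank_options(options):
    print(f"{opt.confidence:.0%}  {opt.regimen}  ({len(opt.evidence)} evidence items)")
```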

WellPoint will be using the system internally for its nurses and clinicians who handle utilization management, the process by which health insurers determine which treatments are fair, appropriate and efficient and, in turn, what they will cover. The company will also make the intelligence available as a Web portal to other providers as its Interactive Care Reviewer. It is targeting 1,600 providers by the end of 2013 and will split the revenue with IBM. Terms were undisclosed.

ORIGINAL: Forbes
2/08/2013

World’s First Bionic, Mind-Controlled Leg Allows Amputee to Go for a Walk



Researchers have come up with a new artificial leg that reads brain signals. The technology, which has been hailed as a groundbreaking medical advance, is currently in the testing phase.


ORIGINAL: Time
By TIME Video
Dec. 26, 2013

Friday, December 27, 2013

Olympus BioScapes 2013 Winners Gallery

2013 Winning Entries
The Olympus BioScapes 2013 winners, honorable mentions, and technical merit awards are displayed in this gallery.
1st Prize - Dr. Igor Siwanowicz
HHMI Janelia Farm Research Campus
Ashburn, Virginia, United States
Specimen: Open trap of aquatic carnivorous plant, humped bladderwort Utricularia gibba, with single-cell organisms inside.
Technique: Confocal imaging, 100x

2nd Prize - Miss Dorit Hockman
University of Oxford
Oxfordshire, United Kingdom
Specimen: Embryo of black mastiff bat Molossus rufus.
Technique: Stereo microscopy

3rd Prize - Dr. Igor Siwanowicz
HHMI Janelia Farm Research Campus
Ashburn, Virginia, United States
Specimen: Single-cell freshwater algae (desmids). Composite image including, concentric from the outside: Micrasterias rotata, Micrasterias sp., M. furcata, M. americana, 2x M. truncata, Euastrum sp. and Cosmarium sp.
Technique: Confocal imaging, 400x

4th Prize - Mr. Spike Walker
Staffordshire, United Kingdom
Specimen: Lily flower bud, transverse section.
Technique: Darkfield illumination, stitched images

5th Prize - Dr. Dylan Burnette
National Institutes of Health
Bethesda, Maryland, United States
Specimen: Mouse embryonic fibroblasts showing actin filaments (red), mitochondria (green) and DNA (blue).
Technique: Structured illumination microscopy (SIM) fluorescence, acquired with a 60x objective

6th Prize - Mr. Kurt Wirz
Basel, Switzerland
Specimen: "Brother bugs." Gonocerus acuteangulatus, two hours old. Size: 3 mm.

7th Prize - Mr. Charles Krebs
Issaquah, Washington, United States
Specimen: Phantom midge larva (Chaoborus), "glassworm." Birefringent musculature that is usually clear and colorless is made visible here by specialized illumination.
Technique: Polarized light, 100x

8th Prize - Dr. Yaron Fuchs
Howard Hughes Medical Institute / The Rockefeller University
New York, New York, United States
Specimen: Mouse tail whole mounts showing hair follicle stem cells and proliferating cells.
Technique: Confocal imaging

9th Prize - Mr. Fabrice Parais
DREAL (Regional Directorate of Environment, Planning and Housing) of Basse-Normandie
Caen, France
Specimen: Head and legs of a caddisfly larva, Sericostoma sp., a benthic macroinvertebrate that can be used for freshwater biomonitoring; because it is relatively sensitive to organic pollution and dies if the water is dirty, it is a good indicator of water quality.
Technique: Stereo microscopy, 15x

10th Prize - Mr. Ralph Grimm
Jimboomba, Queensland, Australia
Specimen (video): Paramecium, showing contractile vacuole and ciliary motion.
Technique: Differential interference contrast, 350x-1000x


2013 Honorable Mentions


C. Barros; M. Boyle; T. Burns; M. Clarke; M. Crutchley; N. Cuenca; S. Di Talia; J. Dolan; G. Drange; J. Ducharme; A. Dumitrache; A. Ertürk; A. Ferrand; M. Ghabril and C. Babbey; M. Gibson (two entries); G. Günther; J. Hallfeldt; T. Hickman; P. Honkakoski; C. Jackson; M. Kandasamy; M. Khodaverdi; M. Klinghardt; L. Knight (two entries); A. Kobitski et al.; A. Kohn and J. Kubo; C. Krebs (two entries); M. Lehnert and C. Mulvane; N. Lindström; X. Lu and C. Bolt; G. Luna; D. Maitland; J. Michels (two entries); D. Millard; M. Miś; D. Moore; R. Moreno Gill; S. Mouchet; J. Myslowski; W. Nell; J. Nicholson; S. Nishimura; A. Pan; J. Petersen (two entries); A. Phillips-Yzaguirre; C. Pintér (two entries); P. Ray; G. Rouse; A. Salehi; A. Singh; I. Siwanowicz; V. Sýkora; E. Tabdanov; R. Taiariol; V. Tobias Santos; M. Turzańska (two entries); W. van Egmond; P. Verrees; D. von Wangenheim; L. Windus; K. Wirz; A. Woolley and A. Gilmour.

All image copyrights belong to the individual contestants.
For image use permissions, contact ilene@olympusbioscapes.com

ORIGINAL: Olympus BioScapes