Showing posts with label new technology. Show all posts

Tuesday, June 23, 2015


(Image: Siri Stafford/Getty)
Thanks to the latest advances in computer vision, we now have machines that can pick you out of a line-up. But what if your face is hidden from view?
An experimental algorithm out of Facebook's artificial intelligence lab can recognise people in photographs even when it can't see their faces. Instead it looks for other unique characteristics like your hairdo, clothing, body shape and pose.
Modern face-recognition algorithms are so good they've already found their way into social networks, shops and even churches. Yann LeCun, head of artificial intelligence at Facebook, wanted to see if they could be adapted to recognise people in situations where someone's face isn't clear, something humans can already do quite well.
"There are a lot of cues we use. People have characteristic aspects, even if you look at them from the back," LeCun says. "For example, you can recognise Mark Zuckerberg very easily, because he always wears a gray T-shirt."
The research team pulled almost 40,000 public photos from Flickr - some of people with their full face clearly visible, and others where they were turned away - and ran them through a sophisticated neural network.
The final algorithm was able to recognise individual people's identities with 83 per cent accuracy. It was presented earlier this month at the Computer Vision and Pattern Recognition conference in Boston, Massachusetts.
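At a high level, a system like this reduces each photo to a vector of non-face cues (hairdo, clothing colour, body shape, pose) and matches it against a gallery of known identities. The sketch below is purely illustrative: the names and three-number "embeddings" are invented, and a real system would learn such vectors with a deep neural network rather than hand-pick them.

```python
import math

def cosine(u, v):
    # Cosine similarity between two feature vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

# Invented "embeddings" of non-face cues (hairdo, clothing colour, body
# shape); a real system would learn such vectors with a neural network.
gallery = {
    "alice": [0.9, 0.1, 0.4],
    "bob":   [0.2, 0.8, 0.5],
}

def identify(query):
    # Nearest neighbour in the gallery by cosine similarity.
    return max(gallery, key=lambda name: cosine(query, gallery[name]))

print(identify([0.85, 0.15, 0.35]))  # → alice
```

The hard part in the published work is learning the features; the final comparison against known identities is conceptually this simple.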
An algorithm like this could one day help power photo apps like Facebook's Moments, released last week.
Moments scours through a phone's photos, sorting them into separate events like a friend's wedding or a trip to the beach and tagging whoever it recognises as a Facebook friend. LeCun also imagines such a tool would be useful for the privacy-conscious - alerting someone whenever a photo of themselves, however obscured, pops up on the internet.
The flipside is also true: the ability to identify someone even when they are not looking at the camera raises some serious privacy implications. Last week, talks over rules governing facial recognition collapsed after privacy advocates and industry groups could not agree.
"If, even when you hide your face, you can be successfully linked to your identity, that will certainly concern people," says Ralph Gross at Carnegie Mellon University in Pittsburgh, Pennsylvania, who says the algorithm is impressive. "Now is a time when it's important to discuss these questions."

Facebook can recognise you in photos even if you're not looking

All together now: yeasts can evolve to form snowflake-like multicellular shapes (Image: Courtesy of Jennifer Pentz, Georgia Tech)

The leap from single-celled life to multicellular creatures is easier than we ever thought. And it seems there's more than one way it can happen.

The mutation of a single gene is enough to transform single-celled brewer's yeast into a "snowflake" that evolves as a multicellular organism.

Similarly, single-celled algae quickly evolve into spherical multicellular organisms when faced with predators that eat single cells.

These findings back the emerging idea that this leap in complexity isn't the giant evolutionary hurdle it was thought to be.

At some point after life first emerged, some cells came together to form the first multicellular organism. This happened perhaps as early as 2.1 billion years ago. Others followed – multicellularity is thought to have evolved independently at least 20 times – eventually giving rise to complex life, such as humans.

But no organism is known to have made that transition in the past 200 million years, so how and why it happened is hard to study.

Special snowflake

Back in 2011, evolutionary biologists William Ratcliff and Michael Travisano at the University of Minnesota in St Paul coaxed unicellular yeast to take on a multicellular "snowflake" form by taking the fastest-settling yeast out of a culture, using it to found new cultures, and then repeating the process. Because clumps of yeast settle faster than individual cells, this effectively selected yeast that stuck together instead of separating after cell division.
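That settle-and-transfer protocol is easy to caricature in code. The simulation below is not the team's model: the clump sizes, mutation rate and selection fraction are invented. But it shows how repeatedly keeping only the fastest-settling (largest) clumps ratchets a population away from single cells.

```python
import random

random.seed(1)  # reproducible run

def settle_select(pop, keep_frac=0.1, mut_rate=0.01):
    # Bigger clumps settle faster, so sort by clump size, keep the
    # fastest-settling fraction, then regrow the culture from the
    # survivors with occasional mutations in clump size.
    pop = sorted(pop, reverse=True)
    survivors = pop[: max(1, int(len(pop) * keep_frac))]
    next_gen = []
    while len(next_gen) < len(pop):
        c = random.choice(survivors)
        if random.random() < mut_rate:
            c = max(1, c + random.choice([-1, 1]))
        next_gen.append(c)
    return next_gen

pop = [1] * 1000          # start fully unicellular (clump size 1)
for _ in range(30):       # 30 rounds of settle-and-transfer
    pop = settle_select(pop)
print(sum(pop) / len(pop))  # mean clump size, now well above 1
```

Even with rare mutations, selection on settling speed alone is enough to drive the mean clump size upward, which is the essence of the experiment.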

The team's latest work shows that this transformation from a single-celled to a multicellular existence can be driven by a single gene called ACE2 that controls separation of daughter cells after division, Ratcliff told the 15-19 June Astrobiology Science Conference in Chicago.

And because the snowflake grows in a branching, tree-like pattern, any later mutations are confined to single branches. When the original snowflake gets too large and breaks up, these mutant branches fend for themselves, allowing the value of their new mutation to be tested in the evolutionary arena.

"A single mutation creates groups that as a side effect are capable of Darwinian evolution at the multicellular level," says Ratcliff, who is now at the Georgia Institute of Technology in Atlanta.

Bigger is better

Ratcliff's team has previously also evolved multicellularity in single-celled algae called Chlamydomonas, through similar selection for rapid settling. The algal cells clumped together in amorphous blobs.

Now the feat has been repeated, but with predators thrown into the mix. A team led by Matt Herron of the University of Montana in Missoula exposed Chlamydomonas to a paramecium, a single-celled protozoan that can devour single-celled algae but not multicellular ones.

Safety in even numbers (Image: Jacob Boswell)

Sure enough, two of Herron's five experimental lines became multicellular within six months, or about 600 generations, he told the conference.

This time, instead of daughter cells sticking together in an amorphous blob as they did under selection for settling, the algae formed predation-resistant, spherical units of four, eight or 16 cells that look almost identical to related species of algae that are naturally multicellular.

"It's likely that what we've seen in the predation experiments recapitulates some of the early steps of evolution," says Herron.

Neither Ratcliff's yeast nor Herron's algae has unequivocally crossed the critical threshold to multicellularity, which would require cells to divide labour between them, says Richard Michod of the University of Arizona in Tucson.

But the experiments are an important step along that road. "They're opening up new avenues for approaching this question," he says.

One gene may drive leap from single cell to multicellular life

Will more sensory substitution devices hit the market soon?

The BrainPort V100

Courtesy Wicab, Inc.

Last week, the Food and Drug Administration (FDA) announced that medical device company Wicab is allowed to market a new device that will help the blind “see.” The device, called the BrainPort V100, can help the blind navigate by processing visual information and communicating it to the user through electrodes on his tongue. Though this isn’t the first device to go on the market using sensory substitution (where information perceived by one sense is communicated through another), the sophistication and usability of the BrainPort V100 could mean that the number of sensory substitution devices permitted by the FDA is on the rise.

The BrainPort V100 consists of a pair of dark glasses and tongue-stimulating electrodes connected to a handheld battery-operated device. When cameras in the glasses pick up visual stimuli, software converts the information to electrical pulses sent as vibrations to be felt on the user’s tongue. Like most sensory substitution devices, “seeing” with your tongue may not be intuitive at first. But the researchers who developed the device tested it over the course of a year, training users to interpret the vibrations. Studies showed that 69 percent of the test subjects were able to identify an object using the BrainPort device after a year of training. However, the device is expensive; Wicab told Popular Science that it will cost $10,000 per unit, the same as its price when first reported back in 2009.

Researchers have been fiddling with sensory substitution for a long time, but most of these devices are not yet widely available. The BrainPort V100 will be one of the first, having passed the FDA’s review through recently updated guidelines called the premarket review pathway: “a regulatory pathway for some low- to moderate-risk medical devices that are not substantially equivalent to an already legally-marketed device,” according to the press release. Since this device is now allowed to be marketed and was approved relatively quickly through these new guidelines, the BrainPort may be paving the way for an explosion of sensory substitution devices to hit the market in the next few years, which could help the growing numbers of Americans with sensory impairments.

Device That Helps Blind People See With Their Tongues Just Won FDA Approval

Technology can reconstruct video based on a person's thoughts and even anticipate your moves while you drive. Now, a brain-to-text system can translate brain activity into written words.

In a recent study in Frontiers in Neuroscience, seven patients had electrode sheets placed on their brains, which collected neural data while they read passages aloud from the Gettysburg Address, JFK’s inaugural speech, and Humpty Dumpty.

As each patient spoke, a computer algorithm learned to associate speech sounds—such as "foh", "net", and "ik"—with different firing patterns in the brain cells. Eventually it learned to read the brain cells well enough that it could guess which sound they were producing with up to 75 percent accuracy. But the program doesn't need 100 percent accuracy to put those sounds together into the word "phonetic". Because our speech only takes certain forms, the system’s algorithm can correct for these errors “just like autocorrect,” says Peter Brunner, one of the co-authors of the study.

“Siri wouldn’t be more accurate than 50 or 70 percent,” he says. “Because it knows what the potential options are that you choose, or the typical sentences that you say, it can actually utilize this information to get the right choice.”
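The “autocorrect” step Brunner describes can be pictured as snapping a noisy phoneme sequence onto the nearest word in a lexicon by edit distance. The phoneme spellings and the tiny two-word lexicon below are invented for illustration; a real decoder would use a full pronunciation dictionary and a language model.

```python
def edit_distance(a, b):
    # Levenshtein distance over sequences of phoneme symbols.
    prev = list(range(len(b) + 1))
    for i, x in enumerate(a, 1):
        cur = [i]
        for j, y in enumerate(b, 1):
            cur.append(min(prev[j] + 1,              # deletion
                           cur[j - 1] + 1,           # insertion
                           prev[j - 1] + (x != y)))  # substitution
        prev = cur
    return prev[-1]

# Invented mini-lexicon, keyed by phoneme sequence:
lexicon = {
    ("f", "ow", "n", "eh", "t", "ih", "k"): "phonetic",
    ("n", "eh", "t"): "net",
}

def decode(phonemes):
    # Snap a noisy phoneme sequence onto the closest known word.
    return min(lexicon.items(),
               key=lambda kv: edit_distance(phonemes, kv[0]))[1]

print(decode(("f", "ow", "m", "eh", "t", "ih", "k")))  # one misheard phoneme → phonetic
```

Because valid words are sparse in the space of phoneme sequences, a classifier that is only 75 per cent accurate per sound can still land on the right word.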

It is important to record the data directly from the brain, says Brunner, because picking up neural activity from the scalp only gives a “blurred version” of what is happening in the brain. He likened the latter method to flying 1000 feet above a baseball stadium and only being able to vaguely recognize that people are cheering, but not the specifics of what the people’s faces look like.

In this case, the patients were already undergoing an epilepsy procedure where the skull is popped open and an electrode grid is placed on the brain to map areas where neurons are misfiring. The resourceful team of researchers from the National Center for Adaptive Neurotechnologies and the State University of New York at Albany used this time to conduct their own research. However, it means the study was limited by each patient’s individualized epilepsy treatment, such as where the electrodes were placed on the brain.

Because every person’s brain is so unique, and the neural activity must be picked up directly from the brain, it would be difficult to create a general brain-to-text device for the average consumer, says Brunner. However, this technology has a lot of potential for people who suffer from neurological diseases, such as ALS, and lose the ability to move and to speak. Instead of picking out words on a screen for a computer to read, as Stephen Hawking does with an external device, the computer would simply speak your mind.

“This is just the beginning,” said Brunner. “The prospects of this are really endless.”

Mind-Reading Program Translates Thoughts Into Text

Saturday, June 20, 2015

In order to understand how the organ selectively transmits nutrients from mother to child

A placenta on a chip device

Courtesy of the Eunice Kennedy Shriver National Institute of Child Health and Human Development

From lungs to brains, organ tissues grown in a lab are telling researchers a lot about how their cells do their jobs. Now researchers are using the technology to better understand the placenta, the temporary organ that connects a fetus and mother during pregnancy.

The placenta’s primary function is to act as a “crossing guard” between mother and child—it sends the good stuff (like nutrients and oxygen) along to the baby, while keeping out damaging elements like chemicals from environmental exposure or disease-causing bacteria and viruses. If the placenta is damaged or doesn’t work right, it could endanger the health of both the mother and the baby.

Researchers don’t really know how the placenta is able to transmit the good things while keeping out the bad. That’s because the placenta is notoriously difficult to study in humans—it takes a long time, varies a lot between individual patients, and could put the fetus’ safety at risk. In the past, most studies about the placenta were done in animals to work around these issues. Animal studies have shed some light on how the placenta works, but the tissue is never quite the same as in humans.

A rendering of the placenta on a chip

Scientists Are Growing A Human Placenta On A Chip

Friday, June 19, 2015

Researchers have built devices that harness changes in atmospheric humidity to generate small amounts of electricity, lift tiny weights, and even power a toy car. In the grand scheme of things, that captured energy is not free, but it’s pretty darn close. The study suggests that evaporation could be used to operate a variety of gadgets that don’t require a lot of power, scientists say.

“This is one of the first experiments to show that humidity can be a source of fuel,” says Albert Schenning, a materials scientist at the Eindhoven University of Technology in the Netherlands who wasn’t involved in the new study. The team’s designs, he says, “are very nice and very clever.”

All the gadgets rely on a simple phenomenon—the change in size of bacterial spores as they absorb moisture from the air and then release it, says team leader Ozgur Sahin, a biophysicist at Columbia University. Sahin and his colleagues used the living but dormant spores from Bacillus subtilis, a species of bacteria commonly found in soil and in the human gastrointestinal tract. Each spore typically swells and then shrinks up to 6% when moved from dry air to extremely humid air and then back again, Sahin says. The researchers harnessed that size-changing action by gluing thin layers of spores onto one side of curved sheets of polymer. When the spores swelled, that side of the polymer sheet lengthened—which in turn caused the curved sheet to somewhat straighten out. The stretching and contracting of these spore-coated polymer sheets are the driving force for the team’s devices.

A change in size of 6% may not sound that impressive. But when the researchers strung together a series of these polymer sheets, the “artificial muscles” they created quadrupled in length when relative humidity changed from below 30% to more than 80%, the team reports today in Nature Communications.  
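A little geometry hints at how a modest swelling in one thin layer can translate into such large end-to-end motion: as a tightly curled sheet straightens, its end-to-end span grows from the chord of the arc towards the full arc length. The dimensions below are entirely hypothetical, chosen only to illustrate this amplification, and are not the team's measurements.

```python
import math

def chord_length(arc_len, radius):
    # End-to-end distance of a circular arc of the given length and radius.
    theta = arc_len / radius           # angle subtended, in radians
    return 2 * radius * math.sin(theta / 2)

arc = 10.0   # hypothetical sheet length, mm
r = 2.0      # hypothetical radius of curvature when fully curled, mm
curled_span = chord_length(arc, r)
print(round(arc / curled_span, 2))     # elongation factor on straightening → 4.18
```

For these invented numbers, complete straightening roughly quadruples the end-to-end length, which is why chaining many such sheets can dwarf the 6% strain of the spores themselves.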

The thicker the spore layer, the longer it would take for the muscles to react to changes in humidity. So, to make sure their artificial muscles were quick-acting, the researchers used spore layers that were extremely thin—no more than 3 micrometers thick, or about 5 spores deep on average, Sahin says. Tests showed that the devices could react to humidity changes within 3 seconds, he notes.

Tests also revealed that the spore-coated polymer strips could expand and shrink for more than 1 million cycles with little change in their range of motion. Other trials showed that the strips, when they shrank, could lift more than 50 times their own weight (video). But they did so much more slowly than an animal’s muscle would, so the power they generated—that is, their rate of energy production—was correspondingly low.

Nevertheless, the team harnessed changes in humidity to perform actual work. In one device, the back-and-forth motion driven by one artificial muscle suspended above a postage stamp–sized patch of water provided enough electrical power to light an LED. In another, the expansion of muscles on one side of a Ferris wheel–like device (where the air was humidified by evaporation from a wet paper towel) but not the other triggered an imbalance that caused the wheel to rotate (video). The team used the motion of a similar wheel to power a 100-gram toy car (video).

“These are fun demonstrations, but they prove the principle,” says Peter Fratzl, a materials scientist at the Max Planck Institute of Colloids and Interfaces in Potsdam, Germany, who was not involved with the work. Researchers are constantly looking for sources of energy, even if they’re small, he notes. “It makes sense to use these gradients [in humidity], because they’re everywhere and they’re free.”

The team’s results are a good conceptual starting point, says George Whitesides, a chemist at Harvard University. Such devices could, in theory, generate enough electricity to run a few transistors, he adds. “But it will still be a while before these things are in every child’s bathtub.”

Energy harnessed from humidity can power small devices

Thursday, June 18, 2015


In an attempt to reverse evolution, paleontologist Jack Horner’s team has already made significant strides in mutating chickens back to the very creatures from which they descended. If that wasn’t enough genetic splicing and dicing, Harvard scientists attempted a similar feat recently by inserting the genes of a woolly mammoth into elephants in order to recreate the extinct beasts. Whoa baby.

If the four major differences between dinosaurs and birds are their tails, arms, hands and mouths, Horner and team have already flipped certain genetic switches in chicken embryos to reverse-engineer a bird’s beak into a dinosaur-like snout.

“Actually, the wings and hands are not as difficult,” Horner said, adding that a ‘Chickenosaurus’ -- as he calls the creation -- is well on its way to becoming reality. “The tail is the biggest project. But on the other hand, we have been able to do some things recently that have given us hope that it won't take too long."

Scientists Say They Can Recreate Living Dinosaurs Within the Next 5 Years

Researchers at the University of Michigan’s Biointerfaces Institute have managed to mimic a heartbeat outside the body, using gravity to reproduce fundamental physiological rhythms.

Developed as a “lab on a chip,” such microfluidic devices can be extremely useful for performing complex laboratory functions in a tiny space.

The chip has been an instant success at heartbeat mimicking: researchers have already started using it to test cardiovascular drugs and blood thinners, where accurate simulation of blood flow can help develop new studies and medical solutions.

Apparently, cells react more naturally when subjected to the pulsing rhythms found inside a body, or when in motion, than in the static environment of the lab. This way, doctors will be able to test and simulate cell behaviour much more accurately before testing on live subjects.

To get an idea of how primitive heartbeat simulation outside the body was before this new heart-on-a-chip arrived: it required a syringe pump operated by a lab technician, and only for a limited amount of time. The new device not only eliminates the human factor in simulating a heartbeat, but can also run for far longer periods.

Biointerfaces Has Developed a Chip to Mimic Heartbeats Using Gravity

Tuesday, June 9, 2015

Emerging Technologies – Most of the global challenges of the 21st century are a direct consequence of the most important technological innovations of the 20th century.

New technology is arriving faster than ever and holds the promise of solving many of the world’s pressing challenges such as food and water security, energy sustainability and personalised medicine.

Lighter, cheaper and flexible electronics made from organic materials have found endless practical applications and drugs are being delivered via nanotechnology at the molecular level, at the moment just in medical labs.

However, outdated government regulations, inadequate existing funding models for research and uninformed public opinion are the greatest challenges in effectively moving emerging technologies from the research labs to people’s lives.

1) Robotics 2.0

A new generation of robotics takes machines away from just automating the most manual manufacturing assembly line tasks and orchestrates them to collaborate in creating more advanced assemblies, subassemblies and complete products. Collaborative robotics can accelerate time-to-market, improve production accuracy and reduce rework. By using GPS technology that is commonly available in smartphones, robots can be used in precision agriculture for weed control and harvesting.

We’ve seen robots that can walk like an ape and run like a cheetah, robots that can mix a perfect martini, help the disabled, or drive you to the store. Robots could replace soldiers on the battlefield. In Japan, robots are being tested in nursing roles: they help patients out of bed and support stroke victims in regaining control of their limbs.

Artificial intelligence, machine learning and computer vision are constantly developing and perfecting new technologies that “enable the machine” to perceive and respond to its ever-changing environment. Emergent AI is the nascent field of how systems can learn automatically by assimilating large volumes of information. An example of this is how the Watson system developed by IBM is now being deployed in oncology to assist in diagnosis and personalised, evidence-based treatment options for cancer patients.

2) Neuromorphic Engineering

Neuromorphic engineering, also known as neuromorphic computing started as a concept developed by Carver Mead in the late 1980s, describing the use of very-large-scale integration (VLSI) systems containing electronic analogue circuits to mimic neurobiological architectures present in the nervous system.

A key aspect of neuromorphic engineering is understanding how the morphology of individual neurones, circuits and overall architectures creates desirable computations, affects how information is represented, influences robustness to damage, incorporates learning and development, adapts to local change (plasticity), and facilitates evolutionary change. Neuromorphic computing is the next stage in machine learning.

IBM’s million-“neurone” TrueNorth chip, revealed in prototype in August 2014, has a power efficiency for certain tasks that is hundreds of times better than conventional CPUs and comparable for the first time to the human cortex. The challenge here remains creating code that can realise the potential of the TrueNorth chip, an area IBM continues to invest in today.

3) Intelligent Nanobots – Drones

Again, emergent AI and computer vision will provide drones with human-like capabilities, allowing them to complete tasks too dangerous or remote for humans, such as checking electric power lines or delivering medical supplies in an emergency.

Autonomous drones will improve agricultural yields by collecting and processing vast amounts of visual data from the air, allowing precise and efficient use of inputs such as fertiliser and irrigation.

Ambulance drones could deliver vital medical supplies along with “on screen” instructions. A drone with a mounted camera can “learn” about its surroundings with no prior information about the environment or the objects within it: using reference points seen from different angles, it builds a 3D map of its surroundings, with additional sensors picking up barometric and ultrasonic data. Autopilot software then uses all this data to navigate safely and even seek out specific objects. Autonomous. Intelligent. Swarming. Nano drones.

4) 3D Printing (yes, we can’t get enough of it)

Imagine a future in which a device connected to a computer can print a solid object. A future in which we can have tangible goods as well as intangible services delivered to our desktops or high-street shops over the Internet.

And a future in which the everyday “atomisation” of virtual objects into hard reality has turned the mass pre-production and stock-holding of a wide range of goods and spare parts into no more than an historical legacy.

Emerging 3D-printing technologies could eventually lead to lighter, more efficient plane parts that save fuel on your flights, and to replacement body parts – from printed knickers to 3D-printed pills, and from synthetic hearts to 3D-printed homes on Mars and other planets.

5) Precision Medicine

From heart disease to cancer, all have a genetic component. Cancer is best described as a disease of the genome. The ability to sequence a patient’s whole genome is close to entering the clinic in cancer hospitals.

We’ve come a long way in technology since we sequenced the first genome, and as it gets faster and cheaper we will be able to both personalise treatments and learn more about the enormous number of genetic changes that lead to each type and subtype of cancer.

With digitisation, doctors will be able to make decisions about a patient’s cancer treatment informed by a tumour’s genetic make-up.

This new knowledge is also making precision medicine a reality by enabling the development of highly targeted therapies that offer the potential for improved treatment outcomes, especially for patients battling cancer. Finally, by combining whole-genome sequencing (or transcriptome sequencing) with advancing sequence-based drug technology, the future really looks amazing!

Top 5 Emerging Technologies In 2015

Monday, March 23, 2015

Future farming: technology utilization

Unmanned Aerial Vehicle (UAV) filming a combine harvester in wheat field in Provence-Alpes-Cote d'Azur, France.


 Today’s agriculture has transformed into a high-tech enterprise that most 20th-century farmers might barely recognize.
After all, it was only around 100 years ago that farming in the U.S. transitioned from animal power to combustion engines. Over the past 20 years the global positioning system (GPS), electronic sensors and other new tools have moved farming even further into a technological wonderland.
Beyond the now de rigueur air conditioning and stereo system, a modern large tractor’s enclosed cabin includes computer displays indicating machine performance, field position and operating characteristics of attached machinery like seed planters.
And as amazing as today’s technologies are, they’re just the beginning. Self-driving machinery and flying robots able to automatically survey and treat crops will become commonplace on farms that practice what’s come to be called precision agriculture.
The ultimate purpose of all this high-tech gadgetry is optimization, from both an economic and an environmental standpoint. We only want to apply the optimal amount of any input (water, fertilizer, pesticide, fuel, labor) when and where it’s needed to efficiently produce high crop yields.

Global positioning gives hyperlocal info

GPS provides accurate location information at any point on or near the earth’s surface by calculating your distance from at least three orbiting satellites at once. So farming machines with GPS receivers are able to recognize their position within a farm field and adjust operation to maximize productivity or efficiency at that location.
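In a flat, two-dimensional caricature, the distance-to-satellites idea reduces to intersecting circles: subtracting one circle equation from the other two leaves a small linear system for the receiver's position. The sketch below uses invented reference points; real GPS works in three dimensions and must also solve for the receiver's clock error, which is why it needs a fourth satellite.

```python
def trilaterate(p0, p1, p2, d0, d1, d2):
    # Subtracting the first circle equation from the other two leaves a
    # 2x2 linear system in (x, y); solve it by Cramer's rule.
    (x0, y0), (x1, y1), (x2, y2) = p0, p1, p2
    a1, b1 = 2 * (x1 - x0), 2 * (y1 - y0)
    c1 = d0**2 - d1**2 + x1**2 - x0**2 + y1**2 - y0**2
    a2, b2 = 2 * (x2 - x0), 2 * (y2 - y0)
    c2 = d0**2 - d2**2 + x2**2 - x0**2 + y2**2 - y0**2
    det = a1 * b2 - a2 * b1
    return (c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det

# A receiver at (3, 4), given exact distances to three reference points:
pos = trilaterate((0, 0), (10, 0), (0, 10), 5.0, 65**0.5, 45**0.5)
print(pos)  # recovers (3, 4) up to floating-point error
```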
Take the example of soil fertility. The farmer uses a GPS receiver to locate preselected field positions to collect soil samples. Then a lab analyzes the samples and creates a fertility map in a geographic information system. That’s essentially a computer database program adept at dealing with geographic data and mapping. Using the map, a farmer can then prescribe the amount of fertilizer for each field location that was sampled. Variable-rate technology (VRT) fertilizer applicators dispense exactly the amount required across the field. This process is an example of what’s come to be known as precision agriculture.
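The prescription step in that soil-fertility example can be pictured as a lookup from field position to sampled fertility to applied rate. The sample points, units and target value below are all invented; a real system would interpolate a full fertility map rather than snap to the nearest sample.

```python
# Invented fertility samples: (easting, northing) in metres -> soil nitrogen
samples = {(0, 0): 12, (100, 0): 25, (0, 100): 18, (100, 100): 30}
TARGET = 35  # invented target nutrient level

def prescribed_rate(x, y):
    # Snap to the nearest sampled point (a real system would interpolate
    # a full fertility map) and apply only the measured shortfall.
    nearest = min(samples, key=lambda p: (p[0] - x) ** 2 + (p[1] - y) ** 2)
    return max(TARGET - samples[nearest], 0)

print(prescribed_rate(90, 95))  # near the richest sample → 5
print(prescribed_rate(5, 5))    # near the poorest sample → 23
```

The VRT applicator simply evaluates something like this continuously as its GPS position changes across the field.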

Info, analysis, tools

Precision agriculture requires three things to be successful. It needs site-specific information, which the soil-fertility map satisfies. It requires the ability to understand and make decisions based on that site-specific information. Decision-making is often aided by computer models that mathematically and statistically analyze relationships between variables like soil fertility and the yield of the crop.
Finally, the farmer must have the physical tools to apply the management decisions. In the example, the GPS-enabled VRT fertilizer applicator serves this purpose by automatically adjusting its rate as appropriate for each field position. Other examples of precision agriculture involve varying the rate of planting seeds in the field according to soil type and using sensors to identify the presence of weeds, diseases, or insects so that pesticides can be applied only where needed.
Examples of remote sensing in agriculture, top to bottom: vegetation density, water deficit and crop stress.
IMAGE: SUSAN MORAN/NASA
Site-specific information goes far beyond maps of soil conditions and yield to include even satellite pictures that can indicate crop health across the field. Such remotely sensed images are also commonly collected from aircraft. Now unmanned aerial vehicles (UAVs, or drones) can collect highly detailed images of crop and field characteristics. These images, whether analyzed visually or by computer, show differences in the amount of reflected light that can then be related to plant health or soil type, for example. Clear crop-health differences in images – diseased areas appear much darker in this case – have been used to delineate the presence of cotton root rot, a devastating and persistent soilborne fungal disease. Once disease extent is identified in a field, future treatments can be applied only where the disease exists. Advantages of UAVs include relatively low cost per flight and high image detail, but the legal framework for their use in agriculture remains under development.
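A standard way of turning those "differences in the amount of reflected light" into a single crop-health number is a vegetation index such as NDVI, which contrasts near-infrared reflectance (high for healthy leaves) with red reflectance (absorbed by chlorophyll). The reflectance values below are illustrative, not from any particular sensor.

```python
def ndvi(nir, red):
    # Normalized Difference Vegetation Index: healthy vegetation reflects
    # strongly in the near-infrared and absorbs red light, so values near 1
    # indicate vigorous growth and values near 0 bare or stressed ground.
    return (nir - red) / (nir + red)

print(ndvi(0.50, 0.08))  # vigorous crop: high index
print(ndvi(0.30, 0.25))  # dark, stressed area: index near zero
```

Mapping an index like this per pixel is how diseased patches, such as the darker cotton-root-rot areas, stand out in drone imagery.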

Let’s automate

Automatic guidance, whereby a GPS-based system steers the tractor in a much more precise pattern than the driver is capable of, is a tremendous success story. Safety concerns currently limit completely driverless capability to smaller machines. Fully autonomous or robotic field machines have begun to be employed in small-scale, high profit-margin agriculture such as wine grapes, nursery plants and some fruits and vegetables.
Autonomous machines can replace people performing tedious tasks, such as hand-harvesting vegetables. They use sensor technologies, including machine vision that can detect things like location and size of stalks and leaves to inform their mechanical processes. Japan is a trend leader in this area. Typically, agriculture is performed on smaller fields and plots there, and the country is an innovator in robotics. But autonomous machines are becoming more evident in the U.S., particularly in California where much of the country’s specialty crops are grown.
The development of flying robots gives rise to the possibility that most field-crop scouting currently done by humans could be replaced by UAVs with machine vision and hand-like grippers. Many scouting tasks, such as for insect pests, require someone to walk to distant locations in a field, grasp plant leaves on representative plants and turn them over to see the presence or absence of insects. Researchers are developing technologies to enable such flying robots to do this without human involvement.

Breeding + sensors + robots

High-throughput plant phenotyping (HTPP) is an up-and-coming precision agriculture technology at the intersection of genetics, sensors and robotics. It is used to develop new varieties or “lines” of a crop to improve characteristics such as nutritive content and drought and pest tolerance. HTPP employs multiple sensors to measure important physical characteristics of plants, such as height; leaf number, size, shape, angle, color, wilting; stalk thickness; number of fruiting positions. These are examples of phenotypic traits, the physical expression of what a plant’s genes code for. Scientists can compare these measurements to already-known genetic markers for a particular plant variety.
The sensor combinations can very quickly measure phenotypic traits on thousands of plants on a regular basis, enabling breeders and geneticists to decide which varieties to include or exclude in further testing, tremendously speeding up further research to improve crops.
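The selection step described above can be sketched very simply: given sensor-derived trait measurements for many breeding lines, keep the top performers for the next round of trials. The line names, traits and numbers below are invented for illustration.

```python
# Illustrative sketch of an HTPP selection step: rank breeding lines by
# the mean of one measured phenotypic trait and keep the best performers.

from statistics import mean

# Hypothetical sensor readings: three replicate plants per line.
measurements = {
    "line_A": {"height_cm": [92, 95, 90], "stalk_mm": [21, 22, 20]},
    "line_B": {"height_cm": [110, 108, 112], "stalk_mm": [25, 26, 24]},
    "line_C": {"height_cm": [85, 83, 88], "stalk_mm": [18, 19, 17]},
}

def rank_lines(data, trait):
    """Sort breeding lines by mean value of one trait, best first."""
    return sorted(data, key=lambda line: mean(data[line][trait]), reverse=True)

def select_top(data, trait, keep):
    """Keep the `keep` best lines for the next round of field trials."""
    return rank_lines(data, trait)[:keep]

print(select_top(measurements, "height_cm", keep=2))  # ['line_B', 'line_A']
```

In practice breeders weigh many traits at once and cross-reference them with genetic markers, but the speed-up comes from exactly this kind of automated measure-rank-cull loop running over thousands of plants.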

Just another day on the future farm?

IMAGE: FLICKR, MAURICIO LIMA
Agricultural production has come so far in even the past couple decades that it’s hard to imagine what it will look like in a few more. But the pace of high-tech innovations in agriculture is only increasing. Don’t be surprised if, 10 years from now, you drive down a rural highway and see a very small helicopter flying over a field, stopping to descend into the crop, use robotic grippers to manipulate leaves, cameras and machine vision to look for insects, and then rise back above the crop canopy and head toward its next scouting location. All with nary a human being in sight.

This technology is the future of farming

Sunday, February 22, 2015


Scientists are developing new contact lenses that could give wearers superhero-like vision.
The revolutionary lenses would not only allow wearers to zoom in and out, but may also offer hope to thousands of elderly people suffering from vision loss.
The prototype lenses, presented yesterday at the American Association for the Advancement of Science’s annual meeting in San Jose, California, contain tiny aluminium telescopes that would interact with a pair of specially designed glasses, allowing you to toggle between normal and ‘zoomed in’ viewing.
The operating instructions tell users to wink the right eye to zoom and the left to zoom out.
 The prototype lenses

The lenses, which were first developed with the US Defence Advance Research Projects Agency as super-thin cameras for aerial drones, have been retooled as a possible solution for those with age-related macular degeneration.
Macular degeneration occurs when the light receptors on the inner surface of the eye are lost, resulting in blurred central vision. This can make performing normal tasks – such as reading – next to impossible.
Eric Tremblay, from the Swiss Federal Institute of Technology in Lausanne, said: “We think these lenses hold a lot of promise for low vision and age-related macular degeneration.
“At this point, it is still research but we are very hopeful it will eventually become a real option for people with age-related macular degeneration.”
However, before the lenses can be rolled out scientists need to create a lighter, more ‘breathable’, version.
The rigid prototypes, which still allow light through the centre of the lens, are bigger and thicker than traditional contact lenses and are surrounded by mirrors.
It is these mirrors that provide the ‘superhero’ element of the contact lenses: by bouncing light around inside the lens, the view as seen by the wearer is magnified.
The spectacles would control the level of magnification.  

Scientists develop prototype contact lenses that allow wearers to zoom in and out

Wednesday, December 24, 2014

A new microscope attachment allows smartphone users to take a closer look at fluorescently labeled DNA.

Electrical engineer and bioengineer Aydogan Ozcan of the University of California, Los Angeles, and his colleagues have developed a smartphone attachment that can estimate the lengths of DNA molecules in a sample, according to a study published this month (December 10) in ACS Nano. The unit—which weighs less than 190 grams, costs only $400, and runs on three AAA batteries—can reveal copy-number variations and other genetic features of disease, making it a potential tool for diagnostic field tests, Chemical & Engineering News (C&EN) reported.

The researchers demonstrated the smartphone microscope’s utility by analyzing purified solutions of fluorescently labeled DNA molecules. By putting the solution between two coverslips, they effectively stretched the DNA into straight lines; then, a compact blue laser within the fluorescence microscope attachment shone on the DNA, and the smartphone took a series of photos that were sent to a remote server to calculate the strand lengths. Testing the setup on DNA molecules that ranged from 10,000 to 48,000 base pairs, the team found that the smartphone microscope could estimate length to within about 1,000 base pairs, similar to the error of conventional bench-top fluorescence microscopes, according to C&EN.
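The core of the server-side calculation can be illustrated with a back-of-the-envelope conversion: a strand's measured length in image pixels is turned into base pairs using the roughly 0.34 nm axial rise per base pair of B-form DNA. The pixel scale and stretching factor below are assumptions for illustration; the actual analysis pipeline is more involved.

```python
# Hedged sketch: convert a stretched DNA strand's pixel length to an
# estimated number of base pairs. NM_PER_BP is the ~0.34 nm rise per
# base pair of B-DNA; STRETCH_FACTOR and the pixel scale are assumed.

NM_PER_BP = 0.34        # axial rise of B-form DNA per base pair
STRETCH_FACTOR = 0.85   # assumed: stretched strands reach ~85% of contour length

def pixels_to_bp(length_px, nm_per_px):
    """Estimate base pairs from a strand's measured length in pixels."""
    contour_nm = (length_px * nm_per_px) / STRETCH_FACTOR
    return round(contour_nm / NM_PER_BP)

# e.g. a strand spanning 70 pixels at an assumed 200 nm per pixel
# lands near the 48,000-bp upper end of the range tested in the study:
print(pixels_to_bp(70, nm_per_px=200))
```

Under these assumed numbers the estimate comes out around 48,000 base pairs, which shows why a roughly 1,000-bp error corresponds to only a few pixels of measurement noise.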

Other groups are also hoping to make smartphone microscopes, to bring the power of microscope-based diagnostics to parts of the world lacking the necessary infrastructure—often the places most in need of strategies to quickly diagnose infectious diseases. Ozcan compares the movement to the personal computer revolution. “If you look at the early computers, they were bulky, they were extremely expensive,” he told The Scientist last year. Now, “[computers] are portable . . . and almost anyone can afford them. The same thing is going on today [with microscopy]. We are miniaturizing our micro- and nano-analysis tools. We’re making them more affordable; we’re making them more powerful.”

Measuring DNA with a Smartphone: a Next-Generation Technology

Wednesday, November 12, 2014

Mind-controlled transgene expression by a wireless-powered optogenetic designer cell implant

Today I would like to discuss a new invention from the research field. Imagine a world where we can control anything with our minds. Gadgets are already in development that let us control other devices using our 'brain waves', or in simple words, by thought alone. But researchers have now gone a step further, developing a system in which gene expression itself can be controlled by thought, via brain waves.

Schematic representation of mind-controlled transgene expression.

Future Treatment Technique for Arthritis, Diabetes and Obesity

Mammalian synthetic biology has significantly advanced the design of gene switches that are responsive to traceless cues such as light, gas and radio waves; complex gene circuits, including oscillators, cancer-killing gene classifiers and programmable biocomputers; as well as prosthetic gene networks that provide treatment strategies for gouty arthritis, diabetes and obesity. Akin to synthetic biology promoting prosthetic gene networks for the treatment of metabolic disorders, cybernetics advances the design of functional man–machine interfaces in which brain–computer interfaces (BCI) process brain waves to control electromechanical prostheses, such as bionic extremities and even wheelchairs. The advent of synthetic optogenetic devices that use power-controlled, light-adjustable therapeutic interventions will enable the merging of synthetic biology with cybernetics, allowing brain waves to remotely control transgene expression and cellular behaviour in a wireless manner.

Future Of Gene And Cell Based Treatments using optogenetic Implant

Synthetic devices for traceless remote control of gene expression may provide new treatment opportunities in future gene- and cell-based therapies. Here we report the design of a synthetic mind-controlled gene switch that enables human brain activities and mental states to wirelessly programme the transgene expression in human cells. An electroencephalography (EEG)-based brain–computer interface (BCI) processing mental state-specific brain waves programs an inductively linked wireless-powered optogenetic implant containing designer cells engineered for near-infrared (NIR) light-adjustable expression of the human glycoprotein SEAP (secreted alkaline phosphatase). The synthetic optogenetic signalling pathway interfacing the BCI with target gene expression consists of an engineered NIR light-activated bacterial diguanylate cyclase (DGCL) producing the orthogonal second messenger cyclic diguanosine monophosphate (c-di-GMP), which triggers the stimulator of interferon genes (STING)-dependent induction of synthetic interferon-β promoters. Humans generating different mental states (biofeedback control, concentration, meditation) can differentially control SEAP production of the designer cells in culture and of subcutaneous wireless-powered optogenetic implants in mice.

Optogenetic devices operating in the near-infrared (NIR) spectral range combine high tissue penetration power with negligible phototoxicity. The phototrophic bacterium Rhodobacter sphaeroides is able to capture NIR light with the multidomain protein BphG1, which contains an amino-terminal (N-terminal) NIR light sensor and carboxyl-terminal diguanylate cyclase (DGC) domain, as well as phosphodiesterase (PDE) activities, to control the level of the ubiquitous bacterial second messenger cyclic diguanosine monophosphate (c-di-GMP) and orchestrate the environmental light-triggered transition from motile cells to biofilm-forming communities. Stimulator of interferon genes (STING) was recently identified as a novel player in human innate immunity that functions as a cyclic di-nucleotide sensor (cGAMP, c-di-AMP, c-di-GMP) to detect the presence of cytosolic DNA via cyclic-GMP–AMP (cGAMP) synthase (cGAS)-mediated production of cGAMP, as well as second messengers (c-di-AMP, c-di-GMP) released from intracellular pathogens. Activated STING specifies the phosphorylation of the interferon-regulatory factor 3 (IRF3) by tank-binding kinase 1, which results in the nuclear translocation of IRF3, binding to IRF3-specific operators and induction of type I interferon promoters. In this study, we rewire BCI-triggered NIR light-based induction of c-di-GMP production by BphG1 variants to c-di-GMP-dependent STING-driven activation of optimized interferon-responsive promoters to enable mind-controlled transgene expression in mammalian designer cells inside subcutaneous wireless-powered optogenetic implants in mice. Cybernetic control of synthetic gene networks in designer mammalian cells may pave the way for mind-genetic interfaces in future treatment strategies.

Wireless-powered optogenetic implant

The wireless-powered optogenetic implant was a fully sealed, all-in-one biocompatible device comprising a power receiver, which was remotely powered by electromagnetic induction controlled by the field generator, and a 700-nm NIR LED (λmax = 700 nm, 20 mW sr⁻¹; cat. no. ELD-700-524-1; Roithner Lasertechnik, Vienna, Austria), which enabled light-programmable transgene expression of designer cells inside the semi-permeable cultivation chamber (Fig. a–c). The power receiver’s antenna was assembled from three orthogonal copper coils (0.1-mm copper wire with 130 windings on a 7 × 7 × 7 mm ferrite cube), three in-series resonance capacitors and six Schottky diodes, which integrated and rectified the current of the three coils and powered the NIR LED in an orientation- and motion-independent manner (Fig. b). The entire power receiver, including the base of the NIR LED, was moulded into a spherical polycarbonate cap containing polydimethylsiloxane (PDMS; cat. no. 701912-1, Sigma-Aldrich, Buchs, Switzerland) and fitted to a custom-adapted 500-μl polycarbonate chamber (0.4 × 0.9 mm) with semi-permeable polyethersulfone <300 kDa-cutoff membranes (PES Membrane, VS0651, Sartorius Stedim Biotech, Germany) on two sides (Fig. a). The device was sealed by polymerizing the PDMS for 30 min at 50 °C. The coupling intensity of the wireless-powered optogenetic implant was profiled in the space above the field generator by scoring the wireless transmission of power to the implant. A total of 500 μl of a pSO3/pSO4- or pSO3/pSBC-2 (negative control)-transgenic HEK-293F cell suspension (1 × 10⁶ cells) was loaded via a syringe through a hole in the polycarbonate side of the culture chamber, which was sealed with a PDMS plug before implanting the device subcutaneously into the mouse.
Wireless-powered optogenetic implant.
(a) Wireless-powered implant on the field generator with an illuminated NIR LED. A 1 CHF coin (23 mm in diameter) serves as a size indicator. The 0.5-ml cultivation chamber containing semi-permeable PES membranes on both sides was moulded to a spherical polycarbonate cap containing a PDMS-sealed three-dimensional (3D) receiver antenna wired to the NIR LED. (b) 3D receiver antenna wired via the receiver circuit (receiver coils, resonance capacitors, Schottky diodes; Supplementary Figs 5 and 11) to the NIR LED. (c) Quality-control test of the custom-made wireless-powered optogenetic implants illuminated while standing on the powered field generator. (d) Mouse with a subcutaneous wireless-powered optogenetic implant, the activity of which can be observed through the skin. (e) Field generator.

Mind-controlled transgene expression in mice

Cell-containing wireless-powered optogenetic implants were subcutaneously implanted on the backs of short-term isoflurane-anaesthetized wild-type mice (Oncins France souche 1, Charles River Laboratories, Lyon, France), and the cage containing the treated animals was placed on the field generator connected to the BCI. The human subject wearing the BCI headset performed three different mental states, biofeedback, concentration and meditation, which were integrated (5/25/25 min) and converted to threshold (meditation-meter values 90/75/75)-dependent activation of the time-delay relay that switched the NIR LED in the wireless-powered optogenetic implant ON for defined periods of time (60 min/30 s/30 s) and induced light-triggered SEAP expression in the implanted cells. After 48 and 144 h, blood samples were collected retro-orbitally, and serum SEAP levels were determined as described above. The implants of one treatment group were removed after SEAP profiling at 48 h, and the serum SEAP levels were quantified again 96 h after implant removal. Control mice received wireless-powered optogenetic implants containing pSO3/pSBC-2-transfected HEK-293F cells. Throughout the entire animal study, five 4-week-old female Oncins France souche 1 wild-type mice of the delivered pool were randomly allocated to the individual treatment groups. Neither samples nor animals were excluded from the study, and blood-sample analysis was blinded.
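The BCI-to-relay logic above reduces to a simple threshold rule: each mental state has a meter threshold, and crossing it switches the implant's NIR LED on for a state-specific period. The sketch below mirrors the thresholds (90/75/75) and LED on-times (60 min/30 s/30 s) quoted in the text; the program names and interfaces are stand-ins, not the authors' actual software.

```python
# Hedged sketch of the threshold-dependent relay activation described in
# the text. Thresholds and LED on-times follow the quoted figures
# (meditation-meter values 90/75/75; LED on for 60 min/30 s/30 s);
# everything else is an illustrative stand-in.

from dataclasses import dataclass

@dataclass
class MentalStateProgram:
    name: str
    threshold: int   # meter value that triggers the time-delay relay
    led_on_s: float  # how long the implant's NIR LED stays on once triggered

PROGRAMS = {
    "biofeedback":   MentalStateProgram("biofeedback", 90, 60 * 60),
    "concentration": MentalStateProgram("concentration", 75, 30),
    "meditation":    MentalStateProgram("meditation", 75, 30),
}

def relay_action(state, meter_value):
    """Return the LED on-time in seconds, or 0 if below threshold."""
    prog = PROGRAMS[state]
    return prog.led_on_s if meter_value >= prog.threshold else 0

print(relay_action("meditation", 80))   # 30 - threshold 75 crossed
print(relay_action("biofeedback", 85))  # 0 - below the 90 threshold
```

In the actual experiment this decision runs continuously on the EEG stream, but the design choice is the same: the brain signal only gates a binary relay, while the duration and intensity of illumination, and hence SEAP output, are fixed by the chosen program.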

Controlling Gene Expression With Our Mind

 
Hi-Tech Talk © 2015 - Designed by Templateism.com