Thursday 23 January 2014


Manned mission to Mars by 2030s actually possible


A panel of experts has claimed that a manned mission to Mars could be feasible by the 2030s.
A workshop group of more than 60 individuals representing more than 30 government, industry, academic and other organizations has found that a NASA-led manned mission to Mars is feasible if the space agency's budget is restored to pre-sequestration levels.

Putting the first humans on the Red Planet would also require international cooperation and private industry support.
There is a growing consensus in the space community that a manned mission to Mars should be a priority worth working toward in the coming years, according to Chris Carberry, the executive director of Explore Mars Inc., the organization that hosted the workshop with the American Astronautical Society.

Carberry told SPACE.com that making such a mission feasible and affordable requires a sustainable, consistent budget, one that can be predicted from year to year and that does not get cancelled by the next administration.
While Carberry said it is possible to launch a manned mission to Mars by the 2030s under pre-sequestration budget levels, he added that a NASA-led human mission to Mars will probably never launch under current budgetary constraints.

Shailesh Shukla
directoratace@gmail.com

Contact lens as a diagnostic tool

Advances in mini- and microcomputers are turning science fiction into reality. A couple of months ago, we had a visitor from MIT to our lab who was wearing a strange kind of spectacles. When I asked him about it, he said it was actually a wearable computer, Google Glass, made by Google. And of course we now know of a wristwatch computer made by Samsung. Last week Google announced the introduction of a wearable contact lens which would monitor the sugar levels in your tears and let you know whether or not you are diabetic. With this, you no longer need to ‘invade’ the body, pricking your finger to draw blood and wetting a litmus-type paper to read your sugar levels. And we all thought that a contact lens was worn simply to correct eyesight.
Contact lenses can be used to diagnose glaucoma, blood pressure and diabetes. Photo: R. Ragu
So, with the Google contact lens, there is literally more than meets the eye! We have come a long way since 1508, when the great Italian Leonardo da Vinci thought up the idea of slipping a glass piece over the eye to correct vision, and 1823, when the British physicist John Herschel came up with a practical design.
Fifty years later, such a glass was made, though it covered the entire eye. With the advent of plastics, the first lightweight contact lens was made in the year I was born, 1939, covering not the whole eye but just the corneal surface.
But it was Drs. Otto Wichterle and Drahoslav Lim who, in 1959, introduced the hydrophilic ‘soft’ contact lens. Currently we have contact lenses that you can sleep in, lenses that are disposable after each use, and those meant for fashionistas.
A typical contact lens is lighter than a feather, has a diameter of about 14 mm and a curvature of about 8.7 mm, and fits snugly over the cornea, held in place thanks to the surface tension of the tear fluid that wets it.
And it is this tear fluid that holds the key to the diagnostics. Produced by the tear glands on the outer surface of the eye, it contains hundreds of proteins and metabolite molecules, and is thus an indicator of the health of the body.
Non-invasive
Since one does not have to pierce the body to collect blood, but can simply study the tear fluid as it is held between the cornea and the contact lens, it becomes an attractive diagnostic fluid.
All that one needs to do is to fit the contact lens with an appropriate sensor which measures chosen properties or levels of any component in it.
This is easier said than done, and it is here that innovation has played a role. Early on, in the 1990s and 2000s, Drs Matteo Leonardi and Rene Goedkoop from Switzerland, supported by ‘Sensimed’, used the contact lens to measure the pressure within the eyeball, also called the intraocular pressure (IOP), which is an indication of the pressure that the optic nerve feels.
If the IOP becomes higher than normal, the optic nerve can become inefficient over time and can eventually lose its activity, leading to loss of vision. This condition is termed glaucoma, a silent stealer of vision.
What the duo did was to fit a circular strain gauge along the edge of the contact lens to measure the changes in the circumference of the outer surface of the eye due to IOP, read out as electrical signals. This was an alternative to the conventional method of using a pressure sensor (tonometer), with which the eye doctor contacts and slightly presses the curved corneal surface (applanation) to measure the intraocular pressure.
Any change beyond the accepted ‘normal’ range of IOP would be diagnosed as possible glaucoma.
The Leonardi-Goedkoop device, termed Triggerfish, does this in a more convenient way. Likewise, Drs Stodtmeister and Jonas Jost of Germany devised a method to measure the systolic and diastolic pressures of the ophthalmic artery, and have used it to make blood pressure measurements.
And in all this, the main function of the contact lens (to correct the refractive ‘power’) was not affected, so the lens does double duty.
What Drs Brian Otis and Babak Parviz of Google have done is to put in a sensor on the edges of the contact lens, which measures the level of glucose in the tear fluid which bathes the contact lens, and thus monitors diabetic status in a continuous manner.
Currently it has been tried out on a series of subjects, and awaits FDA clearance for marketing and widespread use. Dr Parviz, who was earlier at the University of Washington, Seattle, had already used the contact lens as a GPS device to let wearers know where they are going. This was done by putting in a tiny integrated circuit, powered by a cell phone in the pocket, which contains a GPS setup and can voice-announce directions.
This bionic lens has a wireless communication system, RF power and transmission capability. Its usefulness to the visually handicapped is obvious. Such use of the contact lens as a multifunctional device would certainly have pleased da Vinci.
Shailesh Shukla
directoratace@gmail.com

A new method for growing Shiitake mushroom


Among the various cultivated species of mushrooms, the Shiitake variety is in good demand among consumers for its taste. Consumers in Northern India particularly prefer this mushroom, since it is believed to have medicinal qualities. Presently, China and Japan are the bulk producers of this prized variety.
Until recently there was no proper technology to grow this variety on a successful commercial scale, but the Directorate of Mushroom Research (DMR), situated at Chambaghat in Solan district, and the Indian Institute of Horticultural Research (IIHR), at Hessaraghatta near Bangalore, have developed new techniques for growing this crop.
A farmer, Mr. Vikas Banyal, from Solan district, Himachal Pradesh, has further refined the growing technology by using willow wood as a substrate. He is the first farmer in the country to use willow wood as a substrate to grow the Shiitake variety. According to Mr. Vikas, this method gives a better and greater yield.
Initially Mr. Vikas was growing Shiitake on sawdust but was not able to get good production. He found scanty references in the literature about using willow logs as a growing medium, and contacted the university, which provided him willow logs to try as a substrate.
“The substrate, that is the medium, is very important for mushroom cultivation. Just as healthy soil helps a good plant crop, a good medium alone can help get a good yield.
“My initial attempts failed because I used sawdust that was poor in quality. It was then that some mushroom cultivation experts from the U.S. visited my farm, and while interacting with them I got to know that they use logs from trees to grow mushrooms. They also advised me to try out the method on some trees popularly grown in my region,” he says.
The farmer searched the literature on the subject and got in touch with the Dr. Y. S. Parmar University of Horticulture and Forestry in his region.
Official help

The University was quite impressed by his dedication and perseverance and supplied him about 100 willow logs initially. From then on there was no looking back for Mr. Vikas. With an investment of just Rs. 6,000 some years back, today he has established a company worth nearly Rs. 4 crores, all earned from mushroom cultivation.
Explaining the procedure, the entrepreneur says, “Willow logs of 40 inches in length and three to four inches in diameter are ideal. Holes are drilled into the logs, and spawn (in the form of bullets) is inserted into the holes and sealed with wax. The logs are kept in the open under shade. Fruiting of Shiitake starts in just three months and continues for four to five years. The technique is cost effective and also consumes less time.”
In addition to the logs, he also used willow sawdust to grow the mushroom, which proved even more effective, as harvesting of the crop started in just 45 days.
He could harvest on average 750 gm of mushroom from one kg of willow sawdust. The harvested mushrooms are sold fresh and fetch Rs. 200-500 per kg in the local market.
“This mushroom has a good shelf life, and dried Shiitake fetches up to Rs. 2,000 per kg in the market. It can be grown in places where the temperature remains below 25 degrees Celsius, and it can be easily grown in the hilly regions of the northern, eastern and southern parts of the country.
In the south, cultivation can be done at Ooty, Coonoor, Chickmagalur, Kodagu, Kodaikanal, Munnar, Vagamon and Kudremukh,” says Dr. Harender Raj Gautam, Senior Scientist, Department of Plant Pathology, Dr. Y. S. Parmar University of Horticulture and Forestry, Nauni, Solan, Himachal Pradesh.
Dr. N.B. Singh, Director of Extension, says: “Mr. Vikas has been in mushroom cultivation for the last 25 years and has a multi-storeyed building on two acres in which he grows different varieties of mushroom.
We had supplied the willow logs as medium for growing the shiitake mushroom. Shiitake mushroom grown on 1,000 Kg sawdust of willow gives an income of Rs 2 lakh a year.
The farmer’s income also comes from selling the seeds of different varieties and compost to several farmers in Himachal Pradesh and many parts of northern India. In fact, he is considered an authority in the state on this technology.”
Shailesh Shukla
directoratace@gmail.com

‘Artificial cannabis’ may be used to reduce pain, joint inflammation of arthritis patients



Scientists have developed an artificial cannabis-like molecule that could reduce pain and joint inflammation in osteoarthritis. Researchers from the University of Nottingham developed the synthetic compound which inhibits a pain-sensing pathway in the spinal cord known as the cannabinoid receptor 2 (CB2). Although cannabis can effectively relieve pain, its use in medicine is limited because of its other psychological effects. 

The compound, called JWH133, is completely synthetic but is designed to selectively target CB2 in a similar way to the drug. Levels of the CB2 receptor in the spinal cord have been shown to be closely linked to the severity of pain among osteoarthritis sufferers, ‘The Telegraph’ reported. Osteoarthritis occurs when the cartilage at the ends of bones wears away, causing joint pain and stiffness. There is no effective drug treatment to slow the progression of the condition; current interventions include pain relief, exercise, physiotherapy, weight loss and joint replacement.

“This finding is significant, as spinal and brain pain signalling pathways are known to make a major contribution to pain associated with osteoarthritis,” said Professor Victoria Chapman, who led the study. “These new data support the further evaluation of the selective cannabinoid-based interventions for the treatment of osteoarthritis pain,” she said. The study was published in the journal PLoS ONE.

Shailesh Shukla
directoratace@gmail.com

Wednesday 22 January 2014


Milky Way galaxy may have formed inside-out


Our Milky Way galaxy may have formed from the inside out, according to a ground-breaking study which provides new insight into galactic evolution. 

Data from the Gaia-ESO project has provided evidence backing theoretically predicted divisions in the chemical composition of the stars that make up the Milky Way's disc - the vast collection of giant gas clouds and billions of stars that gives our galaxy its 'flying saucer' shape. 

The Milky Way

The research suggests that stars in the inner regions of the galactic disc were the first to form, supporting ideas that our galaxy grew from the inside-out. 

An international team of astronomers took detailed observations of stars with a wide range of ages and locations in the galactic disc to accurately determine their 'metallicity': the amount of chemical elements in a star other than hydrogen and helium, the two elements most stars are made from. 

Immediately after the Big Bang, the universe consisted almost entirely of hydrogen and helium, with levels of "contaminant metals" growing over time. 

Consequently, older stars have fewer heavy elements in their make-up, and so have lower metallicity, researchers said. 

The team have shown that older, 'metal-poor' stars inside the Solar Circle - the orbit of our Sun around the centre of the Milky Way - are far more likely to have high levels of magnesium. 

The higher level of the element inside the Solar Circle suggests this area contained more stars that "lived fast and died young" in the past, researchers said. 



The stars that lie in the outer regions of the galactic disc are predominantly younger, both 'metal-rich' and 'metal-poor', and have surprisingly low magnesium levels compared to their metallicity. 

This discovery signifies important differences in stellar evolution across the Milky Way disc, with very efficient and short star formation timescales occurring inside the Solar Circle; whereas, outside the Sun's orbit, star formation took much longer. 

"We have been able to shed new light on the timescale of chemical enrichment across the Milky Way disc, showing that outer regions of the disc take a much longer time to form," said Maria Bergemann from Cambridge University's Institute of Astronomy, who led the study. 

The study appears in the journal Astronomy and Astrophysics. 

Shailesh Shukla
directoratace@gmail.com


Increasing knowledge slows human brain, scientists say

Age mellows all, says the famous saying. Traditionally it is thought that age leads to a steady deterioration of brain function.

However, a study has for the first time argued that older brains may take longer to process ever increasing amounts of knowledge, and this has often been misidentified as declining capacity.

Experts now say that the accumulation of years of wisdom and increased knowledge about the world and its surroundings slows down human brain function with age, much like what happens to the hard drive of a computer as it fills up.

Older brains slow due to greater experience, rather than cognitive decline, the study now says.

The study, led by Dr Michael Ramscar of the University of Tuebingen, takes a critical look at the measures that are usually thought to show that our cognitive abilities decline across adulthood. Instead of finding evidence of decline, the team discovered that most standard cognitive measures are flawed, confusing increased knowledge for declining capacity.

Dr Ramscar's team used computers, programmed to act as though they were humans, to read a certain amount each day, learning new things along the way.

When the researchers let a computer "read" a limited amount, its performance on cognitive tests resembled that of a young adult.

However, if the same computer was exposed to data representing a lifetime of experiences, its performance looked like that of an older adult. Often it was slower, not because its processing capacity had declined, but because increased "experience" had caused the computer's database to grow, giving it more data to process, and that processing takes time.



"What does this finding mean for our understanding of our ageing minds, for example older adults' increased difficulties with word recall? These are traditionally thought to reveal how our memory for words deteriorates with age, but Big Data adds a twist to this idea," said Dr Ramscar.

"Technology now allows researchers to make quantitative estimates about the number of words an adult can be expected to learn across a lifetime, enabling the team to separate the challenge that increasing knowledge poses to memory from the actual performance of memory itself".

"Imagine someone who knows two people's birthdays and can recall them almost perfectly. Would you really want to say that person has a better memory than a person who knows the birthdays of 2000 people, but can only match the right person to the right birthday nine times out of ten?" asks Ramscar.

Shailesh Shukla
directoratace@gmail.com

No El Nino, yet 2013 fourth warmest year: US climate agency


Last year, 2013, was tied with 2003 as the fourth warmest year since records began in 1880, according to the US government's National Oceanic and Atmospheric Administration (NOAA). For the 37th consecutive year, global temperatures were higher than the 20th century average.
Using the same data but calculating slightly differently, NASA said that 2013 was tied for the seventh warmest year with 2006 and 2009. The difference between 4th place and 7th place is just two-hundredths of a degree. NASA had the "temperature anomaly" - how much the global temperature deviated from the average - pegged at 0.60°C and NOAA had 0.58°C.
"The long-term trends are very clear, they're not going to disappear, and people should be aware of that," Gavin Schmidt, Deputy Chief at NASA GISS, told reporters on a conference call Tuesday.
Both agencies said nine of the 10 warmest years on record have happened in the 21st century. The hottest year was 2010, according to NOAA.
Those longer trends show the world has seen "fairly dramatic warming" since the 1960s with "a smaller rate of warming over the last decade or so," said Thomas Karl, director of NOAA's National Climatic Data Center. In the past 50 years, the world annual temperature has increased by nearly 1.4 degrees Fahrenheit (0.8 degrees Celsius), according to NOAA data.
Regional analysis done by NOAA shows that most areas of the world experienced above-average annual temperatures. Over land, parts of central Asia, western Ethiopia, eastern Tanzania, and much of southern and western Australia were record warm, as were sections of the Arctic Ocean, a large swath of the southwestern Pacific Ocean along with parts of the central Pacific, and an area of the central Indian Ocean. Only part of the central United States was cooler than average over land. Small regions scattered across the eastern Pacific Ocean and an area in the Southern Ocean south of South America were cooler than average. No region of the globe was record cold during 2013.
Last year, the world had 41 billion-dollar weather disasters, the second highest number behind only 2010, according to insurance firm Aon Benfield, which tracks global disasters, AP reported. Since 2000, the world has averaged 28 such billion dollar disasters, which are adjusted for inflation.
Usually the weather event called El Nino, a warming of the central Pacific, is responsible for boosting already warm years into the world's hottest years. But in 2013, there was no El Nino.
The fact that a year with no El Nino "was so hot tells me that the climate really is shifting," said Andrew Dessler, a Texas A&M University climate scientist who was not part of either the NOAA or NASA teams, according to AP.
Shailesh Shukla
directoratace@gmail.com

Tuesday 21 January 2014

New drug target for cocaine addiction identified


Researchers from the Icahn School of Medicine at Mount Sinai have identified a new molecular mechanism through which cocaine alters the brain's reward circuits and causes addiction.
The preclinical research by Eric J. Nestler, MD, PhD, and colleagues reveals how an abundant enzyme and a synaptic gene affect a key reward circuit in the brain, changing the way genes are expressed in the nucleus accumbens.
The DNA itself does not change, but its "mark" activates or represses certain genes encoding synaptic proteins within the DNA.
The marks indicate epigenetic changes (changes made by enzymes) that alter the activity of the nucleus accumbens.

In a mouse model, the research team found that chronic cocaine administration increased levels of an enzyme called PARP-1, or poly(ADP-ribose) polymerase-1.
This increase in PARP-1 leads to an increase in its PAR marks at genes in the nucleus accumbens, contributing to long-term cocaine addiction.
Although this is the first time PARP-1 has been linked to cocaine addiction, PARP-1 has been under investigation for cancer treatment.
"This discovery provides new leads for the development of anti- addiction medications," the study's senior author, Eric Nestler, MD, PhD, Nash Family Professor of Neuroscience and Director of the Friedman Brain Institute, at the Icahn School of Medicine at Mount Sinai, said.
Dr. Nestler said that the research team is using PARP to identify other proteins regulated by cocaine. PARP inhibitors may also prove valuable in changing cocaine's addictive power.

Shailesh Kr Shukla
directoratace@gmail.com


Friday 17 January 2014

Specific Heat

The specific heat is the amount of heat per unit mass required to raise the temperature by one degree Celsius. The relationship between heat and temperature change is usually expressed in the form

Q = c m ΔT

where Q is the heat added, c is the specific heat, m is the mass and ΔT is the temperature change. The relationship does not apply if a phase change is encountered, because the heat added or removed during a phase change does not change the temperature.

The specific heat of water is 1 calorie/gram °C = 4.186 joule/gram °C, which is higher than that of any other common substance. As a result, water plays a very important role in temperature regulation. The specific heat per gram for water is much higher than that for a metal, as the water-metal example below describes. For most purposes, it is more meaningful to compare the molar specific heats of substances.
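To make the water-metal comparison concrete, here is a minimal Python sketch; the specific heats (4.186 J/g °C for water, 0.385 J/g °C for copper) are the values used in this article, while the function name and scenario are purely illustrative:

# Compare the temperature rise of equal masses of water and copper
# receiving equal amounts of heat, using dT = Q / (m * c).

def temp_rise(heat_j, mass_g, specific_heat_j_per_g_c):
    """Temperature rise (°C) of mass_g grams given heat_j joules."""
    return heat_j / (mass_g * specific_heat_j_per_g_c)

heat = 1000.0   # joules added to each sample
mass = 100.0    # grams of each substance

print(f"Water:  {temp_rise(heat, mass, 4.186):.1f} °C rise")   # ~2.4 °C
print(f"Copper: {temp_rise(heat, mass, 0.385):.1f} °C rise")   # ~26.0 °C

The same heat warms the copper more than ten times as much as the water, which is why large bodies of water are such effective temperature regulators.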
The molar specific heats of most solids at room temperature and above are nearly constant, in agreement with the Law of Dulong and Petit. At lower temperatures the specific heats drop as quantum processes become significant. The low temperature behavior is described by the Einstein-Debye model of specific heat.

The specific heat (also called specific heat capacity) is the amount of heat required to change a unit mass (or unit quantity, such as mole) of a substance by one degree in temperature. Therefore, unlike the extensive variable heat capacity, which depends on the quantity of material, specific heat is an intensive variable and has units of energy per mass per degree (or energy per number of moles per degree).
The heat capacity of a substance can differ depending on what extensive variables are held constant, with the quantity being held constant usually denoted with a subscript. For example, the specific heat at constant pressure is commonly denoted cp, while the specific heat at constant volume is commonly denoted cv. The specific heat of water at constant atmospheric pressure is

cp = 1 cal/(g °C) = 4.186 J/(g °C)         (1)

i.e., 1 calorie is needed per degree Kelvin (or Celsius) of temperature change for 1 gram of liquid water. In fact, the definition of (one of the several types of) the calorie is the amount of heat needed to change the temperature of 1 g of water by 1 °C at its temperature of maximum density (roughly 3.98 °C).
The heat capacity ratio is defined as the ratio of the specific heats of a substance at constant pressure and constant volume,

γ = cp / cv


Specific heat is another physical property of matter. All matter has a temperature associated with it. The temperature of matter is a direct measure of the motion of its molecules: the greater the motion, the higher the temperature.
Motion requires energy: the more energy matter has, the higher its temperature will be. Typically this energy is supplied by heat, and heat loss or gain by matter is an equivalent energy loss or gain.
With the observation above understood, we can now ask the following question: by how much will the temperature of an object increase or decrease with the gain or loss of heat energy? The answer is given by the specific heat (S) of the object, which is defined in the following way: take an object of mass m, put in x amount of heat and carefully note the temperature rise ΔT; then S is given by

S = x / (m ΔT)

In this definition mass is usually in either grams or kilograms and temperature is either in kelvin or degrees Celsius. Note that the specific heat is "per unit mass". Thus, the specific heat of a gallon of milk is equal to the specific heat of a quart of milk. A related quantity is the heat capacity (C) of an object. The relation between S and C is C = (mass of object) x (specific heat of object). A table of some common specific heats and heat capacities is given below:
Some common specific heats and heat capacities:

Substance    S (J/g °C)    C (J/°C) for 100 g
Air          1.01          101
Aluminum     0.902         90.2
Copper       0.385         38.5
Gold         0.129         12.9
Iron         0.450         45.0
Mercury      0.140         14.0
NaCl         0.864         86.4
Ice          2.03          203
Water        4.179         417.9
Consider the specific heat of copper, 0.385 J/g °C. What this means is that it takes 0.385 joules of heat to raise 1 gram of copper by 1 degree Celsius. Thus, if we take 1 gram of copper at 25 °C and add 0.385 joules of heat to it, we will find that the temperature of the copper will have risen to 26 °C. We can then ask: how much heat will it take to raise 2 g of copper by 1 °C? Clearly the answer is 0.385 J for each gram, or 2 x 0.385 J = 0.770 J. What about a pound of copper? A simple way of dealing with different masses of matter is to determine the heat capacity C as defined above. Note that C depends upon the size of the object, as opposed to S, which does not.
We are now in a position to do some calculations with S and C.
Example 1: How much energy does it take to raise the temperature of 50 g of copper by 10 °C?

Q = m x S x ΔT = (50 g) x (0.385 J/g °C) x (10 °C) = 192.5 J

Example 2: If we add 30 J of heat to 10 g of aluminum, by how much will its temperature increase?

ΔT = Q / (m x S) = (30 J) / ((10 g) x (0.902 J/g °C)) ≈ 3.3 °C

Thus, if the initial temperature of the aluminum was 20 °C, then after the heat is added the temperature will be about 23.3 °C.
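Both examples can be verified with a short Python sketch built directly on the definition S = x / (m ΔT); the helper function names are illustrative choices of my own, and the specific heats come from the table above:

# Q = m * S * dT and its rearrangement dT = Q / (m * S).

def heat_required(mass_g, s_j_per_g_c, d_temp_c):
    """Heat (J) needed to change mass_g grams by d_temp_c °C."""
    return mass_g * s_j_per_g_c * d_temp_c

def temp_change(heat_j, mass_g, s_j_per_g_c):
    """Temperature change (°C) produced by heat_j joules."""
    return heat_j / (mass_g * s_j_per_g_c)

# Example 1: 50 g of copper (S = 0.385 J/g °C) raised by 10 °C.
print(heat_required(50, 0.385, 10))            # 192.5 J

# Example 2: 30 J added to 10 g of aluminum (S = 0.902 J/g °C).
dt = temp_change(30, 10, 0.902)
print(round(dt, 1), round(20 + dt, 1))         # 3.3 °C rise, final 23.3 °C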

Converting between Common Units

  • 1 Btu/(lbm °F) = 4186.8 J/(kg K) = 1 kcal/(kg °C)
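This factor can be checked from the underlying unit definitions. Here is a minimal Python verification, assuming the standard conversion values 1055.06 J per Btu and 0.453592 kg per lbm (these constants are not from this article):

# Verify 1 Btu/(lbm °F) = 4186.8 J/(kg K) from basic unit conversions.

BTU_TO_J = 1055.06     # joules per Btu (International Table Btu)
LBM_TO_KG = 0.453592   # kilograms per pound-mass
F_TO_K = 5.0 / 9.0     # size of one Fahrenheit degree in kelvin

factor = BTU_TO_J / (LBM_TO_KG * F_TO_K)
print(f"1 Btu/(lbm °F) = {factor:.1f} J/(kg K)")   # ~4186.8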

Example - Heating Aluminum

2 kg of aluminum is heated from 20 °C to 100 °C. The specific heat of aluminum is 0.91 kJ/(kg °C), and the heat required can be calculated as

dQ = (2 kg) (0.91 kJ/kg °C) ((100 °C) - (20 °C))
   = 145.6 kJ

Example - Heating Water

One litre of water is heated from 0 °C to boiling at 100 °C. The specific heat of water is 4.19 kJ/(kg °C), and the heat required can be calculated as

dQ = (1 litre) (1 kg/litre) (4.19 kJ/kg °C) ((100 °C) - (0 °C))
   = 419 kJ

Specific Heats of Gases

There are two definitions of Specific Heat for vapors and gases:
cp = (∂h / ∂T)p - specific heat at constant pressure (kJ/kg °C), where h is the specific enthalpy
cv = (∂u / ∂T)v - specific heat at constant volume (kJ/kg °C), where u is the specific internal energy

Gas Constant

The gas constant can be expressed as

R = cp - cv         (2)

where cp and cv are the specific heats defined above and R is the individual gas constant (kJ/kg K).

Ratio of Specific Heat

The ratio of specific heats is expressed as
k = cp / cv         (3)
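As a rough illustration of relations (2) and (3), here is a minimal Python sketch using commonly quoted values for dry air (cp ≈ 1.005 kJ/kg K, cv ≈ 0.718 kJ/kg K); these inputs are illustrative assumptions, not figures from this article:

# Gas constant R = cp - cv (2) and ratio of specific heats k = cp / cv (3),
# evaluated for dry air.

cp = 1.005   # kJ/(kg K), specific heat at constant pressure (approximate)
cv = 0.718   # kJ/(kg K), specific heat at constant volume (approximate)

R = cp - cv   # individual gas constant
k = cp / cv   # ratio of specific heats

print(f"R = {R:.3f} kJ/(kg K)")   # ~0.287
print(f"k = {k:.2f}")             # ~1.40

The computed R of about 0.287 kJ/(kg K) and k of about 1.4 match the values commonly quoted for air.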
Shailesh Kr Shukla
directoratace@gmail.com