.......o0O.......Thomas......O0o......
Tuesday, March 8, 2011
Collaborative Statistics
Summary:
Collaborative Statistics was written by Barbara Illowsky and Susan Dean, faculty members at De Anza College in Cupertino, California. The textbook was developed over several years and has been used in regular and honors-level classroom settings and in distance learning classes. This textbook is intended for introductory statistics courses being taken by students at two- and four-year colleges who are majoring in fields other than math or engineering. Intermediate algebra is the only prerequisite. The book focuses on applications of statistical knowledge rather than the theory behind it.
This collection contains modules by: Barbara Illowsky, Ph.D., Susan Dean.
Link Download: Collaborative Statistics
Password: tranhoangvulk@gmail.com
Monday, February 7, 2011
Engineers grow nanolasers on silicon, pave way for on-chip photonics
Engineers at the University of California, Berkeley, have found a way to grow nanolasers directly onto a silicon surface, an achievement that could lead to a new class of faster, more efficient microprocessors, as well as to powerful biochemical sensors that use optoelectronic chips.
They describe their work in a paper to be published Feb. 6 in an advanced online issue of the journal Nature Photonics.
The unique structure of the nanopillars grown by UC Berkeley researchers strongly confines light in a tiny volume to enable subwavelength nanolasers. Images on the left and top right show simulated electric field intensities that describe how light circulates helically inside the nanopillars. On the bottom right is an experimental camera image of laser light from a single nanolaser. (Credit: Connie Chang-Hasnain Group)
"Our results impact a broad spectrum of scientific fields, including materials science, transistor technology, laser science, optoelectronics and optical physics," said the study's principal investigator, Connie Chang-Hasnain, UC Berkeley professor of electrical engineering and computer sciences.
The increasing performance demands of electronics have sent researchers in search of better ways to harness the inherent ability of light particles to carry far more data than electrical signals can. Optical interconnects are seen as a solution to overcoming the communications bottleneck within and between computer chips.
Because silicon, the material that forms the foundation of modern electronics, is extremely deficient at generating light, engineers have turned to another class of materials known as III-V (pronounced "three-five") semiconductors to create light-based components such as light-emitting diodes (LEDs) and lasers.
But the researchers pointed out that marrying III-V with silicon to create a single optoelectronic chip has been problematic. For one, the atomic structures of the two materials are mismatched.
"Growing III-V semiconductor films on silicon is like forcing two incongruent puzzle pieces together," said study lead author Roger Chen, a UC Berkeley graduate student in electrical engineering and computer sciences. "It can be done, but the material gets damaged in the process."
Moreover, the manufacturing industry is set up for the production of silicon-based materials, so for practical reasons, the goal has been to integrate the fabrication of III-V devices into the existing infrastructure, the researchers said.
"Today's massive silicon electronics infrastructure is extremely difficult to change for both economic and technological reasons, so compatibility with silicon fabrication is critical," said Chang-Hasnain. "One problem is that growth of III-V semiconductors has traditionally involved high temperatures -- 700 degrees Celsius or more -- that would destroy the electronics. Meanwhile, other integration approaches have not been scalable."
The UC Berkeley researchers overcame this limitation by finding a way to grow nanopillars made of indium gallium arsenide, a III-V material, onto a silicon surface at the relatively cool temperature of 400 degrees Celsius.
"Working at nanoscale levels has enabled us to grow high quality III-V materials at low temperatures such that silicon electronics can retain their functionality," said Chen.
The researchers used metal-organic chemical vapor deposition to grow the nanopillars on the silicon. "This technique is potentially mass manufacturable, since such a system is already used commercially to make thin film solar cells and light emitting diodes," said Chang-Hasnain.
Once the nanopillar was made, the researchers showed that it could generate near infrared laser light -- a wavelength of about 950 nanometers -- at room temperature. The hexagonal geometry dictated by the crystal structure of the nanopillars creates a new, efficient, light-trapping optical cavity. Light circulates up and down the structure in a helical fashion and amplifies via this optical feedback mechanism.
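As a back-of-the-envelope check on the figure above, the reported ~950 nm near-infrared emission corresponds to a photon energy of roughly 1.3 eV; a minimal Python sketch using E = hc/λ:

```python
# Photon energy of ~950 nm near-infrared laser light, via E = h*c / wavelength.
h = 6.62607015e-34    # Planck constant, J*s
c = 2.99792458e8      # speed of light, m/s
wavelength = 950e-9   # metres

energy_joules = h * c / wavelength
energy_ev = energy_joules / 1.602176634e-19  # convert J to eV

print(round(energy_ev, 2))  # ~1.31 eV
```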
The unique approach of growing nanolasers directly onto silicon could lead to highly efficient silicon photonics, the researchers said. They noted that the minuscule dimensions of the nanopillars -- smaller than one wavelength on each side, in some cases -- make it possible to pack them into small spaces with the added benefit of consuming very little energy.
"Ultimately, this technique may provide a powerful and new avenue for engineering on-chip nanophotonic devices such as lasers, photodetectors, modulators and solar cells," said Chen.
"This is the first bottom-up integration of III-V nanolasers onto silicon chips using a growth process compatible with the CMOS (complementary metal oxide semiconductor) technology now used to make integrated circuits," said Chang-Hasnain. "This research has the potential to catalyze an optoelectronics revolution in computing, communications, displays and optical signal processing. In the future, we expect to improve the characteristics of these lasers and ultimately control them electronically for a powerful marriage between photonic and electronic devices."
The Defense Advanced Research Projects Agency and a Department of Defense National Security Science and Engineering Faculty Fellowship helped support this research.
Source: ScienceDaily
Tuesday, January 25, 2011
The MIT - AAAS Forum on Convergence
January 4, 2011 at the American Association for the Advancement of Science (AAAS) Auditorium
Moderated by Dr. Alan Leshner, AAAS Chief Executive Officer and Executive Publisher, Science.
Panel I - The Promise of Convergence.
Panelists: Dr. Phillip Sharp, Dr. Tyler Jacks, Dr. Paula Hammond and Dr. Robert Langer.
Panel II - The Future of Biomedical Research and Medicine in the Age of Convergence
Dr. Margaret Hamburg, Dr. Alan Guttmacher, Mr. Tom Kalil, Dr. Keith Yamamoto
Monday, January 24, 2011
Algebraic Topology
Author: Allen Hatcher
This is the first in a planned series of three textbooks in algebraic topology having the goal of covering all the basics while remaining readable by newcomers seeing the subject for the first time. The first book contains the basic core material along with a number of optional topics of a relatively elementary nature.
Link Download: Algebraic Topology
Password: tranhoangvulk@gmail.com
Royal Society meets to weigh up the shrinking kilogram
Scientists look at alternatives to the mass of platinum used as international standard measure, which has lost 50 micrograms
For more than a century, all measurements of weight have been defined in relation to a lump of metal sitting in Paris. The "international prototype" kilogram has been at the heart of trade and scientific experiment since 1889, but now experts want to get rid of it.
Today, scientists will meet at the Royal Society in London to discuss how to bring the kilogram into the 21st century, by defining this basic unit of measurement in terms of the fundamental constants of nature, rather than a physical object.
"The kilogram is still defined as the mass of a piece of platinum which, when I was director of the International Bureau of Weights and Measures, I had in a safe in my lab," said Terry Quinn, an organiser of today's meeting. "It's a cylinder of platinum-iridium about 39mm high, 39mm in diameter, cast by Johnson Matthey in Hatton Garden in 1879, delivered to the International Committee on Weights and Measures in Sèvres shortly afterwards, polished and adjusted to be made equal in mass to the mass of the old French kilogram of the archives which dates from the time of the French Revolution. Then, in 1889, it was adopted by the first general conference for weights and measures as the international prototype of the kilogram."
Many of the other units of scientific measurement rely on the standard definition of the kilogram. A newton of force, for example, is the amount required to accelerate one kilogram at one metre per second squared. The unit of pressure, the pascal, is defined as one newton per square metre.
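The way derived units cascade from the kilogram can be made concrete with a trivial sketch (the numbers are chosen purely for illustration):

```python
# Derived SI units built on the kilogram:
#   1 newton = force to accelerate 1 kg at 1 m/s^2
#   1 pascal = 1 newton spread over 1 square metre
mass_kg = 2.0        # illustrative mass
acceleration = 3.0   # m/s^2

force_newtons = mass_kg * acceleration        # F = m * a
area_m2 = 0.5
pressure_pascals = force_newtons / area_m2    # P = F / A

print(force_newtons, pressure_pascals)  # 6.0 12.0
```

A shift in the definition of the kilogram therefore propagates through every unit defined in terms of it.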
One problem with using a lump of metal to define such a basic quantity as the kilogram is that it is liable to change over time. Measurements over the past century have shown that the international prototype has lost around 50 micrograms, roughly the weight of a grain of sand.
"Why should it [the current standard] be stable? It's a piece of platinum cast in London 130 years ago, full of holes, full of hydrogen," said Quinn. "What's on the surface, it's impossible to know. There are all sorts of surface layers of hydrocarbons."
Instead, experts want to link the kilogram to a fundamental unit of measurement in quantum physics, the Planck constant. Using a device called a watt balance, scientists can relate the mass of an object to the electrical energy needed to move it, using the Planck constant.
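The watt-balance (now usually called Kibble balance) principle described above equates electrical and mechanical power: the voltage U induced while moving a coil and the current I needed to balance the weight give m = U·I/(g·v). This is only a sketch of the governing relation, not the metrology-grade procedure, and all numerical values below are hypothetical; in a real instrument, U and I are measured against Josephson and quantum Hall standards, which is what ties the result to the Planck constant.

```python
# Kibble (watt) balance relation, illustrative only:
# electrical power U*I balances mechanical power m*g*v  =>  m = U*I / (g*v)
U = 1.0       # induced voltage during the moving phase, volts (assumed)
I = 0.01      # balancing current during the weighing phase, amperes (assumed)
g = 9.80665   # local gravitational acceleration, m/s^2
v = 0.002     # coil velocity during the moving phase, m/s (assumed)

m = (U * I) / (g * v)   # inferred mass in kilograms
print(round(m, 5))      # ~0.50986 kg for these assumed values
```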
This redefinition would bring the kilogram into line with the six other base units that make up the International System of Units (SI) – the metre, the second, the ampere, the kelvin, the mole and the candela. None of these are now based on a physical reference object – the metre is defined in terms of the speed of light, for example, while the second is based on atomic clocks.
Any proposals to change the definition of the kilogram would have to be agreed at the General Conference on Weights and Measures, due to meet in Paris later this year.
Source: http://www.guardian.co.uk/
Sunday, January 23, 2011
Protein Detection Using Arrayed Microsensor Chips: Tuning Sensor Footprint to Achieve Ultrasensitive Readout of CA-125 in Serum and Whole Blood
Jagotamoy Das and Shana O. Kelley
Department of Pharmaceutical Sciences, Leslie Dan Faculty of Pharmacy, and Department of Biochemistry, Faculty of Medicine, University of Toronto, Toronto, Canada
Anal. Chem., Article ASAP
DOI: 10.1021/ac102917f
Publication Date (Web): January 18, 2011
Copyright © 2011 American Chemical Society
Abstract
Multiplexed assays that can measure protein biomarkers and internal standards are highly desirable given the potential to reduce false positives and negatives. We report here the use of a chip-based platform that achieves multiplexed immunosensing of the ovarian cancer biomarker CA-125 without the need for covalent labeling or sandwich complexes. The sensor chips allow straightforward comparison of detectors of different sizes, and we used this feature to scan the microscale size regime for the optimal sensor size and to optimize the limit of detection, which extends down to 0.1 U/mL. The assay has a straightforward design, with readout performed in a single step involving the introduction of a noncovalently attached redox reporter group. The detection system exhibits excellent specificity, with analysis of a specific cancer biomarker, CA-125, performed in human serum and whole blood. The multiplexing of the system allows the analysis of the biomarker to be performed in parallel with an abundant serum protein for internal calibration.
Link Download: Protein Detection Using Arrayed Microsensor Chips
Password: tranhoangvulk@gmail.com
Virtual screening, identification and experimental testing of novel inhibitors of PBEF1/Visfatin/NMPRTase for glioma therapy
Authors: Nagasuma Chandra, Raghu Bhagavat, Eshita Sharma, P Sreekanthreddy and Kumaravel Somasundaram
Abstract (provisional)
Background
Pre-B-cell colony enhancing factor 1 gene (PBEF1) encodes nicotinamide phosphoribosyltransferase (NMPRTase), which catalyses the rate limiting step in the salvage pathway of NAD+ metabolism in mammalian cells. PBEF1 transcript and protein levels have been shown to be elevated in glioblastoma and a chemical inhibitor of NMPRTase has been shown to specifically inhibit cancer cells.
Methods
Virtual screening using docking was used to screen a library of more than 13,000 chemical compounds. A shortlisted set of compounds was tested for inhibition activity in vitro by an NMPRTase enzyme assay. The compounds were then tested for their ability to inhibit glioma cell proliferation.
Results
Virtual screening resulted in the shortlisting of 34 possible ligands, of which six were tested experimentally, using the NMPRTase enzyme inhibition assay and, further, the glioma cell viability assays. Of these, two compounds were found to be significantly efficacious in inhibiting the conversion of nicotinamide to NAD+, and one of them, 3-amino-2-benzyl-7-nitro-4-(2-quinolyl-)-1,2-dihydroisoquinolin-1-one, was also found to inhibit the growth of U87, a PBEF1-overexpressing glioma-derived cell line.
Conclusions
Thus, a novel inhibitor has been identified through a structure-based drug discovery approach, and its activity is further supported by experimental evidence.
Link Download: PBEF1-Visfatin-NMPRTase
Password: tranhoangvulk@gmail.com