Ragheb El-Sergany

Wednesday, June 30, 2010

Lightning

Lightning is an atmospheric discharge of electricity accompanied by thunder, which typically occurs during thunderstorms, and sometimes during volcanic eruptions or dust storms.[1] In the atmospheric electrical discharge, a leader of a bolt of lightning can travel at speeds of 60,000 m/s (130,000 mph), and can reach temperatures approaching 30,000 °C (54,000 °F), hot enough to fuse silica sand into glass channels known as fulgurites which are normally hollow and can extend some distance into the ground.[2][3] There are some 16 million lightning storms in the world every year.[4]
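As a quick sanity check on the figures quoted above, the following short Python sketch (illustrative only; the input constants are the ones stated in the text) converts the leader speed and channel temperature between unit systems:

```python
# Convert the quoted lightning figures between unit systems.

def ms_to_mph(v_ms: float) -> float:
    """Convert metres per second to miles per hour (1 mile = 1609.344 m)."""
    return v_ms * 3600.0 / 1609.344

def celsius_to_fahrenheit(t_c: float) -> float:
    """Convert degrees Celsius to degrees Fahrenheit."""
    return t_c * 9.0 / 5.0 + 32.0

leader_speed_mph = ms_to_mph(60_000)            # quoted as ~130,000 mph
channel_temp_f = celsius_to_fahrenheit(30_000)  # quoted as ~54,000 °F

print(f"{leader_speed_mph:,.0f} mph")  # ~134,216 mph
print(f"{channel_temp_f:,.0f} °F")     # 54,032 °F
```

Both results agree with the rounded values in the text.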

Lightning can also occur within the ash clouds from volcanic eruptions, or can be caused by violent forest fires which generate sufficient dust to create a static charge.[1][5]

How lightning initially forms is still a matter of debate:[6] Scientists have studied root causes ranging from atmospheric perturbations (wind, humidity, friction, and atmospheric pressure) to the impact of solar wind and accumulation of charged solar particles.[4] Ice inside a cloud is thought to be a key element in lightning development, and may cause a forcible separation of positive and negative charges within the cloud, thus assisting in the formation of lightning.[4]

The irrational fear of lightning (and thunder) is astraphobia. The study or science of lightning is called fulminology, and someone who studies lightning is referred to as a fulminologist.

Lightning is a bright light that appears suddenly in the heart of the sky on days of the worst weather. It arises from a collision between two charged masses, one carrying a negative electric charge and the other a positive charge; the collision produces a powerful spark, released as the flash of light that we see appear suddenly and then vanish.
Let us watch this in the following film.


Floods

When water bares its teeth, it becomes the deadliest natural phenomenon on Earth: the flood. Floods move very quickly and can destroy everything that stands in their path, swallowing buildings and land and dragging houses away in mud and debris. Floods also destroy cars, kill people, wipe out crops, and spread chaos across whole regions. Join us on a scientific expedition to discover how floods occur and what we can do to survive them.


Geography

Geography (from Greek γεωγραφία - geographia, lit. "earth describe-write") is the study of the Earth and its lands, features, inhabitants, and phenomena. A literal translation would be "to describe or write about the Earth". The first person to use the word "geography" was Eratosthenes (276-194 B.C.). Four historical traditions in geographical research are the spatial analysis of natural and human phenomena (geography as a study of distribution), area studies (places and regions), the study of the man-land relationship, and research in the earth sciences. Nonetheless, modern geography is an all-encompassing discipline that foremost seeks to understand the Earth and all of its human and natural complexities: not merely where objects are, but how they have changed and come to be. As "the bridge between the human and physical sciences," geography is divided into two main branches: human geography and physical geography.
Introduction

Traditionally, geographers have been viewed the same way as cartographers and people who study place names and numbers. Although many geographers are trained in toponymy and cartography, this is not their main preoccupation. Geographers study the spatial and temporal distribution of phenomena, processes and features as well as the interaction of humans and their environment.[6] As space and place affect a variety of topics such as economics, health, climate, plants and animals, geography is highly interdisciplinary.
“ ...mere names of places...are not geography...know by heart a whole gazetteer full of them would not, in itself, constitute anyone a geographer. Geography has higher aims than this: it seeks to classify phenomena (alike of the natural and of the political world, in so far as it treats of the latter), to compare, to generalize, to ascend from effects to causes, and, in doing so, to trace out the great laws of nature and to mark their influences upon man. This is 'a description of the world'—that is Geography. In a word Geography is a Science—a thing not of mere names but of argument and reason, of cause and effect.[7] ”

— William Hughes, 1863

Geography as a discipline can be split broadly into two main subsidiary fields: human geography and physical geography. The former focuses largely on the built environment and how space is created, viewed and managed by humans, as well as the influence humans have on the space they occupy. The latter examines the natural environment and how the climate, vegetation and life, soil, water, and landforms are produced and interact.[8] As a result of the two subfields using different approaches, a third field has emerged: environmental geography. Environmental geography combines physical and human geography and looks at the interactions between the environment and humans.
Branches of geography
Physical geography (or physiography) focuses on geography as an Earth science. It aims to understand the physical lithosphere, hydrosphere, atmosphere, pedosphere, and global flora and fauna patterns (biosphere). Physical geography can be divided into several broad categories.
Human geography
Human geography is a branch of geography that focuses on the study of patterns and processes that shape human interaction with various environments. It encompasses human, political, cultural, social, and economic aspects. While the major focus of human geography is not the physical landscape of the Earth (see physical geography), it is hardly possible to discuss human geography without referring to the physical landscape on which human activities are played out, and environmental geography is emerging as a link between the two. Human geography can be divided into many broad categories.
Environmental geography
Environmental geography is the branch of geography that describes the spatial aspects of interactions between humans and the natural world. It requires an understanding of the traditional aspects of physical and human geography, as well as the ways in which human societies conceptualize the environment.

Environmental geography has emerged as a bridge between human and physical geography as a result of the increasing specialisation of the two sub-fields. Furthermore, as the human relationship with the environment has changed as a result of globalization and technological change, a new approach was needed to understand this changing and dynamic relationship. Examples of areas of research in environmental geography include emergency management, environmental management, sustainability, and political ecology.
Geomatics

Geomatics is a branch of geography that has emerged since the quantitative revolution in geography in the mid-1950s. Geomatics involves taking the traditional spatial techniques used in cartography and topography and applying them to computers. It has become a widespread field, with many other disciplines using its techniques, such as GIS and remote sensing. Geomatics has also led to a revitalization of some geography departments, especially in North America, where the subject's status declined during the 1950s.

Geomatics encompasses a large area of fields involved with spatial analysis, such as Cartography, Geographic information systems (GIS), Remote sensing, and Global positioning systems (GPS).
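As a small illustration of the kind of spatial computation these fields rely on (an example of my own, not from the text; the coordinates below are arbitrary sample points near Cairo and Alexandria), the haversine formula gives the great-circle distance between two points on a spherical Earth:

```python
import math

def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in km between two (lat, lon) points in degrees,
    assuming a spherical Earth of mean radius 6371 km."""
    r = 6371.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Example: approximate coordinates for Cairo and Alexandria.
print(f"{haversine_km(30.04, 31.24, 31.20, 29.92):.0f} km")
```

Real GIS libraries use an ellipsoidal Earth model for higher accuracy; the spherical formula above is the textbook starting point.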
Regional geography
Regional geography is a branch of geography that studies regions of all sizes across the Earth. It has a prevailing descriptive character. The main aim is to understand or define the uniqueness or character of a particular region, which consists of natural as well as human elements. Attention is also paid to regionalization, which covers the proper techniques of delimiting space into regions.

Regional geography is also considered as a certain approach to study in geographical sciences (similar to quantitative or critical geographies, for more information see History of geography).
Related fields

* Urban planning, regional planning and spatial planning: use the science of geography to assist in determining how to develop (or not develop) the land to meet particular criteria, such as safety, beauty, economic opportunities, the preservation of the built or natural heritage, and so on. The planning of towns, cities, and rural areas may be seen as applied geography.
* Regional science: In the 1950s the regional science movement led by Walter Isard arose, to provide a more quantitative and analytical base to geographical questions, in contrast to the descriptive tendencies of traditional geography programs. Regional science comprises the body of knowledge in which the spatial dimension plays a fundamental role, such as regional economics, resource management, location theory, urban and regional planning, transport and communication, human geography, population distribution, landscape ecology, and environmental quality.
* Interplanetary Sciences: While the discipline of geography is normally concerned with the Earth, the term can also be informally used to describe the study of other worlds, such as the planets of the Solar System and even beyond. The study of systems larger than the earth itself usually forms part of Astronomy or Cosmology. The study of other planets is usually called planetary science. Alternative terms such as Areology (the study of Mars) have been proposed, but are not widely used.



The Quantum Internet

The world will witness a technological revolution called the "quantum Internet", which will send the current Internet into retirement.

The Egyptian scientist Ahmed Zewail has said that a new revolution is coming online: the old, classical method will be replaced by what he calls the "quantum Internet".

It will make it possible to observe the unseen world and to search the Internet at enormous speeds, without wires, cables, or communication lines. It will work by converting atoms and vibrations into light that travels through the air and is turned back into atoms and vibrations in another country, directly and without any wires. This will give everyone in the world tremendous search speed together with privacy, keeping secrets and information safe.

He added that this modern "quantum" method of observing the invisible world is used by developed countries in the economy and even in war; its methods go beyond Einstein's classical theories of motion, which describe only the movement visible to the ordinary person.

He gave an example of what the world saw during the war in Gaza, when Israeli technology used white phosphorus: a substance like a million sharp, severe microscopic daggers, capable of tearing human flesh apart.

1001 Inventions (with translation)

What I bring you today is a gift to the whole world: the film "1001 Inventions". I want you to look at this production with perspective and a clear vision; it is a tribute to the scientists and inventions that led to our modern life, and I hope you enjoy watching it. Your friend, Mohamed Ahmed.

NASA denies rumors claiming that life will end in 2012



The U.S. space agency NASA has denied rumors, spread by people and websites, that life on Earth will end in December 2012. The space-science specialist David Morrison stated that NASA has no information about any huge planet about to crash into the Earth on that date.



Morrison noted that such stories are circulated by people again and again and are not based on scientific fact; they are in reality a great hoax that began with the legends of the ancient Mayan civilization in Central America.



It is worth mentioning that the rumors currently circulating claim that an unknown planet called Nibiru will collide with the globe in 2012, that the Earth will begin spinning in the opposite direction, and that solar storms, erupting volcanoes, the flooding of the land, melting ice, and other climatic extremes will follow.



In response, scientists say that no planet named Nibiru exists, that astronomical studies have revealed no objects or celestial bodies whose orbits will intersect the Earth's orbit in the near future, and that the occurrence of such severe climatic changes all at once is not possible.

The Moon Titan



Saturn's moon Titan resembles Earth to a large extent.
The latest scientific research from the U.S. space agency (NASA) indicates that one of Saturn's moons, Titan, is more similar to Earth than any other planet or celestial body in the solar system.



The scientists based their conclusions on an analysis of the data and information sent back by the Cassini space probe.



These data showed that Titan's terrain includes mountains, valleys of sand, many lakes, and perhaps even volcanoes, which altogether give it a very close resemblance to our planet.



Titan is about ten times farther from the Sun than the Earth is, and it is much colder, with temperatures of about minus 180 degrees Celsius, so there is no liquid water on it.
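The link between Titan's distance and its cold can be sketched with the inverse-square law: sunlight per unit area falls off as the square of the distance from the Sun, so at roughly ten times the Earth-Sun distance, Titan receives only about one hundredth of the sunlight that Earth does (a back-of-the-envelope estimate of my own, not a figure from the article):

```python
# Inverse-square law: sunlight per unit area falls as 1 / distance^2.

def relative_solar_flux(distance_au: float) -> float:
    """Solar flux relative to Earth's, for a Sun distance given in
    astronomical units (Earth = 1 AU)."""
    return 1.0 / distance_au ** 2

# Saturn (and Titan) orbit at roughly 9.5 AU; the text rounds this to 10.
print(f"{relative_solar_flux(10.0):.2%} of Earth's sunlight")  # 1.00% of Earth's sunlight
```

This hundredfold drop in solar heating is consistent with the very low surface temperature quoted above.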



The information also indicated that chemicals such as methane and ethane have taken the place of water on this moon.



These materials also fall as rain or snow there, forming the lakes, which makes its weather closely resemble the weather on Earth.




The video section below shows the moon Titan and an exploratory trip to it.

Tuesday, June 29, 2010

Electricity

Edison, father of the incandescent lamp: a bright miracle

When the genius Thomas Edison was still a boy, one dark night his mother's illness worsened. The doctor decided she needed surgery but would have to wait until morning, since he needed light to operate. This was a moment of truth in the life of Thomas Edison, who realized that the lives of the people dearest to him could hang on the flame of a light. Edison wrote in his diary that he must find a way to obtain light at night stronger than candlelight. These words sat in his diary for years, long before he began the research that would produce his stroke of genius, which marked a turning point in the history of modern civilization.

The beginning of the dream

Light bulbs had attracted commercial interest for a long time. In 1810 the physicist Humphry Davy demonstrated the electric arc lamp, which consists of two carbon rods across which a high voltage is applied; the electric arc ignites strongly as the rods approach each other. The invention had no commercial value, however, because strong sources of electricity were lacking. In 1876 the American Charles Brush invented arc lights that burned intensely, but they were used only to light the main streets, because they buzzed loudly, their light was harsh, and they lasted only a few hours before burning out. In 1877 Edison visited his first dynamo laboratory, which gave him the new idea of producing a glowing lamp, and he declared to the press, looking to the future: "The day will come when we can light all the homes and run all the plants by electrical appliances, and with just the touch of a button get light in the darkness." Following this statement Edison set to work with 50 men, day and night, in Menlo Park to achieve the objective that science believed impossible to reach. The search focused on a filament that could stay hot for a long time. Edison tried materials one after the other; he started with a thread of carborundum that gave white light for ten minutes before burning out, the first incandescent lamp, and went on to experiment with metals such as chromium and platinum.

Thomas Edison's first light bulb

On October 21, 1879, Edison breathed a sigh of relief. He had placed a piece of carbonized cotton thread inside a glass bulb and, drawing on his past experience, evacuated the air from the bulb, then ran a current through it and the filament glowed. He kept watch over the lamp hour after hour, with everyone counting the minutes until it burned out; this time it lasted 45 hours before burning. Everyone cheered, but Edison told his aides that if the lamp could work that long, it could be made to shine for a hundred hours, and he went looking for better materials for the filament. He discovered that bamboo was the best material for the purpose, and years later a filament was manufactured from a blend of fibers extracted from plants, a mix that remained in use until it was replaced by tungsten, a heat-resistant metal. Edison hung lamps around his laboratory for experimental purposes, and the news spread that the magician had accomplished a miracle; on New Year's Eve three thousand visitors came to see the lamps, which burned until dawn on the first day of 1880, and Edison became known as the father of the incandescent lamp. His ambition in the next phase was to make electric light the right of everyone. Work continued on a central electricity station until September 1882, when the main switch was turned and 400 lamps lit at the same time, declaring the end of the era of the oil lamp, the wick, and the darkness, and the beginning of the era of electricity and light. The physicist Larsen said of that stage: "Edison established the scientific method of supplying homes with electricity and light. After the United States and Europe, people everywhere heard of this great event and wrote to him, saying: Come and light our cities."
A war of priority broke out between Edison and the Briton Joseph Wilson Swan, who had made several unsuccessful attempts at incandescent lamps, the first on February 3, 1879. The feelings of rivalry soon subsided, and Swan joined Edison's company for the manufacture of lamps (Edison electric lighting and heating). Edison's aim at this stage was to produce a million electric lights, and the company became a great magnet for talent and a school for skills; many innovators came out trained at his hands, among them the Serbian inventor Nikola Tesla.

Thomas Edison (1847-1931) was an American inventor who created the light bulb and about 1,093 other inventions.

Among his sayings:

There is always a better way.

There is no substitute for hard work.

Be courageous! Have faith! Go forward.

We do not know one-millionth of anything.

Great hopes make great people.

To invent you need a good imagination and a pile of junk.

I'm proud of the fact that I have not invented a weapon kills.

I did not fail; I found ten thousand methods that do not work.

Genius is one percent inspiration and 99 percent effort.

He discovered a hundred ways that did not lead to the invention of the battery, and tried 9,999 times to make the light bulb.


Link: human innovations, the light bulb

Egypt's Atomic Scientists

Samir Naguib


Samir Naguib was an Egyptian nuclear scientist at the forefront of the younger generation of Arab nuclear scientists. He graduated from the Faculty of Science, Cairo University, at an early age and continued his scientific research on the atom. For his distinctive scientific competence he was nominated for a mission to the United States of America, working under the supervision of professors of nuclear physics while not yet thirty-three years old. He showed such gift, distinction, and genius during the examinations held in the mid-sixties, during his mission to America, that he completed the preparation of his thesis a whole year ahead of schedule.
Coincidentally, the University of Detroit in America announced a competition for the post of assistant professor in natural science. More than two hundred nuclear scientists of different nationalities entered the competition, and Dr. Samir Naguib won it, obtaining a post as assistant professor at the university. He began research studies that received the admiration of many Americans, and aroused the concern of the Zionists and pro-Zionist groups in America. As usual, offers came to the doctor with funding for the development of his research, but particularly after the war of June 1967 he felt that his country needed him. The scientist resolved to return to Egypt and booked a seat on a flight to Cairo on 13 August 1967.
Once Dr. Samir's intentions became known, many Americans asked him not to travel, and offered him multiple scientific and material temptations to stay in the United States. But Dr. Samir Naguib rejected all the temptations offered to him. On the night set for his return to Egypt, the forces hostile to Egypt and the Arab nation moved; these forces were resolved to destroy the structure of modern Arab science, whatever the motives and whatever the results. In the city of Detroit, Dr. Samir was driving his car with high hopes revolving in his mind, dreaming of returning to his homeland to present his efforts and research to its officials, and of seeing his family after a long absence.
On the road, Dr. Samir Naguib was surprised by a huge transport truck; at first he thought it was simply traveling the road like the other cars. To settle any doubt, he veered to the side of the road, but found the truck following him. At the tragic moment, the truck rushed forward, increased its speed, and struck the doctor, crushing his car; he died immediately. The transport vehicle and its driver disappeared, the incident was filed against an unknown perpetrator, and the Arab nation lost a great scholar who could have given his country and his nation much in the field of the atom.


He is one of the scientists whose deaths remain covered with question marks. He died on January 16, 1950, in a very primitive manner: poison.
Dr. Ali Mostafa Mosharafa was the first Egyptian to participate in space research; more importantly, he was one of the students of Albert Einstein and one of his key aides in the work that led to the theory of relativity, which earned him the title "the Arab Einstein". The circumstances of Dr. Mosharafa's sudden death were very vague, and everything surrounding it suggested that he was murdered, either by an agent of King Farouk or at the hands of international Zionism, each with its own reason.
The Egyptian monarchy of the time may have played a role in the killing, especially given that Dr. Mosharafa had set up a group called "Egypt's Youth", which included a large number of intellectuals, scientists, and students and which aimed to remove Farouk's monarchy and declare an independent Arab Republic of Egypt. News of this secret group reached the royal palace, giving the palace an excuse to get rid of Dr. Mostafa. As for world Zionism, it is enough to say that its view of his genius student, Dr. Samira Moussa, would differ little from its view of her most gifted teacher, Dr. Mostafa Mosharafa; Zionism played its dirty game of physical liquidation, regarding it as one means of getting rid of them and their like.
Mostafa Mosharafa was an Egyptian mathematician and physicist, born in 1898. In 1917 he was chosen for a scientific mission, traveling for the first time to England. He joined Nottingham College, then King's College in London, where he received a Bachelor of Science with honours in 1923, and he earned a Ph.D. (Doctor of Philosophy) from the University of London in the shortest time the university's regulations allowed.
He returned to Egypt by order of the ministry and was appointed a teacher at the Higher Teachers' College, then traveled back to England and obtained the degree of Doctor of Science (D.Sc.), the first Egyptian to receive it.
In 1925 he returned to Egypt and was appointed professor of Applied Mathematics at the Faculty of Science, Cairo University, and was awarded the title of "professor" in 1926 despite objections that the University Act barred granting the title to anyone under thirty.
Dr. Mosharafa was appointed Dean of the College in 1936, was elected to the deanship four consecutive times, and in December 1945 was elected vice-rector of the university.
Dr. Ali Mosharafa's research began to appear in scientific journals at an early age. At the Royal University's King's College London, he published his first five papers on quantum theory, which earned him his two degrees, the Ph.D. (Doctor of Philosophy) and the D.Sc. (Doctor of Science).
Dr. Mosharafa produced the first scientific research on finding a metric of the vacuum, in which the geometry of the vacuum was based on Einstein's theory and subject only to the motion of a particle moving in a gravitational field.
He added new theories on the interpretation of radiation from the sun, but Dr. Mosharafa's theory on radiation and velocity remained among his most important, the source of his fame and universality: he demonstrated that matter and radiation are in origin one thing, each of which can become the other. This theory helped pave the way for converting the atom into radiation.
Dr. Ali was one of the few who knew the secret of splitting the atom, and one of the scientists who fought against its use in war. He was also the first to put forward the new idea that a bomb could be made from hydrogen, but he had no wish to create the hydrogen bomb, which appeared several years after his death in the United States and Russia.
Dr. Ali Mosharafa's research, about fifteen papers, stands unique in the theories of quanta, the atom, radiation, mechanics, and dynamics. His drafts of scientific research had reached about two hundred before his death; perhaps the doctor intended to collect them in pursuit of the Nobel Prize in the mathematical sciences.
He was invited by the German-born scientist Albert Einstein to participate in research relating to splitting the atom in 1945, as a visiting professor for a year, but he declined, saying: "My generation needs me."
Dr. Mosharafa died on January 16, 1950, by poison. The circumstances of his sudden death were very vague, and everything surrounding it indicated that he was murdered, either by an agent of King Farouk or by the Israeli Mossad, each with its own reason.


Yahya Mashad was an Egyptian nuclear scientist and university professor. He taught in Iraq at the University of Technology, in the Department of Electrical Engineering, and his students and all who knew him attest to his morality, intelligence, and learning.
Yahya Mashad was born in Egypt in 1932. He graduated from the electricity department of Alexandria University in 1952 and in 1963 obtained a doctorate in nuclear reactor engineering from the Soviet Union, where he had gone on a scholarship in 1956.
On his return he joined Egypt's nuclear energy authority, where he carried out research. He moved to Norway between 1963 and 1964, then returned as an assistant professor at the Faculty of Engineering, Alexandria University, and was soon promoted to professor; he supervised many theses and published more than 50 papers.
After the June 1967 war the Egyptian nuclear program was frozen, which halted research in the field, and his situation became more difficult after the 1973 war, when Egypt diverted its energies in other directions.
When Saddam Hussein signed a nuclear cooperation agreement with France on November 18, 1975, it drew Arab scientists to Iraq, and Mashad went to work there. He refused certain shipments of uranium from France, which he judged to be in violation of the specifications, and France then insisted that he come in person to France to take delivery of the uranium.
He was assassinated on June 13, 1980, in room No. 941 of the Le Meridien Hotel in Paris; his skull was smashed, and the French authorities filed the case against an unknown perpetrator.
A claim was made that a prostitute had been with him and that his death was connected with her; however, Marie-Claude Magal, the prostitute known as "Marie Express", denied the official version and reported that he had refused even to talk with her. Her story was ignored, although she was the key and only witness in the case of his death, and she herself was assassinated a while later. Many of his colleagues confirm that the Mossad was behind the assassination.
In general, most of the official Arab media ignored the news of his death, and some passed over it quickly.
Politics and friendship:
Surprisingly, when the family brought Mashad's body back from Iraq and held his funeral, none of the officials or his colleagues from the Faculty of Engineering attended except a counted few, since Egyptian-Iraqi relations at the time were poor after the signing of the Camp David accords. His family, returning from Iraq, did not know what to do after Mashad's death, living on the pension that the Iraqi state paid them on the orders of Saddam Hussein throughout his life (although it stopped after the Gulf War), and on a small pension from Social Affairs that did not befit the status of the family of a world-class scientist.
The Egyptian media did not give the story of Mashad's assassination the attention its importance deserved; perhaps its timing, in the midst of political events, made it seem less important by comparison. The Mashad file remained closed, and the inquiries kept concluding that the perpetrator was unknown. Mashad is one of a series of distinguished Arab scholars who were liquidated by the Mossad.

Said El-Sayed Bedair

The scientist Said El-Sayed Bedair, son of the late broadcaster Sayed Bedair, was killed. He had graduated from the Military Technical College and served as an officer in the Egyptian armed forces until he reached the rank of colonel, retiring at his own request. He received a doctorate from England, then worked on satellite research at a West German university, which contracted him to conduct research for two years.
The young engineer reached surprising results, and his research was published around the world; in October 1988 he even agreed with American researchers to work with them after the end of his contract with the German university. The researchers of the German university resented this and began to harass him, trying to make him abandon the idea of an agreement with the Americans.
According to his wife, she, her husband, and their two sons discovered during their stay in Germany that the furniture of their home had been tampered with and her husband's books stolen. Worried, the family decided to return to Egypt; the husband went back to Germany to complete the term of his contract, then returned to Cairo on June 8, 1988, and decided to travel to one of his brothers in Alexandria to complete his research. There he was found dead.
His wife confirmed that an intelligence service was behind her husband's assassination, affirming that the scientist Said Bedair had reached advanced research results that placed him at the third level among only 13 scientists worldwide in a rare specialization: missile engineering technology.


Samira Moussa
Samira Moussa (March 3, 1917 - August 5, 1952) was born in the village of Sunbu al-Kubra, in the Zefta district of Gharbia Province. She was the first Egyptian and Arab woman atomic scientist, nicknamed "the Curie of the East", and the first woman lecturer in the Faculty of Science of Fouad I University, now Cairo University.
Early life
Samira learned to read and write from early childhood, memorized parts of the Quran, and loved reading newspapers; she had a strong memory that retained things as soon as she read them. Her father moved with her to Cairo so that she could study, and invested some of his money in buying a hotel in the El-Hussein district to support their life in Cairo. Samira attended the "Qasr El-Shawq" primary school and then the "Banat El-Ashraf" girls' school, founded and managed by the well-known political activist Nabawiyya Musa.
Academic excellence
Samira came first at every stage of her education, and in 1935 she ranked first in the Tawjihiyya (secondary school) certificate. This was no familiar achievement for girls at the time: girls had been permitted to sit the Tawjihiyya examination only from their homes until the rules changed in 1925 with the establishment of the Princess Faiza School, the first secondary school for girls in Egypt.
Her continued excellence had a significant impact on her school, since the government gave financial aid to the school that produced the top student. This pushed the headmistress, Nabawiyya Musa, to buy a special laboratory when she heard one day that Samira was planning to move to a state school that had one.
As a mark of her brilliance, in her first year of secondary school she redrafted the government algebra textbook, printed it at her father's expense, and distributed it free of charge to her classmates in 1933.
University studies
Samira Moussa chose the Faculty of Science even though her marks qualified her to enter the Faculty of Engineering, at a time when the height of a girl's ambition was to join the Faculty of Arts. There she drew the attention of her professor, Dr. Ali Mosharafa, the first Egyptian dean of the Faculty of Science, and she was deeply influenced by him, not only scientifically but also by the social sides of his personality.
Samira Moussa received her Bachelor of Science ranked first in her class and was appointed the first female lecturer in the Faculty of Science, thanks to the efforts of Dr. Mosharafa, who defended the appointment strongly and chose to ignore the protests of the foreign (English) teaching staff.
She obtained a master's degree on the thermal conductivity of gases, then traveled on a scholarship to Britain, where she studied nuclear radiation and earned a doctorate on X-rays and their effect on different materials. She completed the important findings of her thesis in two years and spent the third year on related research, from which she derived a significant equation (not accepted by the Western world at the time) for breaking up the atoms of cheap metals such as copper, which could have put the materials of the atomic bomb within everyone's reach. Dr. Samira Moussa's research findings, however, were never written up in Arabic scientific books.
Political interests
She hoped that Egypt and the Arab world would take their place in this major scientific breakthrough, believing that wider ownership of nuclear weapons would contribute to peace: any state that adopts the idea of peace must be able to speak from a position of strength. She had lived through the scourge of war and the experience of the atomic bombs that flattened Hiroshima and Nagasaki in 1945, and she drew attention early on to Israel's pursuit of weapons of mass destruction and its attempt to monopolize nuclear arms in the region.
She helped found the Atomic Energy Commission just three months after the declaration of the Israeli state in 1948 and was keen that missions be sent abroad to specialize in atomic science. She repeatedly stressed the importance of nuclear arms and of keeping up with the growing scientific tide, and she organized the Atoms for Peace conference, hosted by the Faculty of Science and attended by a large number of the world's scientists, whose research had reached an equation that was not accepted by the Western world at the time. She hoped, God rest her soul, to harness the atom for the good of humankind and to spread its use in medical treatment, saying: "My wish is that treating cancer with the atom become as available as aspirin." She was also a member of many specialized scientific committees, foremost among them the Committee on Atomic Energy and Protection from the Atomic Bomb, formed by the Egyptian Ministry of Health.
Her assassination
Dr. Moussa accepted an invitation to travel to America in 1951, where she had the opportunity to conduct research in the laboratories of St. Louis University, Missouri. She received offers to stay in America but refused them. A few days before her return she accepted an invitation to visit nuclear plants on the outskirts of California; on August 15, on the rugged road to California, a truck suddenly appeared and struck her car violently, hurling it into a deep ravine. The driver jumped out and disappeared forever.
The first doubts that it was murder
Investigations showed that the driver had been carrying a false name and that the plant's administration had not sent anyone to collect her. In her letters to her father she had said: "If Egypt had plants like the ones here, I could have done many things." Mohamed El Zayat, Egypt's cultural adviser in Washington at the time, commented that by "many things" she meant her ability to break up the atoms of cheap metals by her invention based on the thermal conductivity of gases, and thus to make an atomic bomb at low cost.
In her last letter she wrote: "I have been able to visit the nuclear plants in America, and when I return to Egypt I will render my country great services in this field and serve the cause of peace." She had intended to establish a special laboratory near the pyramids in Giza Governorate.
The newspapers later drew a curtain over her story, and her case was never closed, though the evidence suggests, according to observers, that the Mossad, the Israeli intelligence service, assassinated her as punishment for attempting to transfer nuclear science to Egypt and the Arab world at such an early date.

Nabil Al Kaleny

The story of this scientist is truly strange: he has been missing since 1975. He had been sent by the Faculty of Science at Cairo University to Czechoslovakia to pursue further research and studies in atomic science.
His scientific research revealed to the IAEA a great scientific genius, written about in all the Czech newspapers, and he went on to receive his doctorate in atomic science from the University of Prague. On the morning of Monday, January 27, 1975, the telephone rang in the apartment where Dr. Al Kaleny lived; after the call he went out, and he has not been seen since.
When Dr. Al Kaleny's communication with the Faculty of Science at Cairo University was cut off, the faculty wrote to the Czech university inquiring about the fate of Dr. Nabil, who was by then the talk of the Czech and international scientific communities. After several urgent messages from the Faculty of Science, the Czech authorities stated that Dr. Al Kaleny had left his home after a phone call and never returned.
The strange thing is how the university learned of the news and of the phone call: from where did it learn this? Did the Czech police inform the university administration, and if so, how did the police themselves know? Stranger still, the Egyptian authorities (in 1975) did not investigate the crime. Weighing the established facts of the disappearance, the most likely conclusion is that Dr. Al Kaleny was lured into an ambush by the Mossad and was then either killed, or subjected to so-called brainwashing to wipe the advanced scientific knowledge from his mind, or is being held in a Western or Israeli prison, or was exchanged for Israeli spies in Egypt after the signing of the Camp David treaty.

DNA

(DNA) is a nucleic acid that contains the genetic instructions used in the development and functioning of all known living organisms and some viruses. The main role of DNA molecules is the long-term storage of information. DNA is often compared to a set of blueprints, like a recipe or a code, since it contains the instructions needed to construct other components of cells, such as proteins and RNA molecules. The DNA segments that carry this genetic information are called genes, but other DNA sequences have structural purposes, or are involved in regulating the use of this genetic information.

Chemically, DNA consists of two long polymers of simple units called nucleotides, with backbones made of sugars and phosphate groups joined by ester bonds. These two strands run in opposite directions to each other and are therefore anti-parallel. Attached to each sugar is one of four types of molecules called bases. It is the sequence of these four bases along the backbone that encodes information. This information is read using the genetic code, which specifies the sequence of the amino acids within proteins. The code is read by copying stretches of DNA into the related nucleic acid RNA, in a process called transcription.
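The copying step described above can be sketched in a few lines (an illustrative toy, not a biochemical simulation; the base-pairing map is the standard A→U, T→A, G→C, C→G):

```python
# Toy transcription: RNA is copied from the template strand, so the
# transcript matches the coding strand with T replaced by U.
RNA_PAIR = {"A": "U", "T": "A", "G": "C", "C": "G"}

def transcribe_template(template: str) -> str:
    # RNA polymerase reads the template 3'->5' and builds RNA 5'->3';
    # for this sketch the template is assumed already written 3'->5'.
    return "".join(RNA_PAIR[b] for b in template)

print(transcribe_template("TACGGT"))  # -> AUGCCA
```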

Within cells, DNA is organized into long structures called chromosomes. These chromosomes are duplicated before cells divide, in a process called DNA replication. Eukaryotic organisms (animals, plants, fungi, and protists) store most of their DNA inside the cell nucleus and some of their DNA in organelles, such as mitochondria or chloroplasts.[1] In contrast, prokaryotes (bacteria and archaea) store their DNA only in the cytoplasm. Within the chromosomes, chromatin proteins such as histones compact and organize DNA. These compact structures guide the interactions between DNA and other proteins, helping control which parts of the DNA are transcribed.
Properties
DNA is a long polymer made from repeating units called nucleotides.[2][3][4] The DNA chain is 22 to 26 Ångströms wide (2.2 to 2.6 nanometres), and one nucleotide unit is 3.3 Å (0.33 nm) long.[5] Although each individual repeating unit is very small, DNA polymers can be very large molecules containing millions of nucleotides. For instance, the largest human chromosome, chromosome number 1, is approximately 220 million base pairs long.[6]
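As a rough sanity check on these sizes (an illustrative calculation, not from the source), the ~3.3 Å rise per nucleotide means chromosome 1 would be several centimetres long if stretched out:

```python
# Rough contour length of human chromosome 1 (B-DNA rise ~0.33 nm/bp).
base_pairs = 220_000_000          # approx. length of chromosome 1 in bp
rise_nm = 0.33                    # helical rise per base pair, nanometres

length_m = base_pairs * rise_nm * 1e-9   # convert nm -> m
print(f"{length_m * 100:.1f} cm")        # -> 7.3 cm
```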

In living organisms, DNA does not usually exist as a single molecule, but instead as a pair of molecules that are held tightly together.[7][8] These two long strands entwine like vines, in the shape of a double helix. The nucleotide repeats contain both the segment of the backbone of the molecule, which holds the chain together, and a base, which interacts with the other DNA strand in the helix. A base linked to a sugar is called a nucleoside and a base linked to a sugar and one or more phosphate groups is called a nucleotide. If multiple nucleotides are linked together, as in DNA, this polymer is called a polynucleotide.[9]

The backbone of the DNA strand is made from alternating phosphate and sugar residues.[10] The sugar in DNA is 2-deoxyribose, which is a pentose (five-carbon) sugar. The sugars are joined together by phosphate groups that form phosphodiester bonds between the third and fifth carbon atoms of adjacent sugar rings. These asymmetric bonds mean a strand of DNA has a direction. In a double helix the direction of the nucleotides in one strand is opposite to their direction in the other strand: the strands are antiparallel. The asymmetric ends of DNA strands are called the 5′ (five prime) and 3′ (three prime) ends, with the 5' end having a terminal phosphate group and the 3' end a terminal hydroxyl group. One major difference between DNA and RNA is the sugar, with the 2-deoxyribose in DNA being replaced by the alternative pentose sugar ribose in RNA.[8]
A section of DNA. The bases lie horizontally between the two spiraling strands.[11] Animated version at File:DNA orbit animated.gif.

The DNA double helix is stabilized by hydrogen bonds between the bases attached to the two strands. The four bases found in DNA are adenine (abbreviated A), cytosine (C), guanine (G) and thymine (T). These four bases are attached to the sugar/phosphate to form the complete nucleotide, as shown for adenosine monophosphate.

These bases are classified into two types: adenine and guanine are fused five- and six-membered heterocyclic compounds called purines, while cytosine and thymine are six-membered rings called pyrimidines.[8] A fifth pyrimidine base, called uracil (U), usually takes the place of thymine in RNA and differs from thymine by lacking a methyl group on its ring. Uracil is not usually found in DNA, occurring only as a breakdown product of cytosine. In addition to RNA and DNA, a large number of artificial nucleic acid analogues have also been created to study the properties of nucleic acids, or for use in biotechnology.
Grooves
Twin helical strands form the DNA backbone. Another double helix may be found by tracing the spaces, or grooves, between the strands. These voids are adjacent to the base pairs and may provide a binding site. As the strands are not directly opposite each other, the grooves are unequally sized. One groove, the major groove, is 22 Å wide and the other, the minor groove, is 12 Å wide.[13] The narrowness of the minor groove means that the edges of the bases are more accessible in the major groove. As a result, proteins like transcription factors that can bind to specific sequences in double-stranded DNA usually make contacts to the sides of the bases exposed in the major groove.[14] This situation varies in unusual conformations of DNA within the cell (see below), but the major and minor grooves are always named to reflect the differences in size that would be seen if the DNA is twisted back into the ordinary B form.
Base pairing
Each type of base on one strand forms a bond with just one type of base on the other strand. This is called complementary base pairing. Here, purines form hydrogen bonds to pyrimidines, with A bonding only to T, and C bonding only to G. This arrangement of two nucleotides binding together across the double helix is called a base pair. As hydrogen bonds are not covalent, they can be broken and rejoined relatively easily. The two strands of DNA in a double helix can therefore be pulled apart like a zipper, either by a mechanical force or high temperature.[15] As a result of this complementarity, all the information in the double-stranded sequence of a DNA helix is duplicated on each strand, which is vital in DNA replication. Indeed, this reversible and specific interaction between complementary base pairs is critical for all the functions of DNA in living organisms.
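Complementary pairing, together with the antiparallel arrangement of the strands, is why the partner of any sequence can be computed directly; a minimal sketch:

```python
# Because the two strands are antiparallel, the partner of a strand read
# 5'->3' is obtained by complementing each base and reversing the result.
COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def reverse_complement(seq: str) -> str:
    return "".join(COMPLEMENT[b] for b in reversed(seq))

print(reverse_complement("ATGC"))  # -> GCAT
```

Applying the function twice returns the original sequence, mirroring the fact that each strand fully determines the other.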
The two types of base pairs form different numbers of hydrogen bonds, AT forming two hydrogen bonds, and GC forming three hydrogen bonds (see figures, left). DNA with high GC-content is more stable than DNA with low GC-content, but contrary to popular belief, this is not due to the extra hydrogen bond of a GC base pair but rather the contribution of stacking interactions (hydrogen bonding merely provides specificity of the pairing, not stability).[16] As a result, it is both the percentage of GC base pairs and the overall length of a DNA double helix that determine the strength of the association between the two strands of DNA. Long DNA helices with a high GC content have stronger-interacting strands, while short helices with high AT content have weaker-interacting strands.[17] In biology, parts of the DNA double helix that need to separate easily, such as the TATAAT Pribnow box in some promoters, tend to have a high AT content, making the strands easier to pull apart.[18] In the laboratory, the strength of this interaction can be measured by finding the temperature required to break the hydrogen bonds, their melting temperature (also called Tm value). When all the base pairs in a DNA double helix melt, the strands separate and exist in solution as two entirely independent molecules. These single-stranded DNA molecules (ssDNA) have no single common shape, but some conformations are more stable than others.
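The relationship between base composition and melting temperature can be illustrated with the classic Wallace rule, Tm ≈ 2·(A+T) + 4·(G+C) °C, a rough rule of thumb valid only for short oligonucleotides (the probe sequence below is made up for illustration):

```python
# GC content and a rough melting-temperature estimate (Wallace rule,
# applicable to short oligos of ~14 nt or fewer; illustrative only).
def gc_content(seq: str) -> float:
    return (seq.count("G") + seq.count("C")) / len(seq)

def wallace_tm(seq: str) -> int:
    at = seq.count("A") + seq.count("T")
    gc = seq.count("G") + seq.count("C")
    return 2 * at + 4 * gc          # GC pairs contribute twice as much

probe = "ATGCGCGCTAAT"              # hypothetical 12-nt probe
print(f"GC fraction: {gc_content(probe):.2f}")  # -> GC fraction: 0.50
print(f"Wallace Tm: {wallace_tm(probe)} C")     # -> Wallace Tm: 36 C
```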
Sense and antisense
A DNA sequence is called "sense" if its sequence is the same as that of a messenger RNA copy that is translated into protein.[20] The sequence on the opposite strand is called the "antisense" sequence. Both sense and antisense sequences can exist on different parts of the same strand of DNA (i.e. both strands contain both sense and antisense sequences). In both prokaryotes and eukaryotes, antisense RNA sequences are produced, but the functions of these RNAs are not entirely clear.[21] One proposal is that antisense RNAs are involved in regulating gene expression through RNA-RNA base pairing.

A few DNA sequences in prokaryotes and eukaryotes, and more in plasmids and viruses, blur the distinction between sense and antisense strands by having overlapping genes.[23] In these cases, some DNA sequences do double duty, encoding one protein when read along one strand, and a second protein when read in the opposite direction along the other strand. In bacteria, this overlap may be involved in the regulation of gene transcription,[24] while in viruses, overlapping genes increase the amount of information that can be encoded within the small viral genome.
Supercoiling
DNA can be twisted like a rope in a process called DNA supercoiling. With DNA in its "relaxed" state, a strand usually circles the axis of the double helix once every 10.4 base pairs, but if the DNA is twisted the strands become more tightly or more loosely wound.[26] If the DNA is twisted in the direction of the helix, this is positive supercoiling, and the bases are held more tightly together. If they are twisted in the opposite direction, this is negative supercoiling, and the bases come apart more easily. In nature, most DNA has slight negative supercoiling that is introduced by enzymes called topoisomerases.[27] These enzymes are also needed to relieve the twisting stresses introduced into DNA strands during processes such as transcription and DNA replication.
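The degree of supercoiling is commonly quantified by the superhelical density σ = (Lk − Lk0)/Lk0, where Lk0 is the linking number of the relaxed molecule (about one turn per 10.4 bp, as above). A small worked example with an assumed plasmid size:

```python
# Supercoiling sketch: a relaxed circular DNA of N bp has linking number
# Lk0 ~ N / 10.4 (one helical turn per ~10.4 bp).  Natural DNA is
# slightly underwound, giving a small negative superhelical density.
N = 4_000                 # hypothetical small plasmid size in bp
Lk0 = N / 10.4            # turns when relaxed
Lk = Lk0 - 20             # assume topoisomerases removed 20 turns

sigma = (Lk - Lk0) / Lk0
print(f"sigma = {sigma:.3f}")   # -> sigma = -0.052 (negative supercoiling)
```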
Alternate DNA structures
DNA exists in many possible conformations that include A-DNA, B-DNA, and Z-DNA forms, although, only B-DNA and Z-DNA have been directly observed in functional organisms.[10] The conformation that DNA adopts depends on the hydration level, DNA sequence, the amount and direction of supercoiling, chemical modifications of the bases, the type and concentration of metal ions, as well as the presence of polyamines in solution.[29]

The first published reports of X-ray diffraction patterns of A-DNA, and also of B-DNA, used analyses based on Patterson transforms that provided only a limited amount of structural information for oriented fibers of DNA.[30][31] An alternate analysis was then proposed by Wilkins et al., in 1953, for the in vivo B-DNA X-ray diffraction/scattering patterns of highly hydrated DNA fibers in terms of squares of Bessel functions.[32] In the same journal, James D. Watson and Francis Crick presented their molecular modeling analysis of the DNA X-ray diffraction patterns to suggest that the structure was a double-helix.[7]

Although the `B-DNA form' is most common under the conditions found in cells,[33] it is not a well-defined conformation but a family of related DNA conformations[34] that occur at the high hydration levels present in living cells. Their corresponding X-ray diffraction and scattering patterns are characteristic of molecular paracrystals with a significant degree of disorder.[35][36]

Compared to B-DNA, the A-DNA form is a wider right-handed spiral, with a shallow, wide minor groove and a narrower, deeper major groove. The A form occurs under non-physiological conditions in partially dehydrated samples of DNA, while in the cell it may be produced in hybrid pairings of DNA and RNA strands, as well as in enzyme-DNA complexes.[37][38] Segments of DNA where the bases have been chemically modified by methylation may undergo a larger change in conformation and adopt the Z form. Here, the strands turn about the helical axis in a left-handed spiral, the opposite of the more common B form.[39] These unusual structures can be recognized by specific Z-DNA binding proteins and may be involved in the regulation of transcription.

Monday, June 28, 2010

Human Genome Project


The Human Genome Project (HGP) was an international scientific research project with a primary goal to determine the sequence of chemical base pairs which make up DNA and to identify and map the approximately 20,000–25,000 genes of the human genome from both a physical and functional standpoint.[1]

The project began in 1990 and was initially headed by James D. Watson at the U.S. National Institutes of Health. A working draft of the genome was released in 2000 and a complete one in 2003, with further analysis still being published. A parallel project was conducted outside of government by the Celera Corporation. Most of the government-sponsored sequencing was performed in universities and research centers from the United States, the United Kingdom, Japan, France, Germany, China, India, Canada, and New Zealand. The mapping of human genes is an important step in the development of medicines and other aspects of health care.

While the objective of the Human Genome Project is to understand the genetic makeup of the human species, the project has also focused on several other nonhuman organisms such as E. coli, the fruit fly, and the laboratory mouse. It remains one of the largest single investigational projects in modern science.

The Human Genome Project originally aimed to map the nucleotides contained in a human haploid reference genome (more than three billion). Several groups have announced efforts to extend this to diploid human genomes including the International HapMap Project, Applied Biosystems, Perlegen, Illumina, JCVI, Personal Genome Project, and Roche-454.

The "genome" of any given individual (except for identical twins and cloned organisms) is unique; mapping "the human genome" involves sequencing multiple variations of each gene. The project did not study the entire DNA found in human cells; some heterochromatic areas (about 8% of the total genome) remain un-sequenced.
Project

Background
The project began with the culmination of several years of work supported by the United States Department of Energy, in particular workshops in 1984 [2] and 1986 and a subsequent initiative of the US Department of Energy.[3] This 1987 report stated boldly, "The ultimate goal of this initiative is to understand the human genome" and "knowledge of the human as necessary to the continuing progress of medicine and other health sciences as knowledge of human anatomy has been for the present state of medicine." Candidate technologies were already being considered for the proposed undertaking at least as early as 1985.[4]

James D. Watson was head of the National Center for Human Genome Research at the National Institutes of Health (NIH) in the United States starting from 1988. Largely due to his disagreement with his boss, Bernadine Healy, over the issue of patenting genes, Watson was forced to resign in 1992. He was replaced by Francis Collins in April 1993, and the name of the Center was changed to the National Human Genome Research Institute (NHGRI) in 1997.

The $3-billion project was formally founded in 1990 by the United States Department of Energy and the U.S. National Institutes of Health, and was expected to take 15 years. In addition to the United States, the international consortium comprised geneticists in the United Kingdom, France, Germany, Japan, China, and India.

Due to widespread international cooperation and advances in the field of genomics (especially in sequence analysis), as well as major advances in computing technology, a 'rough draft' of the genome was finished in 2000 (announced jointly by then US president Bill Clinton and the British Prime Minister Tony Blair on June 26, 2000).[5] This first available rough draft assembly of the genome was completed by the UCSC Genome Bioinformatics Group, primarily led by then graduate student Jim Kent. Ongoing sequencing led to the announcement of the essentially complete genome in April 2003, 2 years earlier than planned.[6] In May 2006, another milestone was passed on the way to completion of the project, when the sequence of the last chromosome was published in the journal Nature.[7]
State of completion

There are multiple definitions of the "complete sequence of the human genome". According to some of these definitions, the genome has already been completely sequenced, and according to other definitions, the genome has yet to be completely sequenced. There have been multiple popular press articles reporting that the genome was "complete." The genome has been completely sequenced using the definition employed by the International Human Genome Project. A graphical history of the human genome project shows that most of the human genome was complete by the end of 2003. However, there are a number of regions of the human genome that can be considered unfinished:

* First, the central regions of each chromosome, known as centromeres, are highly repetitive DNA sequences that are difficult to sequence using current technology. The centromeres are millions (possibly tens of millions) of base pairs long, and for the most part these are entirely un-sequenced.
* Second, the ends of the chromosomes, called telomeres, are also highly repetitive, and for most of the 46 chromosome ends these too are incomplete. It is not known precisely how much sequence remains before the telomeres of each chromosome are reached, but as with the centromeres, current technological constraints are prohibitive.
* Third, there are several loci in each individual's genome that contain members of multigene families that are difficult to disentangle with shotgun sequencing methods – these multigene families often encode proteins important for immune functions.
* Other than these regions, there remain a few dozen gaps scattered around the genome, some of them rather large, but there is hope that all these will be closed in the next couple of years.

In summary: the best estimates of total genome size indicate that about 92.3% of the genome has been completed [2] and it is likely that the centromeres and telomeres will remain un-sequenced until new technology is developed that facilitates their sequencing. Most of the remaining DNA is highly repetitive and unlikely to contain genes, but it cannot be truly known until it is entirely sequenced. Understanding the functions of all the genes and their regulation is far from complete. The roles of junk DNA, the evolution of the genome, the differences between individuals, and many other questions are still the subject of intense interest by laboratories all over the world.
Goals
The sequence of the human DNA is stored in databases available to anyone on the Internet. The U.S. National Center for Biotechnology Information (and sister organizations in Europe and Japan) house the gene sequence in a database known as GenBank, along with sequences of known and hypothetical genes and proteins. Other organizations such as the University of California, Santa Cruz[3], and Ensembl[4] present additional data and annotation and powerful tools for visualizing and searching it. Computer programs have been developed to analyze the data, because the data itself is difficult to interpret without such programs.

The process of identifying the boundaries between genes and other features in raw DNA sequence is called genome annotation and is the domain of bioinformatics. While expert biologists make the best annotators, their work proceeds slowly, and computer programs are increasingly used to meet the high-throughput demands of genome sequencing projects. The best current technologies for annotation make use of statistical models that take advantage of parallels between DNA sequences and human language, using concepts from computer science such as formal grammars.
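One of the simplest boundary-finding ideas, shown here as a toy (real annotators use far more sophisticated statistical models), is scanning for open reading frames: stretches running from a start codon to an in-frame stop codon.

```python
# Minimal open-reading-frame scan on one strand: find each ATG and the
# first in-frame stop codon after it.  A caricature of gene finding.
STOPS = {"TAA", "TAG", "TGA"}

def find_orfs(seq: str):
    orfs = []
    for start in range(len(seq) - 2):
        if seq[start:start + 3] != "ATG":
            continue
        for j in range(start + 3, len(seq) - 2, 3):
            if seq[j:j + 3] in STOPS:
                orfs.append((start, j + 3))   # half-open ORF bounds
                break
    return orfs

print(find_orfs("CCATGAAATGA"))  # ATG at 2, in-frame TGA at 8 -> [(2, 11)]
```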

Another, often overlooked, goal of the HGP is the study of its ethical, legal, and social implications. It is important to research these issues and find the most appropriate solutions before they become large dilemmas whose effect will manifest in the form of major political concerns.[citation needed]

All humans have unique gene sequences. Therefore the data published by the HGP does not represent the exact sequence of each and every individual's genome. It is the combined "reference genome" of a small number of anonymous donors. The HGP genome is a scaffold for future work in identifying differences among individuals. Most of the current effort in identifying differences among individuals involves single-nucleotide polymorphisms and the HapMap.
Interpretations

Key findings of the draft (2001) and complete (2004) genome sequences include[citation needed]

1. There are approx. 24,000 genes in human beings, the same range as in mice and twice that of roundworms. Understanding how these genes express themselves will provide clues to how diseases are caused.[citation needed]

2. Between 1.1% and 1.4% of the genome's sequence codes for proteins.

3. The human genome has significantly more segmental duplications (near-identical repeated sections of DNA) than other mammalian genomes. These sections may underlie the creation of new primate-specific genes.

4. At the time when the draft sequence was published, less than 7% of protein families appeared to be vertebrate-specific.
How it was accomplished
The Human Genome Project was started in 1989 with the goal of sequencing and identifying all three billion chemical units in the human genetic instruction set, finding the genetic roots of disease and then developing treatments. With the sequence in hand, the next step was to identify the genetic variants that increase the risk for common diseases like cancer and diabetes.

It was far too expensive at that time to think of sequencing patients’ whole genomes. So the National Institutes of Health embraced the idea for a "shortcut", which was to look just at sites on the genome where many people have a variant DNA unit. The theory behind the shortcut was that since the major diseases are common, so too would be the genetic variants that caused them. Natural selection keeps the human genome free of variants that damage health before children are grown, the theory held, but fails against variants that strike later in life, allowing them to become quite common. (In 2002 the National Institutes of Health started a $138 million project called the HapMap to catalog the common variants in European, East Asian and African genomes.)

The genome was broken into smaller pieces; approximately 150,000 base pairs in length. These pieces were then ligated into a type of vector known as "bacterial artificial chromosomes", or BACs, which are derived from bacterial chromosomes which have been genetically engineered. The vectors containing the genes can be inserted into bacteria where they are copied by the bacterial DNA replication machinery. Each of these pieces was then sequenced separately as a small "shotgun" project and then assembled. The larger, 150,000 base pairs go together to create chromosomes. This is known as the "hierarchical shotgun" approach, because the genome is first broken into relatively large chunks, which are then mapped to chromosomes before being selected for sequencing.
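The hierarchy described above (genome → BAC-sized pieces → shotgun reads) can be sketched with toy sizes; in reality the pieces are ~150,000 bp and reads are a few hundred bp, so everything here is scaled down and hypothetical:

```python
# Sketch of the hierarchical shotgun idea: split a toy genome into large
# BAC-like pieces, then shatter each piece into short reads.
import random

def chunk(seq: str, size: int):
    return [seq[i:i + size] for i in range(0, len(seq), size)]

genome = "".join(random.choice("ACGT") for _ in range(1_000))
bacs = chunk(genome, 150)            # stands in for ~150,000 bp BAC clones
reads = [r for bac in bacs for r in chunk(bac, 25)]   # "shotgun" reads

# Rejoining in order recovers the genome; real assembly must infer order.
assert "".join(reads) == genome
print(len(bacs), len(reads))         # -> 7 40
```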

Funding came from the US government through the National Institutes of Health in the United States, and the UK charity, the Wellcome Trust, who funded the Sanger Institute (then the Sanger Centre) in Great Britain, as well as numerous other groups from around the world.

Human Genome Project has been called a Mega Project because of the following factors:

1. The human genome has approx. 3.3 billion base-pairs; if the cost of sequencing is US $3 per base-pair, then the approx. cost will be US $10 billion.

2. If the sequence obtained were to be stored in a typed form in books and if each page contains 1000 letters and each book contains 1000 pages, then 3300 such books would be needed to store the complete information.

However, if expressed in computer storage units, (3.3 billion base-pairs) x (2 bits per pair) = 825 megabytes of raw data, which is about the size of a single music CD. If further compressed, this data could be expected to fit in less than 20 megabytes.
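The arithmetic in the last two points can be checked directly (using 1 MB = 10^6 bytes, as the text does):

```python
# Two bits encode one of the four bases, so the raw genome is small.
base_pairs = 3_300_000_000
bits = base_pairs * 2
megabytes = bits / 8 / 1_000_000       # 1 MB = 10^6 bytes
print(f"{megabytes:.0f} MB")           # -> 825 MB, about one music CD

letters_per_book = 1_000 * 1_000       # 1000 letters/page x 1000 pages
print(base_pairs // letters_per_book)  # -> 3300 books
```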
Public versus private approaches

In 1998, a similar, privately funded quest was launched by the American researcher Craig Venter, and his firm Celera Genomics. Venter was a scientist at the NIH during the early 1990s when the project was initiated. The $300,000,000 Celera effort was intended to proceed at a faster pace and at a fraction of the cost of the roughly $3 billion publicly funded project.

Celera used a technique called whole genome shotgun sequencing, employing pairwise end sequencing[8], which had been used to sequence bacterial genomes of up to six million base pairs in length, but not for anything nearly as large as the three billion base pair human genome.

Celera initially announced that it would seek patent protection on "only 200–300" genes, but later amended this to seeking "intellectual property protection" on "fully characterized important structures" amounting to 100–300 targets. The firm eventually filed preliminary ("place-holder") patent applications on 6,500 whole or partial genes. Celera also promised to publish its findings in accordance with the terms of the 1996 "Bermuda Statement," by releasing new data annually (the HGP released its new data daily), although, unlike the publicly funded project, it would not permit free redistribution or scientific use of the data. For this reason the publicly funded project was compelled to publish its first draft of the human genome before Celera. On July 7, 2000, the UCSC Genome Bioinformatics Group released a first working draft on the web. The scientific community downloaded about half a trillion bytes of information from the UCSC genome server in the first 24 hours of free and unrestricted access to the first assembled blueprint of the human species.[9]

In March 2000, President Clinton announced that the genome sequence could not be patented, and should be made freely available to all researchers. The statement sent Celera's stock plummeting and dragged down the biotechnology-heavy Nasdaq. The biotechnology sector lost about $50 billion in market capitalization in two days.

Although the working draft was announced in June 2000, it was not until February 2001 that Celera and the HGP scientists published details of their drafts. Special issues of Nature (which published the publicly funded project's scientific paper)[10] and Science (which published Celera's paper)[11] described the methods used to produce the draft sequence and offered analysis of the sequence. These drafts covered about 83% of the genome (90% of the euchromatic regions, with 150,000 gaps and the order and orientation of many segments not yet established). In February 2001, at the time of the joint publications, press releases announced that the project had been completed by both groups. Improved drafts were announced in 2003 and 2005, bringing coverage to ≈92% of the sequence.

The competition proved to be very good for the project, spurring the public groups to modify their strategy in order to accelerate progress. The rivals initially agreed to pool their data, but the agreement fell apart when Celera refused to deposit its data in the unrestricted public database GenBank. Celera had incorporated the public data into its genome, but forbade the public effort to use Celera data.

The HGP is the best known of many international genome projects aimed at sequencing the DNA of a specific organism. While the human DNA sequence offers the most tangible benefits, important developments in biology and medicine are predicted as a result of the sequencing of model organisms, including mice, fruit flies, zebrafish, yeast, nematodes, plants, and many microbial organisms and parasites.

In 2004, researchers from the International Human Genome Sequencing Consortium (IHGSC) of the HGP announced a new estimate of 20,000 to 25,000 genes in the human genome.[12] Previously 30,000 to 40,000 had been predicted, while estimates at the start of the project ranged as high as 2,000,000. The number continues to fluctuate and it is now expected that it will take many years to agree on a precise value for the number of genes in the human genome.
History
In 1976, the genome of the RNA bacteriophage MS2 became the first complete genome to be determined, by Walter Fiers and his team at the University of Ghent (Ghent, Belgium).[13] The idea for the shotgun technique came from the use of an algorithm that combined sequence information from many small fragments of DNA to reconstruct a genome. The technique was pioneered by Frederick Sanger to sequence the genome of the bacteriophage Φ-X174, which in 1977 became the first fully sequenced DNA genome.[14] The technique was called shotgun sequencing because the genome was broken into millions of pieces, as if it had been blasted with a shotgun. To scale up the method, both the sequencing and the genome assembly had to be automated, as they were in the 1980s.

Those techniques were shown to be applicable to sequencing the first free-living bacterial genome (the 1.8 million base pairs of Haemophilus influenzae) in 1995[15] and the first animal genome (~100 Mbp).[16] The method used automated sequencers that produced individual reads of approximately 500 base pairs at that time. Paired reads separated by a fixed distance of around 2,000 base pairs were the critical element enabling the development of the first genome assembly programs for reconstructing large regions of genomes (known as 'contigs').

Three years later, in 1998, the announcement by the newly formed Celera Genomics that it would scale up the pairwise end sequencing method to the human genome was greeted with skepticism in some circles. The shotgun technique breaks the DNA into fragments of various sizes, ranging from 2,000 to 300,000 base pairs in length, forming what is called a DNA "library". Using an automated DNA sequencer, the DNA is read in 800 bp lengths from both ends of each fragment. Using a complex genome assembly algorithm and a supercomputer, the pieces are combined and the genome can be reconstructed from the millions of short, 800 base pair fragments. The success of both the public and the privately funded effort hinged upon a new, more highly automated capillary DNA sequencing machine, the Applied Biosystems 3700, which ran the DNA sequences through an extremely fine capillary tube rather than a flat gel. Even more critical was the development of a new, larger-scale genome assembly program, which could handle the 30–50 million sequences that would be required to sequence the entire human genome with this method. At the time, no such program existed. One of the first major projects at Celera Genomics was the development of this assembler, which was written in parallel with the construction of a large, highly automated genome sequencing factory. Development of the assembler was led by Gene Myers. The first version of this assembler was demonstrated in 2000, when the Celera team joined forces with Professor Gerald Rubin to sequence the fruit fly Drosophila melanogaster using the whole-genome shotgun method.[17] At 130 million base pairs, it was at least 10 times larger than any genome previously shotgun assembled. One year later, the Celera team published their assembly of the three billion base pair human genome.
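
The assembly step can be illustrated with a toy greedy overlap merger. This is only a sketch of the general shotgun-assembly idea, not Celera's algorithm: real assemblers add mate-pair constraints from the paired end reads, sequencing-error tolerance, and repeat resolution, none of which is modeled here.

```python
# Toy greedy overlap assembler: repeatedly merge the two reads with the
# largest suffix/prefix overlap until no sufficient overlap remains.
def overlap(a, b, min_len=3):
    """Length of the longest suffix of a that is a prefix of b."""
    for k in range(min(len(a), len(b)), min_len - 1, -1):
        if a.endswith(b[:k]):
            return k
    return 0

def greedy_assemble(reads):
    reads = list(reads)
    while len(reads) > 1:
        best = (0, None, None)
        for i, a in enumerate(reads):
            for j, b in enumerate(reads):
                if i != j:
                    k = overlap(a, b)
                    if k > best[0]:
                        best = (k, i, j)
        k, i, j = best
        if k == 0:                     # no overlaps left: separate contigs
            break
        merged = reads[i] + reads[j][k:]
        reads = [r for n, r in enumerate(reads) if n not in (i, j)] + [merged]
    return reads

reads = ["TTACGGA", "CGGATGC", "ATGCCAA"]
print(greedy_assemble(reads))  # reconstructs "TTACGGATGCCAA"
```

With these three invented reads, the two 4-letter overlaps are found and merged in turn, recovering a single contig.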

The Human Genome Project was a 13-year mega project, launched in 1990 and completed in 2003. The project is closely associated with the branch of biology called bioinformatics. The Human Genome Project international consortium announced the publication of a draft sequence and analysis of the human genome—the genetic blueprint for the human being. An American company, Celera, led by Craig Venter, and a huge international collaboration of distinguished scientists led by Francis Collins, director of the U.S. National Human Genome Research Institute, both published their findings.

This Mega Project was coordinated by the U.S. Department of Energy and the National Institutes of Health. During the early years of the project, the Wellcome Trust (U.K.) became a major partner, and other countries such as Japan, Germany, China and France contributed significantly. Already the atlas has revealed some startling facts. The two factors that made this project a success are:

1. Genetic engineering techniques, with which it is possible to isolate and clone any segment of DNA.
2. The availability of simple and fast technologies for determining DNA sequences.

As the most complex organisms, human beings were expected to have more than 100,000 genes, the combinations of DNA that provide commands for every characteristic of the body. Instead, the studies show that humans have only about 30,000 genes – around the same as mice, three times as many as flies, and only five times more than bacteria. Scientists noted that not only are the numbers similar, the genes themselves, barring a few, are alike in mice and men.

In a companion volume to the Book of Life, scientists have created a catalogue of 1.4 million single-letter differences, or single-nucleotide polymorphisms (SNPs), and specified their exact locations in the human genome. This SNP map, the world's largest publicly available catalogue of SNPs, promises to revolutionize both the mapping of diseases and the tracing of human history. The sequence information from the consortium has been immediately and freely released to the world, with no restrictions on its use or redistribution. The information is scanned daily by scientists in academia and industry, as well as by commercial database companies providing key information services to biotechnologists. Already, many genes have been identified from the genome sequence, including more than 30 that play a direct role in human diseases.

By dating the three million repeat elements and examining the pattern of interspersed repeats on the Y chromosome, scientists estimated the relative mutation rates in the X and Y chromosomes and in the male and female germ lines. They found that the ratio of mutations in males vs. females is 2:1. Scientists point to several possible reasons for the higher mutation rate in the male germ line, including the fact that a greater number of cell divisions is involved in the formation of sperm than in the formation of eggs.
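
A SNP catalogue of the kind described above records, for each variable site, its position and the letters observed there. A minimal sketch, using invented sequences in place of real genomic data:

```python
# Sketch of cataloguing single-nucleotide polymorphisms (SNPs): scan two
# aligned sequences and record every position where a single letter differs.
def find_snps(ref, alt):
    """Return (position, ref_base, alt_base) for each single-letter difference."""
    assert len(ref) == len(alt), "sequences must be aligned to equal length"
    return [(i, r, a) for i, (r, a) in enumerate(zip(ref, alt)) if r != a]

ref = "ACGTTAGCGGA"   # invented reference sequence
alt = "ACGTCAGCGGT"   # invented variant sequence
print(find_snps(ref, alt))  # → [(4, 'T', 'C'), (10, 'A', 'T')]
```
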
Methods

The IHGSC used paired-end sequencing plus whole-genome shotgun mapping of large (≈100 Kbp) plasmid clones, shotgun sequencing of smaller plasmid sub-clones, and a variety of other mapping data to orient and check the assembly of each human chromosome.[10]

The Celera group emphasized the importance of the "whole-genome shotgun" sequencing method, relying on sequence information alone to orient and locate their fragments within the chromosome. However, they used the publicly available data from the HGP to assist in the assembly and orientation process, raising concerns that the Celera sequence was not independently derived.
Genome donors

In the IHGSC international public-sector Human Genome Project (HGP), researchers collected blood (female) or sperm (male) samples from a large number of donors. Only a few of the many collected samples were processed as DNA resources. Donor identities were protected, so that neither donors nor scientists could know whose DNA was sequenced. DNA clones from many different libraries were used in the overall project, with most of those libraries created by Dr. Pieter J. de Jong. It has been informally reported, and is well known in the genomics community, that much of the DNA for the public HGP came from a single anonymous male donor from Buffalo, New York (code name RP11).[20]

HGP scientists used white blood cells from the blood of two male and two female donors (randomly selected from 20 of each), with each donor yielding a separate DNA library. One of these libraries (RP11) was used considerably more than the others, owing to quality considerations. One minor technical issue is that male samples contain just over half as much DNA from the sex chromosomes (one X chromosome and one Y chromosome) as female samples (which contain two X chromosomes). The other 22 chromosomes (the autosomes) are the same in both sexes.

Although the main sequencing phase of the HGP has been completed, studies of DNA variation continue in the International HapMap Project, whose goal is to identify patterns of single-nucleotide polymorphism (SNP) groups (called haplotypes, or "haps"). The DNA samples for the HapMap came from a total of 270 individuals: Yoruba people in Ibadan, Nigeria; Japanese people in Tokyo; Han Chinese in Beijing; and the French Centre d'Etude du Polymorphisme Humain (CEPH) resource, which consisted of residents of the United States with ancestry from Western and Northern Europe.

In the Celera Genomics private-sector project, DNA from five different individuals was used for sequencing. Celera's lead scientist at the time, Craig Venter, later acknowledged (in a public letter to the journal Science) that his DNA was one of 21 samples in the pool, five of which were selected for use.[21][22]

On September 4, 2007, a team led by Craig Venter published his complete DNA sequence[23], unveiling the six-billion-nucleotide genome of a single individual for the first time.
Benefits

The work on interpretation of genome data is still in its initial stages. It is anticipated that detailed knowledge of the human genome will provide new avenues for advances in medicine and biotechnology. Clear practical results of the project emerged even before the work was finished. For example, a number of companies, such as Myriad Genetics, started offering easy ways to administer genetic tests that can show predisposition to a variety of illnesses, including breast cancer, disorders of hemostasis, cystic fibrosis, liver diseases and many others. Also, the etiologies of cancers, Alzheimer's disease and other areas of clinical interest are considered likely to benefit from genome information, possibly leading in the long term to significant advances in their management.

There are also many tangible benefits for biological scientists. For example, a researcher investigating a certain form of cancer may have narrowed down his/her search to a particular gene. By visiting the human genome database on the World Wide Web, this researcher can examine what other scientists have written about this gene, including (potentially) the three-dimensional structure of its product, its function(s), its evolutionary relationships to other human genes, or to genes in mice or yeast or fruit flies, possible detrimental mutations, interactions with other genes, body tissues in which this gene is activated, diseases associated with this gene or other datatypes.

Further, a deeper understanding of disease processes at the level of molecular biology may suggest new therapeutic procedures. Given the established importance of DNA in molecular biology and its central role in determining the fundamental operation of cellular processes, it is likely that expanded knowledge in this area will facilitate medical advances in numerous areas of clinical interest that may not otherwise have been possible.

The analysis of similarities between DNA sequences from different organisms is also opening new avenues in the study of evolution. In many cases, evolutionary questions can now be framed in terms of molecular biology; indeed, many major evolutionary milestones (the emergence of the ribosome and organelles, the development of embryos with body plans, the vertebrate immune system) can be related to the molecular level. Many questions about the similarities and differences between humans and our closest relatives (the primates, and indeed the other mammals) are expected to be illuminated by the data from this project.

The Human Genome Diversity Project (HGDP), spinoff research aimed at mapping the DNA that varies between human ethnic groups, which was rumored to have been halted, actually did continue and to date has yielded new conclusions.[citation needed] In the future, HGDP could possibly expose new data in disease surveillance, human development and anthropology. HGDP could unlock secrets behind and create new strategies for managing the vulnerability of ethnic groups to certain diseases (see race in biomedicine). It could also show how human populations have adapted to these vulnerabilities.

Advantages of Human Genome Project:

1. Knowledge of the effects of DNA variation among individuals can revolutionize the ways we diagnose, treat and even prevent a number of diseases that affect human beings.
2. It provides clues to the understanding of human biology.

Criticisms

For biologists, the genome has yielded one insightful surprise after another. But the primary goal of the Human Genome Project — to ferret out the genetic roots of common diseases like cancer and Alzheimer’s and then generate treatments — has been largely elusive[24].

One sign of the genome’s limited use for medicine so far was a recent test of genetic predictions for heart disease. A medical team from Brigham and Women’s Hospital in Boston collected 101 genetic variants that had been statistically linked to heart disease in various genome-scanning studies. But the variants turned out to have no value in forecasting disease among 19,000 women who had been followed for 12 years. The old-fashioned method of taking a family history was a better guide.[25]

The pharmaceutical industry has spent billions of dollars to reap genomic secrets and is starting to bring several genome-guided drugs to market. While drug companies continue to pour huge amounts of money into genome research, it has become clear that the genetics of most diseases are more complex than anticipated and that it will take many more years before new treatments may be able to transform medicine.

The last decade has brought a flood of discoveries of disease-causing mutations in the human genome. But with most diseases, the findings have explained only a small part of the risk of getting the disease. And many of the genetic variants linked to diseases, some scientists[who?] have begun to fear, could be statistical illusions.

Using the HapMap catalog of genetic variations, studies were conducted to see if any of the variants were more common in the patients with a given disease than in healthy people. These studies required large numbers of patients and cost several million dollars apiece. Nearly 400 of them had been completed by 2009. These studies revealed that although hundreds of common genetic variants have been statistically linked with various diseases, with most diseases, the common variants have turned out to explain just a fraction of the genetic risk. It now seems more likely that each common disease is mostly caused by large numbers of rare variants, ones too rare to have been cataloged by the HapMap.
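
The statistical core of such an association study can be sketched as a 2×2 contingency test: variant carriers versus non-carriers, among patients versus healthy people. The counts below are invented for illustration; real studies must also correct for testing hundreds of thousands of variants at once.

```python
# Pearson chi-square test on a 2x2 table, the basic statistic behind a
# genome-wide association study. All counts here are made up.
def chi_square_2x2(a, b, c, d):
    """Chi-square statistic for the table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# variant carriers vs non-carriers, in patients and in healthy controls
cases_with, cases_without = 240, 760
controls_with, controls_without = 180, 820
chi2 = chi_square_2x2(cases_with, cases_without, controls_with, controls_without)
print(f"chi-square = {chi2:.2f}")   # compare with 3.84 (p = 0.05, 1 d.o.f.)
```

A value above 3.84 would be nominally significant at p = 0.05 for one degree of freedom, but genome-wide studies demand far stricter thresholds precisely because so many variants are tested at once.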

Defenders of the HapMap and genome-wide association studies say that the approach made sense because it is only now becoming cheap enough to look for rare variants, and that many common variants do have roles in diseases.

As of June 2010, some 850 sites on the genome, most of them near genes, have been implicated in common diseases. But most of the sites linked with diseases are not in genes — the stretches of DNA that tell the cell to make proteins — and have no known biological function, leading some geneticists[who?] to suspect that the associations are spurious.

Many of them may stem from factors other than a true association with disease risk.[26] The recent turn among geneticists toward seeing rare variants as the major cause of common disease represents a major paradigm shift in human genetics.
Ethical, legal and social issues

The project's goals included not only identifying all of the approximately 24,000 genes in the human genome, but also addressing the ethical, legal, and social issues (ELSI) that might arise from the availability of genetic information. Five percent of the annual budget was allocated to addressing the ELSI arising from the project.

Debra Harry, Executive Director of the U.S. group Indigenous Peoples Council on Biocolonialism (IPCB), says that despite a decade of ELSI funding, the burden of genetics education has fallen on the tribes themselves to understand the motives of the Human Genome Project and its potential impacts on their lives. Meanwhile, the government has been busily funding projects studying indigenous groups without any meaningful consultation with the groups. (See Biopiracy.)[27]

The main criticism of ELSI is the failure to address the conditions raised by population-based research, especially with regard to unique processes for group decision-making and cultural worldviews. Genetic variation research such as HGP is group population research, but most ethical guidelines, according to Harry, focus on individual rights instead of group rights. She says the research represents a clash of culture: indigenous people's life revolves around collectivity and group decision making whereas the Western culture promotes individuality. Harry suggests that one of the challenges of ethical research is to include respect for collective review and decision making, while also upholding the Western model of individual rights.

Nuclear physics

Nuclear physics is the field of physics that studies the building blocks and interactions of atomic nuclei. The most commonly known applications of nuclear physics are nuclear power and nuclear weapons, but the research has provided wider applications, including those in medicine (nuclear medicine, magnetic resonance imaging), materials engineering (ion implantation) and archaeology (radiocarbon dating).

The field of particle physics evolved out of nuclear physics and, for this reason, was in earlier times included under the same term.
History

The discovery of the electron by J. J. Thomson was the first indication that the atom had internal structure. At the turn of the 20th century the accepted model of the atom was J. J. Thomson's "plum pudding" model, in which the atom was a large positively charged ball with small negatively charged electrons embedded inside it. By the turn of the century physicists had also discovered three types of radiation coming from atoms, which they named alpha, beta, and gamma radiation. Experiments in 1911 by Lise Meitner and Otto Hahn, and in 1914 by James Chadwick, showed that the beta decay spectrum was continuous rather than discrete. That is, electrons were ejected from the atom with a range of energies, rather than the discrete amounts of energy observed in gamma and alpha decays. This was a problem for nuclear physics at the time, because it indicated that energy was not conserved in these decays.

In 1905, Albert Einstein formulated the idea of mass–energy equivalence. While the work on radioactivity by Becquerel, Pierre and Marie Curie predates this, an explanation of the source of the energy of radioactivity would have to wait for the discovery that the nucleus itself was composed of smaller constituents, the nucleons.
Rutherford's team discovers the nucleus

In 1907 Ernest Rutherford published "Radiation of the α Particle from Radium in passing through Matter." Geiger expanded on this work in a communication to the Royal Society, with experiments he and Rutherford had done passing α particles through air, aluminum foil and gold leaf. More work was published in 1909 by Geiger and Marsden, and further greatly expanded work was published in 1910 by Geiger.[4] In 1911–1912 Rutherford went before the Royal Society to explain the experiments and propound the new theory of the atomic nucleus as we now understand it.

The key experiment behind this announcement took place in 1910, when Hans Geiger and Ernest Marsden, under Rutherford's supervision, fired alpha particles (helium nuclei) at a thin film of gold foil. The plum pudding model predicted that the alpha particles should come out of the foil with their trajectories at most slightly bent. Rutherford instructed his team to look for something he was shocked to actually observe: a few particles were scattered through large angles, even completely backwards in some cases. He likened it to firing a bullet at tissue paper and having it bounce off. The discovery, beginning with Rutherford's analysis of the data in 1911, eventually led to the Rutherford model of the atom, in which the atom has a very small, very dense nucleus containing most of its mass and consisting of heavy positively charged particles with embedded electrons to balance out the charge (since the neutron was unknown). As an example, in this model (which is not the modern one) nitrogen-14 consisted of a nucleus with 14 protons and 7 electrons (21 particles in total), and the nucleus was surrounded by 7 more orbiting electrons.

The Rutherford model worked quite well until studies of nuclear spin were carried out by Franco Rasetti at the California Institute of Technology in 1929. By 1925 it was known that protons and electrons had a spin of 1/2, and in the Rutherford model of nitrogen-14, 20 of the total 21 nuclear particles should have paired up to cancel each other's spin, and the final odd particle should have left the nucleus with a net spin of 1/2. Rasetti discovered, however, that nitrogen-14 has a spin of 1.
James Chadwick discovers the neutron

In 1932 Chadwick realized that radiation observed by Walther Bothe, Herbert L. Becker, and Irène and Frédéric Joliot-Curie was actually due to a neutral particle of about the same mass as the proton, which he called the neutron (following a suggestion by Rutherford about the need for such a particle). In the same year Dmitri Ivanenko suggested that neutrons were in fact spin-1/2 particles, that the nucleus contained neutrons to account for the mass not due to protons, and that there were no electrons in the nucleus—only protons and neutrons. The neutron spin immediately solved the problem of the spin of nitrogen-14: the one unpaired proton and one unpaired neutron in this model each contribute a spin of 1/2 in the same direction, giving a final total spin of 1.

With the discovery of the neutron, scientists could at last calculate what fraction of binding energy each nucleus had, by comparing the nuclear mass with that of the protons and neutrons which composed it. Differences between nuclear masses were calculated in this way and—when nuclear reactions were measured—were found to agree with Einstein's calculation of the equivalence of mass and energy to high accuracy (within 1% as of 1934).
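
That calculation can be reproduced for helium-4 using modern mass values (in atomic mass units, with 1 u equivalent to 931.494 MeV):

```python
# Binding energy of helium-4 from the mass defect: compare the nuclear
# mass with the summed masses of its constituent nucleons (E = mc^2).
m_proton  = 1.007276  # u
m_neutron = 1.008665  # u
m_he4     = 4.001506  # u (nuclear mass of helium-4)
u_to_mev  = 931.494   # energy equivalent of 1 u, in MeV

mass_defect = 2 * m_proton + 2 * m_neutron - m_he4
binding_energy = mass_defect * u_to_mev
print(f"{binding_energy:.1f} MeV total, "
      f"{binding_energy / 4:.2f} MeV per nucleon")
```

The result, about 28.3 MeV in total (roughly 7.07 MeV per nucleon), matches the measured binding energy of helium-4.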
Yukawa's meson postulated to bind nuclei

In 1935 Hideki Yukawa proposed the first significant theory of the strong force to explain how the nucleus holds together. In the Yukawa interaction a virtual particle, later called a meson, mediated a force between all nucleons, including protons and neutrons. This force explained why nuclei did not disintegrate under the influence of proton repulsion, and it also gave an explanation of why the attractive strong force had a more limited range than the electromagnetic repulsion between protons. Later, the discovery of the pi meson showed it to have the properties of Yukawa's particle.

With Yukawa's papers, the modern model of the atom was complete. The center of the atom contains a tight ball of neutrons and protons, which is held together by the strong nuclear force, unless it is too large. Unstable nuclei may undergo alpha decay, in which they emit an energetic helium nucleus, or beta decay, in which they eject an electron (or positron). After one of these decays the resultant nucleus may be left in an excited state, and in this case it decays to its ground state by emitting high energy photons (gamma decay).

The study of the strong and weak nuclear forces (the latter explained by Enrico Fermi via Fermi's interaction in 1934) led physicists to collide nuclei and electrons at ever higher energies. This research became the science of particle physics, the crown jewel of which is the standard model of particle physics which unifies the strong, weak, and electromagnetic forces.
Modern nuclear physics
A heavy nucleus can contain hundreds of nucleons which means that with some approximation it can be treated as a classical system, rather than a quantum-mechanical one. In the resulting liquid-drop model, the nucleus has an energy which arises partly from surface tension and partly from electrical repulsion of the protons. The liquid-drop model is able to reproduce many features of nuclei, including the general trend of binding energy with respect to mass number, as well as the phenomenon of nuclear fission.
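
The liquid-drop picture is usually written down as the semi-empirical (Bethe–Weizsäcker) mass formula. The coefficients below, in MeV, are one common textbook fit; published fits differ slightly:

```python
# Liquid-drop (semi-empirical) binding energy, in MeV, for a nucleus
# with Z protons and A nucleons. Coefficients are a common textbook set.
def binding_energy(Z, A):
    N = A - Z
    pairing = 12.0 / A**0.5
    if Z % 2 == 0 and N % 2 == 0:
        delta = +pairing               # even-even: extra binding
    elif Z % 2 == 1 and N % 2 == 1:
        delta = -pairing               # odd-odd: less binding
    else:
        delta = 0.0
    return (15.8 * A                             # volume term
            - 18.3 * A**(2 / 3)                  # surface tension
            - 0.714 * Z * (Z - 1) / A**(1 / 3)   # Coulomb repulsion of protons
            - 23.2 * (A - 2 * Z)**2 / A          # symmetry term
            + delta)

for Z, A, name in [(26, 56, "Fe-56"), (92, 238, "U-238")]:
    print(f"{name}: {binding_energy(Z, A) / A:.2f} MeV per nucleon")
```

The formula reproduces the general trend mentioned above: binding energy per nucleon peaks near iron (about 8.8 MeV) and declines toward heavy nuclei such as uranium (about 7.6 MeV), which is ultimately why fission of heavy nuclei releases energy.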

Superimposed on this classical picture, however, are quantum-mechanical effects, which can be described using the nuclear shell model, developed in large part by Maria Goeppert-Mayer. Nuclei with certain numbers of neutrons and protons (the magic numbers 2, 8, 20, 50, 82, 126, ...) are particularly stable, because their shells are filled.

Other more complicated models for the nucleus have also been proposed, such as the interacting boson model, in which pairs of neutrons and protons interact as bosons, analogously to Cooper pairs of electrons.

Much of current research in nuclear physics relates to the study of nuclei under extreme conditions such as high spin and excitation energy. Nuclei may also have extreme shapes (similar to that of Rugby balls) or extreme neutron-to-proton ratios. Experimenters can create such nuclei using artificially induced fusion or nucleon transfer reactions, employing ion beams from an accelerator. Beams with even higher energies can be used to create nuclei at very high temperatures, and there are signs that these experiments have produced a phase transition from normal nuclear matter to a new state, the quark-gluon plasma, in which the quarks mingle with one another, rather than being segregated in triplets as they are in neutrons and protons.
Modern topics in nuclear physics
There are 80 elements that have at least one stable isotope (an isotope never observed to decay), and in total there are about 256 such stable isotopes. However, there are thousands more well-characterized isotopes that are unstable. These radioisotopes decay over timescales ranging from fractions of a second to weeks, years, or many billions of years.

If a nucleus has too few or too many neutrons it may be unstable, and will decay after some period of time. For example, in a process called beta decay, a nitrogen-16 atom (7 protons, 9 neutrons) is converted to an oxygen-16 atom (8 protons, 8 neutrons) within a few seconds of being created. In this decay a neutron in the nitrogen nucleus is turned into a proton, an electron and an antineutrino by the weak nuclear force. The element is transmuted to another element in the process, because while it previously had seven protons (which makes it nitrogen), it now has eight (which makes it oxygen).
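
The "period of time" after which a nucleus decays is governed by the exponential decay law N(t) = N₀ · 2^(−t/t½). Nitrogen-16's half-life is about 7.13 seconds, which is why it converts to oxygen-16 within a few seconds:

```python
# Exponential radioactive decay: fraction of nuclei surviving after time t.
import math

def remaining(n0, t, t_half):
    """Number of undecayed nuclei after time t (same units as t_half)."""
    return n0 * math.exp(-t * math.log(2) / t_half)

n0 = 1_000_000                     # initial population of N-16 nuclei
for t in (0, 7.13, 14.26, 71.3):   # 0, 1, 2 and 10 half-lives
    print(f"t = {t:6.2f} s: {remaining(n0, t, 7.13):,.0f} nuclei left")
```

After one half-life half the nuclei remain, and after ten half-lives fewer than one in a thousand survive; the same law applies whether the half-life is microseconds or billions of years.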

In alpha decay the radioactive element decays by emitting a helium nucleus (2 protons and 2 neutrons), giving another element, plus helium-4. In many cases this process continues through several steps of this kind, including other types of decays, until a stable element is formed.

In gamma decay, a nucleus decays from an excited state into a lower state by emitting a gamma ray. It is then stable. The element is not changed in the process.

Other more exotic decays are possible (see the main article). For example, in internal conversion decay, the energy from an excited nucleus may be used to eject one of the inner orbital electrons from the atom, in a process which produces high speed electrons, but is not beta decay, and (unlike beta decay) does not transmute one element to another.
Nuclear fusion
When two low-mass nuclei come into very close contact with each other it is possible for the strong force to fuse the two together. It takes a great deal of energy to push the nuclei close enough together for the strong force to have an effect, so the process of nuclear fusion can only take place at very high temperatures or high densities. Once the nuclei are close enough together, the strong force overcomes their electromagnetic repulsion and fuses them into a new nucleus. A very large amount of energy is released when light nuclei fuse together, because the binding energy per nucleon increases with mass number up until nickel-62. Stars like our Sun are powered by the fusion of four protons into a helium nucleus, two positrons, and two neutrinos. The uncontrolled fusion of hydrogen into helium is known as thermonuclear runaway. Research to find an economically viable method of using energy from a controlled fusion reaction is currently being undertaken by various research establishments (see JET and ITER).
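The energy released by that four-proton reaction can be estimated from the mass defect. The mass values below are standard reference data (in atomic mass units); neutrino masses are neglected, and the roughly 2 MeV later recovered when the two positrons annihilate is not included.

```python
# Rough mass-defect estimate for the net proton-proton chain:
# 4 p -> He-4 + 2 e+ + 2 neutrinos.

M_PROTON = 1.007276   # u, proton mass
M_HE4    = 4.001506   # u, helium-4 nucleus
M_E      = 0.000549   # u, electron/positron
U_TO_MEV = 931.494    # energy equivalent of 1 u in MeV

defect = 4 * M_PROTON - M_HE4 - 2 * M_E   # mass converted to energy
energy_mev = defect * U_TO_MEV
print(f"{energy_mev:.1f} MeV released")   # about 24.7 MeV
```

Adding the positron-annihilation energy brings the total to the commonly quoted ~26.7 MeV per helium nucleus produced.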
Nuclear fission
For nuclei heavier than nickel-62 the binding energy per nucleon decreases with the mass number. It is therefore possible for energy to be released if a heavy nucleus breaks apart into two lighter ones. This splitting of atoms is known as nuclear fission.
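The binding-energy-per-nucleon curve behind this statement can be reproduced approximately with the semi-empirical (Bethe-Weizsäcker) mass formula. The sketch below uses one common published set of coefficients; other fits differ slightly, and the formula is only a smooth approximation to measured binding energies.

```python
# Semi-empirical mass formula: the binding energy per nucleon rises to a
# peak near iron/nickel and falls off for heavy nuclei, which is why
# splitting a heavy nucleus (fission) releases energy.

def binding_energy(Z, A):
    """Approximate total binding energy (MeV) of a nucleus with Z protons, A nucleons."""
    a_v, a_s, a_c, a_a, a_p = 15.75, 17.8, 0.711, 23.7, 11.18  # MeV, one common fit
    N = A - Z
    be = (a_v * A                             # volume term
          - a_s * A ** (2 / 3)                # surface term
          - a_c * Z * (Z - 1) / A ** (1 / 3)  # Coulomb repulsion
          - a_a * (N - Z) ** 2 / A)           # asymmetry term
    if Z % 2 == 0 and N % 2 == 0:
        be += a_p / A ** 0.5                  # pairing: even-even nuclei
    elif Z % 2 == 1 and N % 2 == 1:
        be -= a_p / A ** 0.5                  # pairing: odd-odd nuclei
    return be

print(binding_energy(26, 56) / 56)    # iron-56: roughly 8.8 MeV per nucleon
print(binding_energy(92, 235) / 235)  # uranium-235: roughly 7.6 MeV per nucleon
```

The ~1 MeV-per-nucleon difference between uranium and its mid-mass fission fragments accounts for the ~200 MeV released per fission.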

The process of alpha decay may be thought of as a special type of spontaneous nuclear fission. This process produces a highly asymmetrical fission because the four particles which make up the alpha particle are especially tightly bound to each other, making production of this nucleus in fission particularly likely.

For certain of the heaviest nuclei which produce neutrons on fission, and which also easily absorb neutrons to initiate fission, a self-sustaining type of neutron-initiated fission can be obtained, in a so-called chain reaction. (Chain reactions were known in chemistry before physics, and in fact many familiar processes like fires and chemical explosions are chemical chain reactions.) The fission or "nuclear" chain reaction, using fission-produced neutrons, is the source of energy for nuclear power plants and fission-type nuclear bombs such as the two that the United States used against Hiroshima and Nagasaki at the end of World War II. Heavy nuclei such as uranium and thorium may undergo spontaneous fission, but they are much more likely to decay by alpha emission.
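The dynamics of a chain reaction are often summarized by the multiplication factor k, the average number of fission neutrons that go on to cause another fission. The toy model below is purely illustrative (the value of k in a real reactor depends on geometry, moderation, and composition), but it captures why k = 1 marks criticality.

```python
# Toy model of neutron multiplication across fission generations:
# k > 1 grows exponentially (supercritical), k < 1 dies out (subcritical),
# k = 1 is critical (steady state, as in a controlled reactor).

def neutron_population(k, generations, start=1.0):
    """Neutron count after each of a number of fission generations."""
    n = start
    history = [n]
    for _ in range(generations):
        n *= k          # each generation multiplies the population by k
        history.append(n)
    return history

print(neutron_population(2.0, 5))   # supercritical: 1, 2, 4, 8, 16, 32
print(neutron_population(0.5, 5))   # subcritical: population dies away
```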

For a neutron-initiated chain reaction to occur, there must be a critical mass of the element present in a certain space under certain conditions (these conditions slow and conserve neutrons for the reactions). There is one known example of a natural nuclear fission reactor, which was active in two regions of Oklo, Gabon, Africa, over 1.5 billion years ago. Measurements of natural neutrino emission have demonstrated that around half of the heat emanating from the Earth's interior results from radioactive decay. However, it is not known if any of this results from fission chain reactions.
Production of heavy elements

According to theory, as the Universe cooled after the Big Bang it eventually became possible for particles as we know them to exist. The most common particles created in the Big Bang which are still easily observable to us today were protons (hydrogen) and electrons (in equal numbers). Some light nuclei such as deuterium and helium were created by fusion in the first minutes, but most of the heavy elements we see today were created inside stars during a series of fusion stages, such as the proton-proton chain, the CNO cycle and the triple-alpha process. Progressively heavier elements are created during the evolution of a star.

Since the binding energy per nucleon peaks around iron, energy is only released by fusion processes involving nuclei lighter than this. Since the creation of heavier nuclei by fusion costs energy, nature resorts to the process of neutron capture. Neutrons (due to their lack of charge) are readily absorbed by a nucleus. The heavy elements are created by either a slow neutron capture process (the so-called s process) or by the rapid, or r, process. The s process occurs in thermally pulsing stars (called AGB, or asymptotic giant branch, stars) and takes hundreds to thousands of years to reach the heaviest elements of lead and bismuth. The r process is thought to occur in supernova explosions because the conditions of high temperature, high neutron flux and ejected matter are present. These stellar conditions make the successive neutron captures very fast, involving very neutron-rich species which then beta-decay to heavier elements, especially at the so-called waiting points that correspond to more stable nuclides with closed neutron shells (magic numbers). The r process duration is typically in the range of a few seconds.
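The interplay of neutron capture and beta decay in the s process can be sketched in a few lines. The "stability rule" below (beta-decay whenever the neutron excess exceeds a fixed threshold) is a crude stand-in for real nuclear data, used purely to illustrate how slow captures walk a nucleus up the valley of stability toward heavier elements.

```python
# Toy sketch of the s process: a seed nucleus absorbs neutrons one at a
# time; whenever it drifts too far from stability it beta-decays
# (a neutron becomes a proton). The threshold is illustrative only.

def s_process_step(Z, N, max_excess):
    """Capture one neutron, then beta-decay while the neutron excess is too large."""
    N += 1                      # neutron capture: mass number A rises by one
    while N - Z > max_excess:   # crude instability criterion (illustrative)
        Z, N = Z + 1, N - 1     # beta-minus decay: neutron -> proton
    return Z, N

Z, N = 26, 30                   # iron-56 seed, as in AGB stars
for _ in range(10):             # ten slow neutron captures
    Z, N = s_process_step(Z, N, max_excess=8)
print(Z, N, "-> mass number", Z + N)   # 29 37 -> mass number 66
```

Each capture raises the mass number by exactly one, so after ten captures the seed has climbed from A = 56 to A = 66; in this toy model the beta decays have also raised Z from iron (26) to copper (29).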