How stars are formed

A star forms when a large amount of gas (mostly hydrogen) starts to collapse in on itself under its own gravitational attraction. As it contracts, the gas atoms collide with one another more and more often and at greater and greater speeds: the temperature of the gas rises.
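
A standard back-of-envelope way to see why contraction heats the gas is the virial theorem for a self-gravitating cloud. This is a sketch, not part of the original text; the relation between thermal energy and temperature assumes an ideal monatomic gas:

```latex
% Virial theorem for a self-gravitating cloud in slow contraction:
% twice the thermal kinetic energy K balances the gravitational energy U.
\[
  2K + U = 0, \qquad U \sim -\frac{GM^2}{R}
  \quad\Longrightarrow\quad
  K \sim \frac{GM^2}{2R}.
\]
% Since K ~ (3/2) N k_B T for N gas particles, a shrinking radius R
% forces the temperature T upward; per hydrogen atom of mass m_H:
\[
  k_B T \sim \frac{G M m_{\mathrm{H}}}{3R}.
\]
```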

Eventually the gas becomes so hot that when hydrogen atoms collide they no longer bounce apart but instead fuse together to form helium.
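
To put a rough number on the energy released (a standard figure, not given in the original text): fusing four hydrogen nuclei into one helium nucleus converts about 0.7% of their mass into energy via E = mc²:

```latex
% Net reaction of hydrogen burning (via the proton-proton chain):
\[
  4\,{}^{1}\mathrm{H} \;\longrightarrow\; {}^{4}\mathrm{He}
  + 2e^{+} + 2\nu_e + \text{energy}.
\]
% The helium nucleus is about 0.7% lighter than the four hydrogen
% nuclei that formed it; the mass deficit appears as energy:
\[
  \Delta m \approx 0.007 \times 4 m_{\mathrm{H}}, \qquad
  E = \Delta m\, c^{2} \approx 26.7\ \mathrm{MeV}
  \ \text{per helium nucleus formed}.
\]
```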

The heat released in this reaction, which is like a controlled hydrogen-bomb explosion, is what makes the star shine. This additional heat also increases the pressure of the gas until it is sufficient to balance the gravitational attraction, at which point the gas stops contracting.
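
The balance described here is hydrostatic equilibrium. As a sketch (the standard textbook form, not taken from the original text): at every radius inside the star, the outward pressure gradient offsets the inward pull of gravity:

```latex
% Hydrostatic equilibrium: at each radius r inside the star, the
% pressure gradient supports the weight of the gas above it.
\[
  \frac{dP}{dr} = -\frac{G\,m(r)\,\rho(r)}{r^{2}},
\]
% where m(r) is the mass enclosed within radius r and rho(r) is the
% local gas density. Contraction stops once P(r) satisfies this
% relation everywhere.
```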

Stars remain stable like this for a long time, with the heat from the nuclear reactions balancing the gravitational attraction. Eventually, however, the star will run out of its hydrogen and other nuclear fuels.

Paradoxically, the more fuel a star starts off with, the sooner it runs out. This is because the more massive a star is, the hotter it must be to resist its own gravity, and the hotter it is, the faster it uses up its fuel. Our sun probably has enough fuel to burn for another five billion years or more, but more massive stars can use up their fuel in as little as one hundred million years, much less than the age of the universe. When a star runs out of fuel, it starts to cool and contract. What can happen to it then was first understood only in the late 1920s.
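
As a rough illustration of this scaling (a sketch; the exponent 3.5 in the empirical mass-luminosity relation L ∝ M^3.5 and the sun's roughly ten-billion-year main-sequence lifetime are standard assumptions, not figures from the text):

```python
# Rough main-sequence lifetime estimate: fuel ~ M, burn rate ~ L ~ M**3.5,
# so lifetime t ~ M / L ~ M**(-2.5), normalized to the sun (~10 Gyr).

T_SUN_GYR = 10.0   # assumed solar main-sequence lifetime, in Gyr
ETA = 3.5          # assumed mass-luminosity exponent: L ~ M**ETA

def lifetime_gyr(mass_in_solar_masses: float) -> float:
    """Crude lifetime estimate for a main-sequence star of the given mass."""
    return T_SUN_GYR * mass_in_solar_masses ** (1.0 - ETA)

for m in (1.0, 5.0, 10.0):
    print(f"{m:5.1f} solar masses -> ~{lifetime_gyr(m) * 1000:7.0f} million years")
```

For a star of ten solar masses this gives a lifetime of only a few tens of millions of years, consistent with the hundred-million-year figure above.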

Chandrasekhar worked out how big a star could be and still support itself against its own gravity after it had used up all its fuel. The idea is this: when the star becomes small, the matter particles get very close to one another, and so, according to the Pauli exclusion principle, they must have very different velocities. This makes them move away from one another, which tends to make the star expand. A star can therefore maintain a constant radius through a balance between the attraction of gravity and the repulsion arising from the exclusion principle, just as earlier in its life gravity was balanced by heat.
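
A sketch of the quantitative form of this support (the standard result for a cold, non-relativistic degenerate electron gas, not from the original text):

```latex
% Degeneracy pressure of a cold, non-relativistic electron gas: the
% exclusion principle forces electrons to higher speeds as the star is
% compressed, so the pressure stiffens rapidly with density.
\[
  P_{\mathrm{deg}} \;\propto\; \rho^{5/3}.
\]
% Gravity requires only P ~ rho^{4/3} for equilibrium, so this pressure
% grows faster with compression than gravity demands; a sufficiently
% light star can always find a radius at which the two balance.
```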

Chandrasekhar realized, however, that there is a limit to the repulsion the exclusion principle can provide. Relativity limits the maximum difference in the velocities of the matter particles in the star to the speed of light. This means that when the star gets sufficiently dense, the repulsion caused by the exclusion principle would be less than the attraction of gravity. Chandrasekhar calculated that a cold star of more than about one and a half times the mass of the sun would not be able to support itself against its own gravity. (This mass is now known as the Chandrasekhar limit.) A similar discovery was made at about the same time by the Soviet scientist Lev Davidovich Landau.
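
The origin of this limit can be sketched as an order-of-magnitude argument (the symbol μ_e, the number of nucleons per electron, is an assumption of this sketch rather than something in the text):

```latex
% When the electrons become relativistic, degeneracy pressure softens
% to P ~ rho^{4/3}, the same scaling gravity requires, so a cold
% equilibrium exists only below a critical mass:
\[
  M_{\mathrm{Ch}} \;\sim\; \left(\frac{\hbar c}{G}\right)^{3/2}
  \frac{1}{(\mu_e m_{\mathrm{H}})^{2}}
  \;\approx\; 1.4\, M_{\odot}
  \quad (\mu_e \approx 2,\ \text{up to a numerical factor of order unity}).
\]
% Above this mass, no amount of degeneracy pressure can halt the
% collapse of a cold star.
```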
