Some may confuse orbital velocity with gravitational escape.
If you launch straight up it does not matter how fast you go, as long as thrust is greater than gravity, which diminishes with distance. At some point Earth's gravity is negligible and you have escaped.
Achieving orbit and staying in orbit ballistically requires a velocity. I expect a kinetic energy 0.5mv^2 less than the gravitational energy at the desired orbit, which I assume is why orbits decay over time. In orbit you are always falling back to Earth.
Orbits decay over time because of atmospheric friction.
The atmosphere doesn't have a strict upper limit; it just gets thinner and thinner until it becomes indistinguishable from interplanetary space, and it continuously loses (a small amount of) matter at the "top" where some of the molecules achieve escape velocity.
At low earth orbit altitudes there may be a vacuum by our standards, but there's still enough gas left for friction to be a significant factor on the timescale of years.
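Just to get a feel for the magnitude, here's a rough back-of-the-envelope sketch; the density, drag coefficient, cross-section and mass below are assumed ballpark values for an ISS-like object at ~400 km, not authoritative figures:

```python
import math

# Assumed ballpark values for an ISS-like object in a ~400 km orbit
rho = 3e-12      # kg/m^3, atmospheric density (varies strongly with solar activity)
v   = 7660.0     # m/s, orbital speed at that altitude
Cd  = 2.2        # drag coefficient, a typical assumption for satellites
A   = 1000.0     # m^2, assumed effective cross-section
m   = 420_000.0  # kg, assumed mass
mu  = 3.986e14   # m^3/s^2, Earth's gravitational parameter
a   = 6.371e6 + 408e3  # m, orbital radius

F = 0.5 * rho * v**2 * Cd * A          # drag force
da_dt = -2 * F * v * a**2 / (mu * m)   # rate of change of the orbital radius
print(f"drag force  ~ {F:.2f} N")
print(f"orbit decay ~ {-da_dt * 86400:.0f} m per day")
# -> on the order of tens of metres per day, i.e. kilometres per month,
#    which is why uncorrected LEO orbits decay on a timescale of years
```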
Air adds to decay, but if KE is less than gravitational energy I assume the orbit decays regardless of friction.
If kinetic energy is less than gravitational energy, then the satellite does not have orbital velocity.
If the earth were a point mass with no atmosphere, that wouldn't be so bad for the satellite: it would still assume an eccentric orbit with its current position as the apogee. As its trajectory curves inward towards the earth, it gains kinetic energy from gravitational acceleration until the point where it reaches and exceeds orbital velocity and thus starts to move outward again; but since it doesn't have escape velocity, it'll keep orbiting in an ellipse.
Fortunately for us and unfortunately for the satellite, the earth is not a point mass and does have an atmosphere, so such an orbit is likely to bring the satellite into an unhealthy proximity of 0.0 meters with the latter or even the planet's surface.
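To make that concrete, here's a small sketch using the standard vis-viva relation; the ISS-like starting altitude and the "10% too slow" figure are just assumptions picked for illustration:

```python
import math

mu = 3.986e14          # m^3/s^2, Earth's gravitational parameter
R_earth = 6.371e6      # m

r_apogee = R_earth + 408e3                 # start at an ISS-like altitude
v_circular = math.sqrt(mu / r_apogee)      # ~7.67 km/s for a circular orbit there
v = 0.9 * v_circular                       # assume the satellite is 10% too slow

# vis-viva: v^2 = mu * (2/r - 1/a)  ->  semi-major axis of the resulting ellipse
a = 1.0 / (2.0 / r_apogee - v**2 / mu)
r_perigee = 2.0 * a - r_apogee             # the current point becomes the apogee

print(f"circular speed : {v_circular/1000:.2f} km/s")
print(f"perigee height : {(r_perigee - R_earth)/1000:.0f} km")
# At 90% of circular speed the perigee is already well below the surface
# (negative height), so the "orbit" ends at the ground or in the atmosphere.
```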
At the distance of comm satellites gravity is so low that the rate of fall back to Earth is small.
At the distance of satellites in low earth orbit (which is not a precisely defined altitude but a rather arbitrary range), gravity is anywhere between roughly 60% and more than 90% of what it is at the surface. The International Space Station at an altitude of 408 km experiences about 88% of it, a satellite at 2000 km above ground (the conventional upper limit for what we call "LEO") still about 58%, and even a geostationary satellite 2.3%.
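Those percentages fall straight out of the inverse-square law, g(h) = g0 * (R / (R + h))^2; a quick sketch to verify:

```python
# Surface gravity scaled by the inverse-square law: g(h) = g0 * (R / (R + h))^2
R = 6371e3   # m, mean Earth radius
for name, h in [("ISS (408 km)", 408e3),
                ("LEO limit (2000 km)", 2000e3),
                ("geostationary (35786 km)", 35786e3)]:
    fraction = (R / (R + h)) ** 2
    print(f"{name:>24}: {fraction:.1%} of surface gravity")
# -> roughly 88%, 58%, and 2.3%, matching the figures above
```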
Even pretending that this acceleration remained constant during the course of the fall, that is, calculating a free fall over 35,800 km with a constant acceleration equal to that at a geosynchronous orbit, an object dropped from there that is stationary relative to the earth would hit the ground in less than 5 hours.
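If you want to check that figure yourself, the constant-acceleration estimate is just t = sqrt(2d/a); a minimal sketch:

```python
import math

g0 = 9.81                   # m/s^2, surface gravity
a = 0.023 * g0              # constant acceleration, ~2.3% of surface gravity
d = 35_800e3                # m, distance to fall from geostationary altitude
t = math.sqrt(2 * d / a)    # constant-acceleration free-fall time
print(f"{t:.0f} s = {t/3600:.1f} hours")   # -> a bit under 5 hours
```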
Fire a bullet and drop a bullet at the same time and both will hit the ground at the same time.
If I shoot a bullet tangent to the surface it will fall in a parabolic trajectory regardless of velocity or air resistance.
(I took the liberty of grouping these two statements as they make more sense together.)
Actually the two bullets won't hit the ground at the exact same time. Ignoring air friction, a bullet dropped from an altitude of 1.5 meters will hit the ground after 553 milliseconds (you can use this handy calculator). Again ignoring air friction and assuming a perfectly even ground and a muzzle velocity of 1000 m/s (approximately what you get for a high-end rifle), the bullet will have travelled 553 meters horizontally in that time. Due to the earth's curvature, the ground is 2.4 cm "lower" there than it is at your position. There are probably more elegant ways to calculate this, but I used
6371000m / cos(arctan(553m / 6371000m)) - 6371000m -- that is, the difference between your position's distance from the center of the earth and the bullet's distance from the center at 553 m from you if it were unaffected by gravity and flew on a strictly horizontal trajectory, derived via the cosine of the angle at the center of the earth in a triangle with the center, your position, and the target position as its corners.
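In code, that back-of-the-envelope geometry looks something like this (the d^2/(2R) shortcut gives the same answer to first order):

```python
import math

R = 6_371_000.0    # m, Earth radius
d = 553.0          # m, horizontal distance covered in 553 ms at 1000 m/s

theta = math.atan(d / R)                 # angle at the centre of the Earth
drop = R / math.cos(theta) - R           # how much "lower" the ground is there
print(f"trig      : {drop*100:.1f} cm")
print(f"d^2/(2R)  : {d**2 / (2*R) * 100:.1f} cm")   # same thing to first order
```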
It takes the bullet 4.4 milliseconds to traverse that extra distance (of course in reality this is small enough for noise from air drift, misalignment of the rifle by a few arcseconds, and bumps in even the flattest ground to drown the signal). That's how much longer the bullet you shot will take to land. That doesn't seem like a lot, but that's only because the bullet is so fucking slow compared to the speeds we need for orbit. So no, the trajectory's shape is not independent of velocity.
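Putting the numbers together, and letting the muzzle velocity vary to show the point; this keeps the flat-ground, constant-g approximation, so it's only a rough sketch and stops being meaningful well before orbital speeds:

```python
import math

g = 9.81           # m/s^2
R = 6_371_000.0    # m
h = 1.5            # m, muzzle height

t_drop = math.sqrt(2 * h / g)              # ~0.553 s for the dropped bullet
for v in (1000.0, 3000.0, 7000.0):         # m/s, assumed muzzle velocities
    d = v * t_drop                         # horizontal distance in that time
    extra_drop = d**2 / (2 * R)            # how much the ground curves away
    extra_time = extra_drop / (g * t_drop) # extra fall time at ~impact speed
    print(f"{v:5.0f} m/s: ground is {extra_drop*100:6.2f} cm lower, "
          f"bullet lands ~{extra_time*1000:5.1f} ms later")
# At 1000 m/s the difference is a few milliseconds; as you approach orbital
# speeds the flat-earth, constant-g approximation breaks down entirely.
```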
If you could fire the bullet with a muzzle velocity of 7.9 km/s (again ignoring air friction and, unless you're at the pole (or at the equator shooting due east or west), the rotation of the earth), which is the orbital velocity at our distance from the center of the earth, it would hit you in the back of the head after about an hour and a half (5060 seconds) (slightly more when shooting west at the equator as the bullet has to catch up with you, slightly less when shooting east at the equator).
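Both numbers follow from a surface-skimming circular orbit, v = sqrt(g*R) and T = 2*pi*sqrt(R/g); a quick check:

```python
import math

g = 9.81           # m/s^2, surface gravity
R = 6_371_000.0    # m, Earth radius (ignoring the 1.5 m muzzle height)

v_orbit = math.sqrt(g * R)          # circular orbital speed at the surface
T = 2 * math.pi * math.sqrt(R / g)  # orbital period of that bullet
print(f"orbital speed : {v_orbit/1000:.2f} km/s")          # ~7.9 km/s
print(f"period        : {T:.0f} s ~ {T/60:.0f} minutes")   # ~5060 s ~ 84 minutes
```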