What is satellite remote sensing technology

2025-03-26 Update From: SLTechnology News&Howtos


Shulou(Shulou.com)11/24 Report--

Satellite remote sensing refers to the comprehensive set of technical systems for observing the Earth and celestial bodies from space. It spans the platforms and instruments that acquire remote sensing data as well as the systems that receive, process, and analyze the resulting information.

Remote sensing is a method of detecting and monitoring the physical characteristics of an area by measuring its reflected and emitted radiation from a distance, typically from a satellite or aircraft. Remote sensing cameras collect images that help researchers "sense" things about the Earth.

Cameras on satellites and aircraft photograph large areas of the Earth's surface, far more than can be seen while standing on the ground. Sonar systems on ships can image the sea floor without anyone having to travel to the bottom of the sea, and satellite cameras can record changes in ocean temperature.

One specific use of Earth remote sensing imagery is mapping large forest fires from space. Remote sensing satellites can also track clouds to help predict the weather, observe erupting volcanoes, and watch for dust storms. They are likewise used to monitor the growth of cities and changes in farmland or forests over years or decades.

01. How do remote sensing satellites work?

Remote sensing satellites are also called Earth observation satellites or Earth remote sensing satellites. They are used as spy satellites or for environmental monitoring, meteorology, and mapping. The most common type is the Earth-imaging satellite, which takes satellite images analogous to aerial photographs. Some Earth observation satellites perform remote sensing without forming an image, as in GNSS radio occultation.

Remote sensing satellites first appeared on October 4, 1957, with the launch of Sputnik 1, the first artificial satellite. It sent back radio signals that scientists used to study the ionosphere.

NASA launched the first American satellite, Explorer 1, on January 31, 1958. Information from its radiation detector led to the discovery of the Earth's Van Allen radiation belts.

On April 1, 1960, the TIROS-1 spacecraft, launched as part of NASA's Television Infrared Observation Satellite (TIROS) program, sent back the first television footage of weather patterns taken from space.

The instruments carried by most remote sensing satellites work best at relatively low altitudes. Altitudes below 500-600 km are usually avoided, however, because the significant air resistance at such low heights would force the satellites to re-boost their orbits frequently.

The European Space Agency's Earth observation satellites ERS-1, ERS-2, and Envisat, as well as the MetOp spacecraft of EUMETSAT (the European Organisation for the Exploitation of Meteorological Satellites), all operate at about 800 km. The European Space Agency's Proba-1, Proba-2, and SMOS spacecraft observe the Earth from about 700 km. The UAE Earth observation satellites DubaiSat-1 and DubaiSat-2 are also placed in low Earth orbit and provide satellite imagery of various parts of the Earth.

To achieve global coverage from low orbit, a satellite must be in a polar or near-polar orbit. A low orbit has a period of roughly 100 minutes, during which the Earth rotates about 25° around its polar axis, so the ground track shifts westward by about 25° of longitude between successive orbits. Most remote sensing satellites in polar orbit are in sun-synchronous orbits.
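The roughly 100-minute period and the ~25° westward ground-track shift both follow from Kepler's third law. A minimal sketch (the constants and function names here are illustrative, not from the article):

```python
import math

MU_EARTH = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6371e3           # mean Earth radius, m
SIDEREAL_DAY = 86164.1     # one Earth rotation, s

def orbital_period(altitude_m):
    """Circular-orbit period from Kepler's third law: T = 2*pi*sqrt(a^3/mu)."""
    a = R_EARTH + altitude_m
    return 2 * math.pi * math.sqrt(a**3 / MU_EARTH)

def ground_track_shift(altitude_m):
    """Westward longitude shift (degrees) between successive equator crossings:
    the fraction of one Earth rotation elapsed during one orbit, times 360."""
    return orbital_period(altitude_m) / SIDEREAL_DAY * 360.0

period_min = orbital_period(800e3) / 60
shift_deg = ground_track_shift(800e3)
print(f"Period at 800 km: {period_min:.1f} min")     # about 100.7 min
print(f"Ground-track shift: {shift_deg:.1f} deg")    # about 25.3 deg
```

At the 800 km altitude mentioned above this gives a period near 100 minutes and a shift near 25°, matching the figures in the text.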

Sensors or instruments aboard satellites either use the Sun as a source of illumination, measuring reflected energy, or provide their own source of illumination. Sensors that rely on the Sun's natural energy are called passive sensors; sensors that provide their own energy are called active sensors.

Passive sensors include various types of radiometers, which quantitatively measure electromagnetic radiation intensity in selected bands, and spectrometers, which are designed to detect, measure, and analyze the spectral content of reflected electromagnetic radiation. Most passive systems used in remote sensing operate in the visible, infrared, thermal infrared, and microwave portions of the electromagnetic spectrum. They measure land and sea surface temperature, vegetation characteristics, cloud and aerosol properties, and other physical properties.

Most passive sensors cannot penetrate dense cloud cover, which limits their usefulness over regions with frequent dense clouds, such as the tropics.

Active sensors include various types of radar sensors, altimeters, and scatterometers. Most active sensors operate in the microwave band of the electromagnetic spectrum, which lets them penetrate the atmosphere under most conditions. These sensors measure vertical profiles of aerosols, forest structure, precipitation and winds, sea surface topography, and ice.

02. What orbits do remote sensing satellites use?

There are three main types of satellite orbit: polar orbit, non-polar low Earth orbit, and geostationary orbit.

Polar-orbiting satellites travel in an orbital plane inclined nearly 90 degrees to the equatorial plane. This inclination lets the satellite sense the entire Earth, including the polar regions, providing observations of locations that are difficult to reach from the ground. Many polar orbits are also sun-synchronous, meaning the satellite passes over the same place at the same local solar time on every cycle.

A polar orbit has ascending and descending passes. On an ascending pass, the satellite moves from south to north as its path crosses the equator; on a descending pass, it moves from north to south.

Satellites in non-polar low Earth orbit are typically at altitudes below 2,000 kilometers above the Earth's surface. For reference, the International Space Station orbits at about 400 kilometers. These orbits do not provide global coverage; they cover only a limited band of latitudes.

Geostationary satellites match the Earth's rotation, moving at the same rotational speed. To observers on Earth, such a satellite therefore appears fixed in one place. As a result, these satellites capture the same view of the Earth with each observation, providing nearly continuous coverage of a single area.
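The requirement to match the Earth's rotation fixes the altitude of a geostationary orbit: setting the orbital period equal to one sidereal day in Kepler's third law and solving for the semi-major axis yields the familiar ~35,786 km. A quick check (constant names are illustrative):

```python
import math

MU_EARTH = 3.986004418e14    # Earth's gravitational parameter, m^3/s^2
R_EQUATOR = 6378.137e3       # Earth's equatorial radius, m
SIDEREAL_DAY = 86164.1       # one Earth rotation, s

# A geostationary orbit completes one revolution per sidereal day.
# Solve Kepler's third law, T^2 = 4*pi^2*a^3/mu, for the semi-major axis a.
a = (MU_EARTH * SIDEREAL_DAY**2 / (4 * math.pi**2)) ** (1 / 3)
altitude_km = (a - R_EQUATOR) / 1000
print(f"Geostationary altitude: {altitude_km:.0f} km")  # about 35786 km
```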

Data processing, interpretation, and analysis

Remote sensing data acquired by instruments aboard satellites must be processed before most researchers and applied science users can work with them.

Most raw NASA Earth observation satellite data are processed at NASA's Science Investigator-led Processing Systems (SIPS) facilities. NASA Earth science data are archived at discipline-specific Distributed Active Archive Centers (DAACs) and are fully open, with no restrictions on who may use them.

Most data are stored in Hierarchical Data Format (HDF) or Network Common Data Form (NetCDF). Many data tools are available for subsetting, conversion, visualization, and export to various other file formats.
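In practice, subsetting is done with dedicated HDF/NetCDF tooling, but the underlying operation is simple: map a latitude/longitude bounding box to row and column indices of a regular grid and slice. A toy sketch of that idea (the grid layout and function name are hypothetical, not from any particular tool):

```python
# Toy "subsetting" of a regular lat/lon grid, the operation data tools
# perform on gridded HDF/NetCDF variables. Rows run from +90 down to -90
# latitude, columns from -180 to +180 longitude, at `step` degrees each.
def subset(grid, step, lat_range, lon_range):
    """Return the sub-grid covering lat_range=(south, north), lon_range=(west, east)."""
    row0 = int((90 - lat_range[1]) / step)   # northern edge -> first row
    row1 = int((90 - lat_range[0]) / step)   # southern edge -> row past the last
    col0 = int((lon_range[0] + 180) / step)  # western edge -> first column
    col1 = int((lon_range[1] + 180) / step)  # eastern edge -> column past the last
    return [row[col0:col1] for row in grid[row0:row1]]

# A 10-degree global grid (18 rows x 36 columns), each cell holding its row index.
grid = [[r for _ in range(36)] for r in range(18)]
tropics = subset(grid, 10, (-30, 30), (0, 90))
print(len(tropics), len(tropics[0]))  # 6 rows x 9 columns
```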

Once the data are processed, they can be used in a variety of applications, from agriculture to water resources to health and air quality. No single sensor can address every research question in a given application; researchers often must combine multiple sensors and data products, keeping in mind the limitations imposed by their differing spectral, spatial, and temporal resolutions.

Creating satellite images

Many sensors collect data at different spectral wavelengths. For example, band 1 of the OLI on Landsat 8 collects data at 0.433-0.453 microns, while band 1 of MODIS collects data at 0.620-0.670 microns. OLI has nine bands and MODIS has 36, each measuring a different region of the electromagnetic spectrum. Bands can be combined to produce data images that highlight different features of the surface. Data images are often used to distinguish the characteristics of a study area or to delineate a region of interest.

A true color image shows the Earth as the human eye would see it. For a Landsat 8 OLI true color (red, green, blue [RGB]) image, sensor bands 4 (red), 3 (green), and 2 (blue) are combined.
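Real compositing is done with raster tools such as GDAL, but the core step of a bands 4/3/2 composite is just stacking the three band grids pixel by pixel into RGB values. A toy sketch under that assumption (function name and sample reflectances are invented for illustration):

```python
def true_color(red, green, blue):
    """Stack three single-band reflectance grids (values 0..1) into an RGB
    image of 8-bit (r, g, b) pixel tuples, as in a Landsat 8 bands 4/3/2
    true color composite."""
    to8 = lambda v: max(0, min(255, round(v * 255)))  # clamp to one byte
    return [
        [(to8(r), to8(g), to8(b)) for r, g, b in zip(rrow, grow, brow)]
        for rrow, grow, brow in zip(red, green, blue)
    ]

# A 1x2-pixel scene: bright sand next to dark water,
# given as band 4 (red), band 3 (green), band 2 (blue) grids.
rgb = true_color([[0.6, 0.05]], [[0.5, 0.08]], [[0.4, 0.12]])
print(rgb)  # [[(153, 128, 102), (13, 20, 31)]]
```

Swapping which bands feed the red, green, and blue channels is all it takes to produce the false color combinations discussed below.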

Other spectral band combinations serve specific scientific applications such as flood monitoring, urbanization mapping, and vegetation mapping. For example, false color images built from bands M11, I2, and I1 of the Visible Infrared Imaging Radiometer Suite (VIIRS, aboard the Suomi National Polar-orbiting Partnership [Suomi NPP] satellite) are useful for distinguishing burn scars from sparse vegetation or bare soil, as well as for exposing flooded areas.

A fire scar reflects strongly in Landsat band 7, which acquires data in the shortwave infrared range. In a standard true color image, the fire scar is not visible.

In a false color infrared image, the same fire scar stands out clearly in red.

Image interpretation

Once the data are processed into images with various band combinations, those images can support resource management decisions and disaster assessment. This requires correct interpretation of the imagery. Several strategies help with getting started:

Understand scale: an image's spatial resolution determines its scale, and each scale reveals different kinds of features. For example, when tracking a flood, a detailed high-resolution view shows which homes and businesses are surrounded by water. A wider landscape view shows which parts of the county or metropolitan area are flooded, and perhaps where the water is coming from. A broader view shows the entire region: the flooded river system, or the mountains and valleys that control the flow. A hemispheric view shows the movement of the weather systems associated with the flood.

Look for patterns, shapes, and textures: many features can be identified by their pattern or shape. For example, agricultural fields are usually geometric, typically circular or rectangular. Straight lines are usually man-made structures such as roads or canals.

Use color carefully: when distinguishing features by color, it is important to understand the band combination used to create the image. True color or natural color images are created with band combinations that replicate what we would see with our own eyes looking down from space. Water absorbs light, so it usually appears black or blue in true color images; sunlight reflected off the surface can make it look gray or silver. Sediment can make water look browner, while algae can make it look greener. The color of vegetation varies with the seasons: typically bright green in spring and summer, orange, yellow, and tan in autumn, and browner in winter. Bare ground usually shows some shade of brown, depending on the mineral composition of the sediment. Urban areas are usually gray because of the widespread use of concrete. In true color images, ice and snow are white, but so are clouds. When using color to identify objects or features, you must also use the surrounding features to put things in context.

Understanding the area you are observing helps to identify these features. For example, knowing that an area has recently been destroyed by wildfires can help determine why vegetation may look different in remote sensing images.

Quantitative analysis: image classification algorithms make it easier to distinguish different land cover types. Image classification uses the spectral information of individual image pixels. Programs that apply such algorithms can group pixels automatically, in what is called unsupervised classification.

Users can also designate areas of a known land cover type to "train" the program to group pixels; this is called supervised classification. Maps or imagery can also be integrated into a geographic information system (GIS), where each pixel can be compared against other GIS data, such as census data.
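Production classification is done in dedicated remote sensing software, but the unsupervised case can be sketched with a naive k-means clustering of pixel spectra. The function name and the tiny two-band scene below are invented for illustration:

```python
import random

def classify_unsupervised(pixels, k, iters=20, seed=42):
    """Naive k-means: group pixels (tuples of band values) into k spectral
    classes with no training data, as in unsupervised classification."""
    dist2 = lambda p, c: sum((a - b) ** 2 for a, b in zip(p, c))
    centers = random.Random(seed).sample(pixels, k)  # arbitrary initial centers
    for _ in range(iters):
        # Assignment step: put each pixel in the cluster of its nearest center.
        clusters = [[] for _ in range(k)]
        for p in pixels:
            clusters[min(range(k), key=lambda i: dist2(p, centers[i]))].append(p)
        # Update step: move each center to the mean spectrum of its cluster.
        for i, members in enumerate(clusters):
            if members:
                centers[i] = tuple(sum(band) / len(members) for band in zip(*members))
    return [min(range(k), key=lambda i: dist2(p, centers[i])) for p in pixels]

# Four two-band pixels: two "water-like" (dark in both bands) and two
# "vegetation-like" (bright in the second band).
labels = classify_unsupervised(
    [(0.05, 0.04), (0.06, 0.05), (0.12, 0.45), (0.11, 0.48)], k=2
)
print(labels)  # the first two pixels share one class, the last two the other
```

Supervised classification differs only in that the centers would come from user-labeled training areas instead of being discovered by the algorithm.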

Satellites often carry a variety of sensors that measure biogeophysical parameters such as sea surface temperature, nitrogen dioxide and other atmospheric pollutants, winds, aerosols, and biomass. These parameters can be evaluated with statistical and spectral analysis techniques.

This article comes from the WeChat official account New Research (ID: chuxinyanjiu). Author: Tang Shi.
