time:2021-08-23 source:ZJ Lighting
The large-screen splicing industry is flourishing. As distributed splicing systems gained popularity after 2014, manufacturers have spared no effort to introduce all kinds of distributed splicing processors. Some manufacturers new to the field, not fully understanding the concept or constrained by their own technology, classify distributed splicing processors into pure-hardware, embedded, and PC-architecture types. In fact, this classification only misleads customers. True distributed technology is a node-based architecture of independent processing units: it has no distance limitation in deployment and does not depend on a central system. A system built from distributed splicing processors uses not only hardware nodes for encoding and decoding, but may also use embedded servers to control the whole system; the embedded servers mainly keep the control layer virus-free, safe, and stable. The so-called PC architecture is really an extension of the traditional centralized approach, not a distributed splicing system in any fundamental sense, so it cannot be counted as a type of distributed splicing processor.
So, faced with the steady stream of distributed splicing processor manufacturers entering the market, how should users choose? The following criteria can help:
First: Product technology maturity
Distributed splicing processors have only been on the market for a few years, so only products built on early, long-term R&D and technology accumulation mature over time. The high technical threshold lies in the algorithm and coding technology, which deters many manufacturers.
For example, the compression codec technology that distributed splicing processors use for video transmission has gone through three stages:
In the first stage, limited by coding technology, video was transmitted completely uncompressed as a raw bitstream. This requires enormous bandwidth: a single channel of 1080p@60 video needs roughly 3 Gbps.
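The ~3 Gbps figure can be checked with simple arithmetic. A minimal sketch, assuming 8-bit color at 24 bits per pixel (the bit depth is an assumption; the article does not specify it):

```python
# Back-of-the-envelope check of the uncompressed 1080p@60 bit rate.
# Assumes 24 bits per pixel (8-bit RGB / 4:4:4), no blanking overhead.
width, height = 1920, 1080
fps = 60
bits_per_pixel = 24

bandwidth_bps = width * height * fps * bits_per_pixel
print(f"Uncompressed 1080p60: {bandwidth_bps / 1e9:.2f} Gbps")
```

This lands at about 2.99 Gbps, consistent with the "about 3G" figure; real transport formats add blanking and framing overhead on top.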
In the second stage, as application requirements changed, the uncompressed approach could no longer carry large numbers of video streams. This gave rise to hybrid-stream compression, which layers proprietary algorithms on top of traditional compression standards (such as MPEG-2 and MPEG-4). A single channel of 1080p@60 video, once compressed, needs only a few hundred Mbps to transmit, which for a time largely met the demand for system expansion.
In the third stage, with the rise of big data and growing social demands, the need to carry massive numbers of signals over small bandwidth became increasingly urgent. The H.264 international standard compression algorithm is the best answer to this problem: it supports massive signal expansion at low bit rates (down to tens of Mbps or even a few Mbps) while preserving high-definition restoration of the original picture. The strength of the H.264 algorithm has also made its decoding chips widely used in HD satellite set-top boxes (for example in the United States and Europe), and H.264 decoding is now standard in HD set-top-box SoC chips.
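The three stages can be compared side by side. A minimal sketch, where the hybrid and H.264 rates are assumed illustrative midpoints of the ranges the article gives ("a few hundred Mbps", "tens of Mbps or even a few Mbps"), not vendor specifications:

```python
# Rough per-channel bandwidth comparison across the three coding stages.
# Hybrid and H.264 figures are assumed for illustration only.
uncompressed_mbps = 1920 * 1080 * 60 * 24 / 1e6  # ~2986 Mbps raw
hybrid_mbps = 300    # stage 2: "a few hundred Mbps" (assumed midpoint)
h264_mbps = 10       # stage 3: "tens down to a few Mbps" (assumed)

for name, rate in [("stage 1: uncompressed", uncompressed_mbps),
                   ("stage 2: hybrid MPEG-2/4", hybrid_mbps),
                   ("stage 3: H.264", h264_mbps)]:
    ratio = uncompressed_mbps / rate
    print(f"{name:25s} {rate:8.1f} Mbps  (~{ratio:.0f}:1 vs raw)")
```

Even with these rough numbers, the jump from stage 2 to stage 3 is what makes "massive signal access" over ordinary gigabit networks feasible.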
Second: System stability
There is no doubt that system stability is one of the core criteria for judging the quality of a processor. Standards for evaluating stability are broad, but usually cover three aspects:
1. Synchronization. A large-screen splicing processor mainly processes the connected signals and displays them across the wall, so the synchronization requirements for on-screen signals are extremely strict. This is mainly reflected in the video running without stutter; otherwise the image on the screen may tear, seriously degrading usability.
2. Real-time performance. The core application of a large-screen splicing system is to process video signals as they arrive and display them on screen in real time, providing a timely, fast, and effective basis for command and decision-making. This is extremely important for rapid response and loss recovery in major incidents. In a major disaster such as an earthquake, fire, or landslide, the monitoring side can grasp the situation the moment it occurs and direct relief efforts in real time; here, time means the safety of lives and property.
3. No system-wide crashes. For a large-screen system, the whole system failing because a few modules fail is catastrophic for users. A true distributed system processes each signal on its own splicing node, which handles only its own signal and has no effect on the other nodes; when a single node fails, the rest of the system is unaffected and keeps running normally. Simply replacing the damaged node allows the affected signal to be scheduled and processed again immediately.
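The failure-isolation property described above can be sketched in a few lines. This is a minimal illustration, not any vendor's implementation; the node class, signal names, and the simulated failure are all hypothetical:

```python
# Sketch of per-node failure isolation: each node decodes only its own
# signal, so a failure in one node leaves the other nodes running.

class DecoderNode:
    """Hypothetical decoding node handling exactly one input signal."""

    def __init__(self, signal: str, healthy: bool = True):
        self.signal = signal
        self.healthy = healthy  # simulated hardware state

    def decode_frame(self) -> str:
        if not self.healthy:
            raise RuntimeError(f"node for {self.signal} failed")
        return f"frame from {self.signal}"

nodes = [DecoderNode("camera-1"),
         DecoderNode("camera-2", healthy=False),  # simulated dead node
         DecoderNode("camera-3")]

# The wall keeps rendering: each failure is contained to its own node.
for node in nodes:
    try:
        print(node.decode_frame())
    except RuntimeError as err:
        print(f"isolated failure: {err}")
```

The point of the sketch is structural: because no node's output path runs through another node, an exception (or, in hardware, a dead unit) cannot cascade, which is exactly the contrast with a centralized processor where one card can take down the chassis.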
Third: System compatibility
The equipment connected to a large-screen splicing system is often diverse, so a good distributed splicing processor must be open by design: it should support access for almost all signal types and also interconnect with other service platforms. Otherwise, compatibility problems can easily bring the system down and negate the advantage of a distributed splicing processor over a traditional one.
For example, cameras are the most numerous devices connected to a large-screen splicing system, sometimes numbering in the thousands. In many deployed systems, however, signal expansion and equipment replacement over different periods mean the camera brands vary widely, and the private protocols of different manufacturers' cameras cause mutual incompatibility and undermine the stability of the whole system.
As another example, system upgrades may require connecting to external platforms, which demands excellent compatibility from the large-screen splicing system; otherwise it is difficult to keep the whole system running stably. Therefore, when choosing a distributed splicing processor, customers should first find out whether it is genuinely open: whether it can connect to different platforms and remain compatible with different devices. These are the standards by which to measure it.