A glossary of terms in the field of digital surveillance

A

AAC -

AAC stands for Advanced Audio Coding. It is an audio standard defined by the International Organization for Standardization (ISO) as part of the MPEG specifications, and it is the audio format specified by i-mode. Compared with MP3, AAC has clear advantages in both sound quality and compression efficiency. AAC is a core audio specification of MPEG-4 and 3GPP. It can reproduce high-quality stereo sound approaching CD quality while achieving high compression ratios, so AAC files take far less storage space than MP3 files of comparable quality.

AVS -

AVS (Audio Video coding Standard) is the English abbreviation of a digital audio and video codec technology standard. The AVS standard comprises four technical standards (systems, video, audio, and digital rights management) plus supporting standards such as conformance testing. Its core purpose is to compress digital audio and video data to a fraction of a few tens, or even to less than one hundredth, of the original size, addressing the problem of massive audio/video data volumes. Such compression is a prerequisite for digital transmission, storage, and playback, which is why AVS has become a common foundational standard for the digital audio and video industry.
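The scale of the "less than one hundredth" claim can be illustrated with rough arithmetic. The resolution, frame rate, and 100:1 ratio below are illustrative assumptions, not values taken from the AVS specification:

```python
# Rough arithmetic behind codec compression ratios. All figures here are
# illustrative assumptions (8-bit RGB, 1080p at 25 fps, a 100:1 ratio).

def raw_video_rate(width, height, fps, bytes_per_pixel=3):
    """Uncompressed data rate in bytes per second."""
    return width * height * bytes_per_pixel * fps

raw = raw_video_rate(1920, 1080, 25)   # ~155.5 million bytes per second
compressed = raw / 100                 # at an assumed 100:1 compression ratio
print(f"raw: {raw / 1e6:.1f} MB/s, compressed: {compressed / 1e6:.2f} MB/s")
```

Uncompressed 1080p video at 25 fps needs on the order of 155 MB/s; at 100:1 that drops to roughly 1.6 MB/s, which is what makes transmission and storage practical.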

C

Cam -

Cam, short for webcam (also called a home cam), refers to a camera that can be attached to a computer. With a camera, users can transmit live video in real time, playing video through the camera or broadcasting images live. Sites that broadcast camera video usually launch the live stream with a Java applet, and many websites now offer live video. This kind of video broadcast was initially used mainly for industrial monitoring or security; today, video sites offer many kinds of live streams, including adult content. In short, cameras make communication between computer users more convenient and bring them closer together.

CDN -

CDN stands for Content Delivery Network. Its principle is to publish a website's content to the cache servers closest to its users, so that most visitors fetch the content they need from a nearby cache server. This relieves Internet congestion and improves the response speed of the website, as if multiple clones of the site were distributed around the network.
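The "closest server" idea can be sketched in a few lines. This is a toy illustration of the routing decision only; the node names and latency figures are made up, and real CDNs use DNS-based or anycast routing rather than a client-side lookup:

```python
# Toy sketch of the CDN routing idea: send each client to the cache node
# with the lowest measured round-trip latency. Node names and latencies
# below are hypothetical.

def pick_cache_node(latencies_ms):
    """Return the cache node with the smallest round-trip latency."""
    return min(latencies_ms, key=latencies_ms.get)

client_latencies = {"edge-tokyo": 12, "edge-frankfurt": 180, "edge-virginia": 95}
print(pick_cache_node(client_latencies))  # a Tokyo client gets the Tokyo edge
```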

D

DSP chip -

A DSP chip, also called a digital signal processor, is a microprocessor with a specialized architecture. DSP chips use the Harvard architecture, which separates program and data memory. They have dedicated hardware multipliers, extensive pipelining, and special DSP instructions, and they can implement various digital signal processing algorithms quickly. To meet the requirements of digital signal processing, DSP chips generally have the following features:

(1) One multiplication and one addition can be completed in a single instruction cycle.

(2) Program and data spaces are separate, so instructions and data can be fetched simultaneously.

(3) Fast on-chip RAM, usually organized as two blocks that can be accessed simultaneously over independent data buses.

(4) Hardware support for low-overhead or zero-overhead loops and branches.

(5) Fast interrupt handling and hardware I/O support.
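Feature (1), the single-cycle multiply-accumulate (MAC), is what makes DSPs fast at filtering. The direct-form FIR filter below is a plain Python sketch of the computation, not DSP code; on a DSP, each tap's multiply and add in the inner loop would take one instruction cycle:

```python
# A direct-form FIR filter, the classic workload for the DSP
# multiply-accumulate (MAC) operation described above.

def fir_filter(samples, coeffs):
    """Convolve input samples with filter coefficients (direct-form FIR)."""
    out = []
    for n in range(len(samples)):
        acc = 0.0
        for k, c in enumerate(coeffs):
            if n - k >= 0:
                acc += c * samples[n - k]  # one multiply, one add per tap
        out.append(acc)
    return out

# Feeding in a unit impulse returns the filter coefficients themselves.
print(fir_filter([1, 0, 0, 0], [0.5, 0.3, 0.2]))
```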

DLNA Standards Working Group -

DLNA was established on June 24, 2003; its predecessor was the DHWG (Digital Home Working Group). It currently has more than 700 member companies worldwide. Its core members include 19 companies such as Sony, Canon, Samsung, LG, TCL (Thomson), Nokia, Panasonic, Siemens, IBM, Hewlett-Packard, and Intel; members from China include Founder, Lenovo, Hisense, Tongfang, and Huawei.

The organization aims to establish an interoperable platform based on open industry standards and to set technical design rules that companies can follow when developing digital-home products. Its goals are to develop guidelines and specifications for media formats and for transmission and protocol interoperability based on open industry standards, to liaise with other industry standardization organizations, to provide interoperability testing, and to develop and implement digital-home market plans.

DRM -

DRM stands for Digital Rights Management. It is generally translated as "digital copyright protection," although many experts consider "digital rights protection" the more appropriate rendering.

DRM technology determines whether a user has the right to use content by encrypting the digital content and attaching rights information, ensuring that the content is available only to users who have been authorized. The basic principle of DRM is to be simple, flexible, and open.
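The principle (encrypted content plus a license check) can be sketched as below. This is a toy stand-in, not a real DRM scheme: the XOR "cipher," the license store, and the user names are all hypothetical, and production DRM uses real cryptography and secure license servers:

```python
# Toy sketch of the DRM principle: content is stored encrypted, and only
# users holding a license (key) can recover it. The XOR "cipher" and the
# license dictionary are illustrative stand-ins, not a real DRM scheme.

def xor_crypt(data: bytes, key: bytes) -> bytes:
    """Symmetric toy cipher: XOR each byte with the repeating key."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

licenses = {"alice": b"secret-key"}  # keys issued to authorized users only

def play(user, encrypted):
    key = licenses.get(user)
    if key is None:
        return None                  # no license, no playback
    return xor_crypt(encrypted, key)

locked = xor_crypt(b"movie data", licenses["alice"])
print(play("alice", locked))    # authorized user recovers the content
print(play("mallory", locked))  # unauthorized user gets nothing
```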

The biggest challenge for DRM in the future lies in interoperability.

F

Composite video -

Composite video, also known as baseband video or RCA video, is the conventional format for transmitting a National Television Systems Committee (NTSC) television signal as an analog waveform. Composite video carries chrominance (hue and saturation) and luminance (brightness) information, synchronized by blanking pulses and transmitted as a single signal.

In fast-scan NTSC television, the composite video signal amplitude-modulates a VHF or UHF carrier, producing a signal approximately 6 MHz wide. Some closed-circuit television (CCTV) systems transmit composite video over coaxial cable at short range. Some DVD players and video cassette recorders (VCRs) provide composite video input and output through phono sockets, also called RCA connectors.

In composite video, interference between the chrominance and luminance information is unavoidable, especially when the signal is weak. This is why distant NTSC broadcasts on VHF or UHF, received with old whip antennas, "bunny-ear" antennas, or rooftop antennas, often show false or flickering colors.
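The luminance component mentioned above is a weighted sum of the red, green, and blue channels; the weights below are the standard NTSC (ITU-R BT.601) luma coefficients:

```python
# Luminance (the "brightness" component carried in composite video) as a
# weighted sum of R, G, B, using the standard NTSC/BT.601 luma weights.

def ntsc_luma(r, g, b):
    """Luma Y' from gamma-corrected RGB values in [0, 255]."""
    return 0.299 * r + 0.587 * g + 0.114 * b

print(ntsc_luma(255, 255, 255))  # white: the weights sum to 1, so ~255
print(ntsc_luma(255, 0, 0))      # pure red contributes only ~30% of full luma
```

Note that green dominates the sum, reflecting the eye's greater sensitivity to green light.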

H

H.264 -

In 1995, after completing H.263, ITU-T's Video Coding Experts Group (VCEG) set two new goals: a short-term goal of adding new features to H.263 (the result was H.263 version 2), and a long-term goal of developing a new low-bit-rate standard. The long-term effort produced the draft standard H.26L, which provides better video compression than H.263.

In 2001, ISO's Motion Picture Experts Group (MPEG) recognized the advanced nature of H.26L and established the Joint Video Team (JVT), comprising experts from both MPEG and VCEG. The JVT's main task was to develop the H.26L draft into a complete standard. The result was two identical standards: ISO MPEG-4 Part 10 and ITU-T H.264, also named Advanced Video Coding (AVC). (Overview of H.264 from vcodex.)

The VSS H.264 codec is a free download; it is a DirectShow-compatible filter and claims to be fast.

H.264/AVC Software Coordination provides source code, including an encoder and a decoder. Sentivision, apparently a small Japanese company, has implemented an H.264 decoder on TI's DM642 DSP and, based on it, a Linux-based set-top box (STB); the STB also supports WMV9 and MPEG-4 SP/ASP, and the company also offers a comparison of H.264 and MPEG-4.

Moonlight offers an H.264 player, encoder, and SDK.

H.26L -

Two formal organizations develop video coding standards: ITU-T and ISO/IEC JTC1. ITU-T video coding standards are called recommendations and are designated H.26x (for example, H.261, H.262, H.263, and H.26L). ISO/IEC standards are designated MPEG-x (for example, MPEG-1, MPEG-2, and MPEG-4).

The ITU-T recommendations are designed for real-time video communications applications such as videoconferencing and video telephony, while the MPEG standards mainly target video storage (DVD), broadcast video (broadcast TV), and video streaming (for example, online video, DSL video, and wireless video applications). The two standards committees usually work independently; the only exception was their cooperation in developing the H.262/MPEG-2 standard.

The purpose of the H.26L project was to develop a simple, straightforward, high-performance video coding standard built on a common set of modules, following a "back-to-basics" approach. Development of H.26L was initiated by the ITU-T Video Coding Experts Group (VCEG) and began in 1997. In performance terms, H.26L surpassed all existing video coding standards; by the end of 2001, H.26L-based software was found to deliver video quality comparable to the best existing MPEG-4-based software. ISO/IEC MPEG and ITU-T VCEG therefore combined to form the Joint Video Team (JVT) and took over the H.26L project. The JVT aimed to establish a single video coding standard that would simultaneously become a new member of the MPEG-4 family and of the ITU-T recommendation family (that is, Part 10 of MPEG-4 and H.264 in ITU-T).

J

Baseband video signal -

Composite video signals, also known as baseband video signals or RCA video signals, carry image data in NTSC television signals. A composite video signal contains chrominance (hue and saturation) and luminance information, combined with synchronization information and blanking pulses into a single signal. In fast-scan NTSC television, VHF and UHF carriers are amplitude-modulated by the composite video signal, producing a signal with about 6 MHz of bandwidth. Some closed-circuit television systems transmit composite video signals over short coaxial cables, and some DVD players and video cassette recorders (VCRs) provide composite video input and output through shielded-cable sockets, i.e., RCA connectors. In composite video signals, interference between the chrominance and luminance information is unavoidable; the weaker the signal, the more severe the interference.

JVT -

Its full English name is Joint Video Team. It was established in December 2001 in Pattaya, Thailand, and is composed of video coding experts from two international standardization organizations, ITU-T and ISO. The goal of the JVT was to develop a new video coding standard achieving a high compression ratio, high image quality, and good network adaptability. The JVT's work has been adopted by ITU-T, where the new video compression coding standard is called H.264; the same standard has also been adopted by ISO, where it is called AVC (Advanced Video Coding) and forms Part 10 of MPEG-4.
