Wednesday, May 1, 2013

10 GbE enables real-time remote desktops

Virtualization trends in commercial computing offer benefits for cost, reliability, and security, but pose a challenge for military operators who need to visualize lossless imagery in real time. 10 GbE technology enables a standard zero client solution for viewing pixel-perfect C4ISR sensor and graphics information with near-zero interactive latency.


For C4ISR systems, ready access to and sharing of visual information at any operator position can increase situational awareness and mission effectiveness. Operators utilize multiple information sources including computers and camera feeds, as well as high-fidelity radar and sonar imagery. Deterministic real-time interaction with remote computers and sensors is required to shorten decision loops and enable rapid actions.

A zero client represents the smallest hardware footprint available for manned positions in a distributed computing environment. Zero clients provide user access to remote computers through a networked remote desktop connection or virtual desktop infrastructure. Utilizing a 10 GbE media network for interconnecting multiple computers, sensors, and clients provides the real-time performance and image quality required for critical visualization operations. The cost of deploying a 10 GbE infrastructure is falling rapidly, and 10G/40G has become the baseline for data center server interconnect. Additionally, deploying common multifunction crew-station equipment at all operator positions brings system-level cost and logistics benefits.

The following discussion examines the evolution to thinner clients and the path to a real-time service-oriented architecture, in addition to looking at zero client benefits and applications.


Evolution to thinner clients
For military C4ISR, capabilities provided by legacy stovepipe implementations are being consolidated into networked multifunction systems of systems. To accomplish this, open standards and rapidly advancing technologies for service-oriented architectures are being leveraged (Figure 1). For crew-station equipment, this drives an evolution from dedicated high-power workstations toward thinner client equipment at user locations. Computing equipment is being consolidated away from the operators into one or more data centers. This leaves the crew station with a remote connection to system resources, but does not ease the requirement for high-performance access to visual information. 10 GbE provides the client/server connection performance necessary for real-time remote communication.


Figure 1: Client/server evolution: Increasing communications bandwidth enables more service-oriented computing and “thinner” clients.




Workstations at operator positions normally run software applications locally and provide dedicated resources for data and graphics processing. Server-based data processing and networked sensor distribution systems have moved much of the application processing away from the operator. This can simplify the job of system administration and maintenance and enables multiple users to access the same capabilities. However, much of the processing for presenting images to operators can be unique to the individual needs for varying roles at each position.
Thin clients can be utilized to provide dedicated graphics and video processing horsepower for user-specific visualization operations such as windowing, rendering, and mixing multiple data and sensor sources. Dedicated local graphics processing power can be important for critical real-time operations or for interfacing to servers without high-performance graphics capabilities. This makes a thin “networked visualization client” a flexible option for multifunction crew stations that must interface with both legacy and newer service-oriented systems.
For commercial computing systems, a major push is underway to move high-performance graphics capability into the data center servers. This can be implemented via dedicated workstations for each crew station, virtualized compute engines with dedicated graphics for each crew station, or completely virtualized environments with networked image distribution. Virtualization provides a means to share CPU and GPU compute cycles between multiple users, gaining efficiency from higher utilization of system hardware resources. However, for mission-critical C4ISR systems, a deterministic Quality-of-Service level for performance, reliability, and security must be maintained.
For systems with both computing and graphics processing located away from the operator, zero clients provide network-attached displays with audio and user input devices (keyboard, mouse, and touch screen). Minimizing size, weight, and power at the operator position brings many benefits, but performance depends on the remote visualization processing capabilities and the communication channel. To match workstation performance, a consistent human-computer interaction latency of less than 50 ms must be provided.
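To illustrate what a sub-50 ms interaction target implies, the round trip from user input to updated pixels can be broken into stages. The following Python sketch is purely illustrative: the stage names and per-stage timings are assumptions chosen to show how quickly the budget is consumed; only the 50 ms target comes from the discussion above.

# Hypothetical round-trip latency budget for a remote desktop interaction (milliseconds).
# All stage values are illustrative assumptions; only the 50 ms target is from the text.
budget_ms = 50.0

stages_ms = {
    "USB input capture and network transit": 2.0,
    "application response / render on remote host": 16.7,  # roughly one frame period at 60 Hz
    "frame grab and lossless encode": 5.0,
    "10 GbE transfer of one frame": 5.5,                    # ~55 Mbit frame sent near line rate
    "client decode and display scan-out": 16.7,             # roughly one frame period at 60 Hz
}

total_ms = sum(stages_ms.values())
print(f"Total: {total_ms:.1f} ms of a {budget_ms:.0f} ms budget "
      f"({budget_ms - total_ms:.1f} ms margin)")
for stage, ms in stages_ms.items():
    print(f"  {stage}: {ms:.1f} ms")

Even with these optimistic assumptions, the two display refresh periods alone consume roughly two thirds of the budget, which is why a deterministic, low-overhead transport such as a dedicated 10 GbE media network matters for interactive use.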
Path to a real-time service-oriented architecture
System architects need a graceful technology insertion path that leverages the benefits of thinner clients (Figure 2). One approach for centralizing computing equipment while maintaining performance is to simply move the workstations to the data center and extend the interfaces to the display and input devices. This maintains dedicated resources for critical operations. Video and device interface extension can be accomplished via extenders or switch matrices to provide connections between operators and computers.


Figure 2: Crew-station evolution to a service-oriented architecture




A more flexible approach is to utilize a standard network to support highly configurable access to all workstation resources from any operator position. With this approach, any user can connect to any image source and user screens can be shared with collaborative remote displays or other users. This also enables growth to a service-oriented “cloud” architecture that follows the trend for general-purpose IT and data processing systems. However, commercial IT products do not always meet the performance, reliability, security, or logistics requirements for mission-critical C4ISR systems.
To leverage this computing trend for real-time applications, a standard 10 GbE media network can be utilized to connect multiple zero clients to multiple remote graphics and sensor sources. Lossless distribution is supported for high-quality text, dynamic 2D/3D graphics, HD video, radar, and sonar imagery. Compositing multiple sources onto a single screen can be performed at the zero client or by networked video processing services. Near-zero latency interaction and video distribution are now possible and support deterministic performance and real-time dynamic visualization at any operator position.
One full-resolution (1,920 x 1,200) lossless channel at 60 Hz with 24-bit color requires 3.3 Gbps of bandwidth. Therefore, one 10 GbE connection can support a dual-head crew station at full frame rate with audio and USB support. However, many visual applications require no more than a 30 Hz update rate (including 1080p/30 HD full-motion video), which reduces the bandwidth to 1.7 Gbps per channel. This enables triple-head crew stations with audio and USB support over a single 10 GbE connection. Dual Ethernet ports at the zero client can also be provided to support more video channels, higher frame rates, and/or redundant connections.
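As a quick check on these figures, the per-channel bandwidth follows directly from resolution, color depth, and refresh rate. The short Python sketch below reproduces the numbers quoted above; treating the remaining link capacity as headroom for audio, USB, and protocol overhead is an assumption made here for illustration.

def channel_bandwidth_gbps(width, height, bits_per_pixel=24, refresh_hz=60):
    """Raw bandwidth of one uncompressed (lossless) video channel, in Gbps."""
    return width * height * bits_per_pixel * refresh_hz / 1e9

# Figures quoted above: 1,920 x 1,200 resolution, 24-bit color
full_rate = channel_bandwidth_gbps(1920, 1200, refresh_hz=60)   # ~3.3 Gbps
half_rate = channel_bandwidth_gbps(1920, 1200, refresh_hz=30)   # ~1.7 Gbps

link_gbps = 10.0                   # one 10 GbE connection
dual_head_60hz = 2 * full_rate     # ~6.6 Gbps: dual-head at full frame rate
triple_head_30hz = 3 * half_rate   # ~5.0 Gbps: triple-head at 30 Hz

print(f"60 Hz channel: {full_rate:.1f} Gbps, dual-head total {dual_head_60hz:.1f} Gbps")
print(f"30 Hz channel: {half_rate:.1f} Gbps, triple-head total {triple_head_30hz:.1f} Gbps")
print(f"Headroom on a {link_gbps:.0f} Gbps link: "
      f"{link_gbps - dual_head_60hz:.1f} Gbps / {link_gbps - triple_head_30hz:.1f} Gbps")

The leftover capacity in both configurations is what carries audio, USB traffic, and packetization overhead, and it also explains why dual Ethernet ports are attractive when more channels or higher frame rates are needed.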
Zero client benefits
Compared to workstations, zero clients provide several benefits, including lower total cost of ownership (TCO); reduced size, weight, and power (SWaP); higher system availability; and greater system security and agility.
Reduced total cost of ownership
Zero clients provide the smallest, simplest, and most maintainable equipment available for the operator position. This means lower initial investment costs as well as lower operating and maintenance costs throughout the system life cycle. System modularity and standard interfaces support seamless technology refresh as new computing and display equipment becomes available. 10 GbE has been widely adopted for data centers and standard component costs are declining rapidly. When compared to legacy stovepipe systems, networked systems also greatly reduce the amount of dedicated cabling required.
Reduced size, weight, and power
Only video, audio, and USB encoding/decoding functions are required with a zero client. These are packaged as small dongles or integrated into the display. Small packaging enables new options for lightweight operator consoles with increased ergonomics, as well as reducing noise and the burden on cooling systems for manned areas.
High system availability
System uptime and reliability benefit from consolidating all computing elements into managed data centers. Common equipment at multiple operator positions and redundant network connections support rapid recovery from computer, client, or network equipment failures.
High system security
Security risks are reduced through centralized administration and access authentication at the data center. Additionally, stateless zero client equipment outside the data center and encrypted communications between all components assure system confidentiality and integrity.
System agility
Systems using common crew-station equipment can be reconfigured by software for different mission roles and objectives. Additional clients can be added quickly to extend the system. Also, as computing systems evolve with new virtual desktop infrastructures, today’s investment in zero client equipment is preserved through standard interfaces for video, audio, and user input devices including DVI, PC audio, and USB.
Applications of a zero client
In addition to these benefits, the technology's agility enables a range of applications using common equipment. For example, remote crew stations can now be smaller, lighter, and more versatile, and operator equipment can be placed in locations that were not previously practical. Noisy, heat-generating computing equipment can be moved away from operator positions.
Another application highly suited to zero client utilization is the multifunction crew station. Common crew-station equipment can be used to access multiple computers and sensor sources under secure software control. This supports the capability for dynamic access to multiple systems from a single location. Systems can be rapidly reconfigured for different mission objectives, operating roles, or failure recovery.
Collaborative and remote displays also benefit from zero client usage. Unmanned displays can be attached to the network for sharing real-time visual information for dissemination and collaboration. Large area displays for several viewers can receive multiple feeds with full performance. Additionally, selected sources can be compressed and transmitted through secure routers for wider area distribution.
Using zero client technology for networked multifunction crew stations enables the integration of legacy capabilities into a consolidated operating environment as well as the development of new concepts of operation. One example of this is Barco’s zero client technology, which brings the benefits of state-of-the-art computing architectures into mission-critical C4ISR systems involving advanced visualization.
Mission-critical solution
Leveraging commercial computing trends and standards provides significant cost and capability benefits. However, the level of real-time performance, mission assurance, and information assurance required for mission-critical C4ISR systems must be achieved. Zero client technology enabled by 10 GbE provides the necessary pixel-perfect viewing of graphics and sensor information for these demanding applications.


