MILITARY TECHNOLOGY (MILTECH) is the world's leading international tri-service defence monthly magazine in the English language. MILITARY TECHNOLOGY is "Required Reading for Defence Professionals". Follow us on Twitter: MILTECH1

30 November 2015

I/ITSEC 2015: AEgis Features Visual-Eyes 3D App and Technology Demos

AEgis Technologies showcases its array of training and simulation capabilities at booth 1901, including:

  • AEgis shows an Augmented Reality (AR) and Virtual Reality (VR) demonstration designed and created specifically for Google Cardboard technology. The Visual-Eyes app can be downloaded from the Apple or Android app stores for an advance look at the Virtual Training Garage and Interactive Augmented Reality Engagements. With Visual-Eyes, users can immerse themselves in a 3D world by entering a virtual training garage and navigating 360° around three military vehicles: the AH-64, F-15E, or M3A3. The augmented reality experience allows users to defend their position by firing at either a tank or an unmanned air system (drone). The Visual-Eyes app was created to give users a tangible sense of how virtual and augmented reality are both accessible and ideal for education, maintenance, real-time training and, in all honesty, just plain fun.
  • The AEgis booth also showcases MONARCH, a flexible powerful interoperability solution, as part of the Operation Blended Warrior (OBW) exercise. MONARCH is a programmable message translation and routing application that enables communications between simulation and other systems in training and operational environments. There will also be live demos on the new VAMPIRE® Pro trainer for UAS; Synthetic Environment Core (SE Core) Common Moving 3D Models; and Geospatial Programs (including Games for Training, Interactive Multimedia Instruction, Terrain Database Development, and Real-time 3D Models).
  • For one week only, AEgis Elements is offering Free 3D Model Downloads during the I/ITSEC conference dates of 30 November to 3 December. There are hundreds of high-fidelity 3D models to choose from, with applications that include gaming, simulators, demos, STEM-based education programs and more. Free model downloads plus custom 2D and 3D quotes are also available. The I/ITSEC Sale also includes a bundled package featuring 50 models for $5,000 for one week only.
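As a rough illustration of the translate-and-route pattern a tool like MONARCH implements, the sketch below defines a minimal gateway that converts messages between simulation protocols and forwards them to registered sinks. The `Gateway` class, protocol names, and message shapes are invented for this example and do not reflect MONARCH's actual interfaces.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple


@dataclass
class Message:
    protocol: str  # e.g. "DIS" or "HLA" (illustrative labels only)
    payload: dict


class Gateway:
    """Minimal translate-and-route gateway between simulation protocols."""

    def __init__(self) -> None:
        # (source protocol, destination protocol) -> translation function
        self.translators: Dict[Tuple[str, str], Callable[[Message], Message]] = {}
        # destination protocol -> list of delivery sinks
        self.routes: Dict[str, List[Callable[[Message], None]]] = {}

    def add_translator(self, src: str, dst: str,
                       fn: Callable[[Message], Message]) -> None:
        self.translators[(src, dst)] = fn

    def add_route(self, protocol: str, sink: Callable[[Message], None]) -> None:
        self.routes.setdefault(protocol, []).append(sink)

    def forward(self, msg: Message, dst_protocol: str) -> None:
        # Translate if a translator is registered, then deliver to all sinks.
        fn = self.translators.get((msg.protocol, dst_protocol))
        out = fn(msg) if fn else msg
        for sink in self.routes.get(dst_protocol, []):
            sink(out)


# Usage: translate a DIS-style entity update into an HLA-style one and route it.
gw = Gateway()
gw.add_translator("DIS", "HLA",
                  lambda m: Message("HLA", {"entity": m.payload["id"]}))
received: List[Message] = []
gw.add_route("HLA", received.append)
gw.forward(Message("DIS", {"id": 42}), "HLA")
```

The same gateway object could hold many translator pairs and sinks, which is the essence of making dissimilar training and operational systems interoperate.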


I/ITSEC 2015: TerraSim Unveils Xtract for VBS3

At I/ITSEC 2015 (co-located with Bohemia Interactive Simulations [BISim] at booth 2248), TerraSim, a BISim company, showcases new and improved functionality in TerraTools 5.1, demonstrates upcoming product innovations, and unveils a new service for generating correlated terrain content from existing VBS environments.

TerraTools 5.1, the latest version of TerraSim's flagship terrain database generation software, contains over 300 new features, improvements, and bug fixes. Major highlights of TerraTools 5.1 include: New CDB Import plug-in; improved Batch Mode Manager custom cell selection and logging; improved VBS3 support for trench generation, multimap optimisations, and terrain packing speed improvements; and improved exporter support for Unity, JCATS, CTDB, X-Plane, and Steel Beasts Pro. TerraTools 5.1 features a new plug-in that imports geospatial data and model content from the Common Database (CDB) format into TerraTools. This content can then be processed and exported to one or more TerraTools supported terrain formats.

Key features include:
  • Supports CDB versions 2.1, 3.0, and 3.2
  • Preserves geospatial source data and model attribution during import
  • Supports sub-region import based on user-defined bounds
  • Supports static CDB data layer import:
      • Elevation
      • Imagery
      • Raster and vector surface materials
      • Geo-specific and geo-typical models
      • Road, railroad, powerline, and hydrography networks
  • Supports sublayer import

Trench Generation Technology for VBS3

TerraTools 5.1 includes new automated trench placement technology that uses trench centerline data to rapidly and automatically place 3D trench models. A collection of trench component models representing both man-made and earthen trench types has been added to the TerraTools model library for placement. These trench models are fully functional in VBS3 and include destructible levels of detail. For added customization, users can import their own trench models into TerraTools for placement and export to VBS3.
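The centerline-driven idea can be sketched generically: given a trench centerline polyline and a fixed segment-model length, sample evenly spaced poses (position plus heading) at which segment models would be dropped end-to-end. The function name, interval handling, and coordinates below are invented for illustration and are not TerraTools' actual algorithm.

```python
import math
from typing import List, Tuple


def place_along_centerline(centerline: List[Tuple[float, float]],
                           segment_length: float) -> List[Tuple[float, float, float]]:
    """Yield (x, y, heading_deg) poses spaced segment_length apart along a polyline."""
    poses = []
    remaining = 0.0  # distance carried over from the previous polyline edge
    for (x0, y0), (x1, y1) in zip(centerline, centerline[1:]):
        dx, dy = x1 - x0, y1 - y0
        length = math.hypot(dx, dy)
        heading = math.degrees(math.atan2(dy, dx))
        d = remaining
        while d < length:
            t = d / length
            poses.append((x0 + t * dx, y0 + t * dy, heading))
            d += segment_length
        remaining = d - length
    return poses


# Usage: an L-shaped trench centerline sampled every 2.5 m.
poses = place_along_centerline([(0, 0), (10, 0), (10, 5)], 2.5)
```

Each pose could then index into a library of trench segment models (straight runs, corners, end caps) to assemble the full 3D trench.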

TerraSim furthermore announces Building Designer, an upcoming stand-alone application that simplifies geo-specific building model generation. Building Designer will support common 2D and 3D building model formats for use across constructive, visual, and serious game runtime applications. For added customisation, users can export Building Designer models for enhancement in popular 3D modeling applications, such as 3ds Max and Oxygen.
At the show, TerraSim unveils Xtract for VBS3, a new service for constructing correlated terrain content from existing VBS terrains. Through this service, TerraSim can extract geospatial content – including elevation, phototexture, surface mask, vector content, and more – from legacy VBS environments for customers to then enhance and reuse in newer terrain environments.

I/ITSEC 2015: Thales SAGITTARIUS Evolution - A Small Evolution in Small Arms Trainers

With more than 260 operational systems worldwide, Thales’ SAGITTARIUS Evolution is a small arms trainer that covers all areas of marksmanship training from law enforcement applications, close quarter combat up to full military battlefield engagement training. Using the latest CryENGINE rendering technology provided by Crytek, SAGITTARIUS Evolution provides both virtual and live firing training capability for individual and unit training.

Using the latest CryENGINE rendering technology provided by Crytek, Thales' SAGITTARIUS Evolution provides both virtual and live firing training capability for individual and unit training. (Screenshots: Thales)
SAGITTARIUS Evolution has a flexible and scalable architecture across the entire product line, and can therefore be linked to other systems and/or additional modules, such as the door gunner, vehicle, or boat modules. SAGITTARIUS Evolution features a new component-based architecture, which enables it to be flexibly configured to the needs of the user. Thanks to a common platform for various training modules (e.g., basic shooting skills, land, air and marine equipment, joint fire support team), the system is scalable. SAGITTARIUS Evolution's individual modules and complete systems can be networked, making it possible to provide training for combined scenarios both at the local site and in concert with other sites.


CryENGINE 3 provides highly realistic rendering which, in conjunction with integrated 3D sound, creates a realistic training environment. Furthermore, an integrated training management system makes the system very effective for users: the available features help reduce trainers' workload by taking care of some of the preparatory and post-training tasks. Working from the CryENGINE source code, Thales introduced physics-based ballistics into the “game”, making it more valuable for law enforcement and military customers.

The built-in artificial intelligence (AI) of the computer-generated forces (own, neutral and enemy) raises the training and awareness impact: the situation changes depending on what the user does, and realistically simulated scenario conditions (e.g., terrain, time of day, environmental conditions like wind strength and direction) affect the way the simulation evolves and require the user to adapt his or her behaviour, which makes the training very realistic. New training concepts using AI for randomised avatar behaviour also give a better picture of the situational, perceived, and behavioural threat. Furthermore, SAGITTARIUS Evolution enables dynamic trainee-avatar interaction using the trainee's natural voice as simulation input, allowing for complex scenarios like ID checks, vehicle inspections, and searches of buildings and people.


Last year, Thales Deutschland was contracted to integrate a new weapon into SAGITTARIUS Evolution: the new multi-purpose shoulder-launched weapon, the so-called Wirkmittel 90 (aka Dynamit Nobel Defence's [DND] RGW 90). Developed for the Bundeswehr, this intelligent weapon includes an attachable Fire Guidance Visor (FGV) and is used with different types of ammunition, all within the Wirkmittel 90 family. The Thales Deutschland site in Koblenz was contracted to integrate this weapon into the Bundeswehr's existing small arms trainer (SAGITTARIUS). The simulation version of the Wirkmittel 90 was made available prior to the Initial Operational Capability (IOC) of the real weapon. For this purpose, the small arms trainer was updated to the newest simulation platform (SAGITTARIUS Evolution) to provide a close-to-reality training experience. During the integration of the Wirkmittel 90, the built-in intelligence of the FGV, with its weapon sequence control and control of the display symbology, first had to be rebuilt for simulation use. Afterwards, the initial ammunition types (blast/explosive fragmentation) were integrated into the simulator. The customer was very satisfied with the results and awarded Thales a follow-on order to integrate further ammunition types, e.g. smoke, illumination, and anti-structure rounds.
The results of the project show that the new-generation SAGITTARIUS Evolution provides a valuable contribution to training in the use of complex weapon systems like the Wirkmittel 90, especially when simulated training is to be provided in parallel with the introduction of the real weapon. The benefit of optimised and timely weapon training may be the edge that leads to successful mission accomplishment in critical operations.

The SAGITTARIUS Evolution Mobile system, built into MIL-standard certified boxes for easy transport, comprises building blocks for a modular system that fits the training need, enabling up to two independent trainees with primary and secondary weapons at the same time. Both tethered and wireless weapons can be used, although the tethered version is currently the preferred training method. The system is ready to train in 30 minutes, including set-up.




I/ITSEC 2015: Database Generation for Maritime Simulation

Simulation users have become accustomed to the high visual quality provided by today's computer games, which has led to the increasing importance of serious games in the simulation and training industry. However, traditional tools and modelling techniques struggle to deliver comparable results within the constrained budgets of simulation projects, especially considering the large virtual terrain dimensions demanded.

TrianGraphics uses Delaunay triangulation from the exact coast line, depth contour lines, areas, and depth points (soundings) to form a fully featured database targeting maritime simulation. (Screenshot: TrianGraphics)

The Berlin-based company TrianGraphics has developed a novel database generation system that meets these demands for quality and quantity through an extraordinary high level of automation. Besides traditional landscapes for flight, combat, or driving simulation, Trian3DBuilder now also supports large-scale maritime terrains for simulation.

A typical terrain project is set up from a multitude of input data, typically satellite imagery plus height and vector data in miscellaneous formats. Depending on the input attributes, generation features can be applied and a terrain is written as a visual database with metadata for various additional simulation tasks, such as computer generated forces (CGF).

For maritime simulation, so-called ENC (Electronic Navigational Chart) vector data is imported, containing all the information found in nautical charts. The standard formats used are S-57 or the encrypted S-63 format. The data is sorted based on the ENC code, and all attributes are used on import for post-processing and preserved for later use.

The seabed is Delaunay-triangulated from the exact coastline, as well as depth contour lines, depth areas, and depth points (soundings). This is combined with a multitude of further terrain generation techniques to form a fully featured database targeting maritime simulation with only a few hours of setup time. Buildings are created from footprints; rails and profiled roads with junctions are automatically generated from centre lines; and canals, rivers, and seas are cut into the terrain mesh. The terrain is further “beautified” by adding vegetation such as huge forest areas and specific models via point object placement.
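The triangulation step can be illustrated with an off-the-shelf routine. In this sketch, the sounding coordinates and depth values are invented, and `scipy.spatial.Delaunay` merely stands in for whatever triangulator Trian3DBuilder actually uses; the point is that triangulating the horizontal positions of soundings yields a mesh whose vertices carry depth.

```python
import numpy as np
from scipy.spatial import Delaunay

# Hypothetical soundings: (x, y) positions with measured depths, standing in
# for points extracted from ENC depth layers (values invented for illustration).
points = np.array([
    [0.0, 0.0], [1.0, 0.0], [2.0, 0.5],
    [0.5, 1.0], [1.5, 1.2], [0.2, 2.0], [1.8, 2.1],
])
depths = np.array([0.0, 2.5, 4.0, 3.0, 5.5, 1.0, 6.0])  # metres below datum

# 2D Delaunay triangulation over the horizontal positions; each resulting
# simplex indexes three soundings, giving a depth value per triangle vertex.
tri = Delaunay(points)
mesh = [(simplex, depths[simplex].mean()) for simplex in tri.simplices]
```

In a real pipeline, coastline and depth-contour vertices would be added as constraints so triangle edges follow those lines before the mesh is exported to a runtime format.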

ENC data is imported and combined with additional data sources to create a fully featured terrain of unlimited size. (Screenshot: TrianGraphics)

Especially important for nautical training is the correct placement of buoys. Each buoy is imported with specific model and topmark, as well as exact light assignment including colour, direction, range, intensity, and blink codes. Piers and shoreline constructions are also added.

Once all data has been imported and the project set up, the data can optionally be edited and enhanced in the database generation system. The result is optimised for real-time rendering and can be exported to a variety of well-known standard formats like OpenFlight, FBX, or VBS.

Modern software tools like Trian3DBuilder drastically simplify the generation process for large-scale 3D terrains. Now maritime simulations, with their very special demands, are also targeted, enabling users to generate densely populated landscapes of unlimited size. This demand cannot be met with traditional modelling tools, due to technical limitations and, even more so, to the huge amount of manual work that would need to be invested. TrianGraphics' solutions meet these demands for quality and quantity, giving users what they really need.

For more information please see MILITARY TECHNOLOGY #12/2015, available at the show on booth #453; and frequently check back for more NEWS FROM THE FLOOR.

I/ITSEC 2015: Leidos Showcases Integrated Solutions to Simplify Customer's World

Leidos at I/ITSEC 2015 showcases:

  • A full-scale, modular airborne demonstration featuring a platform-independent foundation for customised ISR collection, processing, and analysis;
  • Visualisation and data analytics tools capable of providing advanced threat detection to accelerate the delivery of real-time, forensic intelligence to decision makers;
  • Advanced cybersecurity solutions that can help network operators see beyond their boundaries, shorten response times, and stop threats before damage is done;
  • Singular, open-source architecture tools offering a user-friendly solution to database consolidation for enhanced soldier agility; and
  • Innovation in Health IT and human performance solutions offering advanced healthcare solutions for the US Defense Department's 9.6 million military beneficiaries.

I/ITSEC 2015: Elbit Systems to Deliver Combat Training System to Poland’s Special Forces

Elbit Systems has been chosen to deliver a live Combat Training System (CTS) and equipment to Polish SOF. It is an advanced, highly realistic simulation-based trainer that has been specifically designed for marines and SOF, ranging from individual soldiers up to full-size units. The CTS can be used to simulate a wide array of operational conditions and real-world elements, including those found in urban, rural and indoor environments.

Elbit Systems’ Combat Training System, in cooperation with RUAG Defence and Autocomp Management, brings great realism and immersion to special operations training. (Photo: Elbit Systems)

As part of the project, Elbit Systems, in cooperation with RUAG Defence and Autocomp Management, will deliver live-fire training suites for soldiers, vehicles and vessels, simulation equipment for use in training facilities, as well as inert explosives (simulants), grenades and suicide bomber kits. In addition, it will provide Digital Video Recording (DVR) and data-link capabilities that enable network training to and from a centralized management control, across all training forces and equipment. This will make it possible to record, monitor, and post-analyse the entire training process.

“Elbit's CTS was chosen, among many competitors, due to its superior performance and reliability,” Yoram Shmuely, General Manager of Elbit Systems' Aerospace Division explained. “We are very proud to partner with RUAG Defence and Autocomp Management in this project, and hope that it will further strengthen our cooperation in the field of land-based training and simulation, both in the region and beyond.”

For more information please see MILITARY TECHNOLOGY #12/2015, available at the show on booth #453; and frequently check back for more NEWS FROM THE FLOOR.

I/ITSEC 2015: Virtual Reality and Real World Combined in Training

The increasing training requirements of users and the need for a broader, cross-branch approach to simulator training place ever-greater demands on the networking of simulators.

Patria’s competence and experience in the field of training systems, achieved through long-term efforts and perseverance, date from the integration of the national data link system with Finland’s HORNET simulators. The network expanded over the years and its technology was updated to High Level Architecture (HLA). These years spent working with the national data link system have built a strong foundation of competence for Patria in a challenging operating environment.

(Graphic: Patria and Finnish Air Force)

Today, Patria demonstrates its expertise in training systems through the networking of increasingly distributed training systems. If required, the HLA standard enables the implementation of a joint virtual training environment for soldiers providing fire control data on the ground and fighter pilots in the air. HLA also makes joint exercises between the simulators of different service branches possible. The HLA standard is published by the IEEE and adopted by NATO as STANAG 4603.

In 2014, Patria supplied the Finnish Air Force with a system solution that enables the use of realistic high fidelity Computer Generated Forces in HAWK simulator training. This technological solution enables the networking of HAWK simulators with HORNET simulators and, in the future, with modern anti-aircraft simulators.

With the help of a networked training system based on a virtual environment, all personnel materially involved in the application of firepower can be trained in the same exercise scenario. A good example of such a scenario is artillery fire control, in which the observers, the desired extent of the indirect fire command chain and, if necessary, weapons system simulators can all be involved in the same exercise. In such training environments, decision-making and the command chain are implemented according to the actual organisation and using actual command systems.

Embedded Training

The boundary between the simulated and real worlds is disappearing, as systems in operative use enable the production and reception of simulated information in real-life training scenarios. For example, Embedded Training capabilities integrated with defence systems enable the execution of extensive air combat exercises against entirely synthetic threats. This enables the exercise to be carried out without tying a significant number of fighters and pilots down in flying as the opposing forces.

The implementation of a system as described in the previous example requires setting up a synthetic threat environment and transmitting the data it generates between the aircraft and a ground station. As an added benefit, in addition to the significant cost savings, the performance characteristics of the desired threats can easily be implemented for the synthetic targets, which would be impossible when using an opposing force consisting of one's own fighters or jet trainers.

Embedded Training can also be used to expand the capabilities of the users' systems, for example by simulating an entirely virtual radar for a jet trainer that reacts like an actual radar to both real aircraft and synthetic threats in the same exercise network. The creation of virtual weapons systems that can be used to engage targets in the exercise network is also possible. In weapons systems, embedded training capabilities enable practising the use of modern weapons systems with computer-based interfaces in training mode without expending ammunition or requiring the precautions and shooting ranges of live-fire exercises.

At Patria, our challenge is to integrate systems implemented with different technologies into seamless interoperation, while simultaneously creating training scenarios that are realistic but might be unfeasible to execute in real life. Patria’s customer-oriented mode of operations provides the conditions for success in technically complex yet cost-effective projects.

For more information please see MILITARY TECHNOLOGY #12/2015, available at the show on booth #453; and frequently check back for more NEWS FROM THE FLOOR.

I/ITSEC 2015: Cubic Showcases Effective and Efficient Solutions

Cubic Global Defense (CGD), a business unit of Cubic Corporation, demonstrates a range of innovative technologies developed to increase the combat readiness of fighter pilots and sailors. Cubic showcases various air and maritime training solutions, including:

  • P5 Combat Training System (P5CTS): The P5CTS is Cubic’s latest generation air combat maneuvering instrumentation (ACMI) system for both fourth and fifth generation fighter aircraft. The P5CTS provides fighter pilots and range training officers with the ability to display training missions in real time and during post-mission debrief. The P5CTS is operational at more than 35 locations worldwide and has supported more than one million sorties. The F-35 will also be delivered with an embedded version of the P5CTS.
  • Immersive Maritime Operator & Maintainer Courseware: This courseware is Cubic's demonstration of interactive procedural operations and maintenance training. Set within a virtual, immersive shipboard environment, trainees perform tasks in a realistic multi-user environment. The courseware allows trainees to learn to function within complex shipboard environments at a fraction of the cost of live training.
  • Aircraft Cabin Crew Door Operations: This demonstration features a personalized training avatar that provides commercial airline cabin crew with orientation and operational assessments. Training includes interactive mini-games to evaluate the trainee’s knowledge of key door components, and a four-phased approach to learn how to properly open the main door.
For more information please see MILITARY TECHNOLOGY #12/2015, available at the show on booth #453; and frequently check back for more NEWS FROM THE FLOOR.


I/ITSEC 2015: DiSTI Releases GL Studio 5.1

The DiSTI Corporation, a worldwide leading provider of graphical user interface development software, announces the release of GL Studio 5.1, the industry-leading high-fidelity user interface development toolkit. The newly launched UI development environment features customisable workflows that deliver ease of use for User Interface Designers while preserving the performance, flexibility, and safety that Software Programmers rely on. The GL Studio environment now enhances collaboration between these two development roles, enabling a seamless, intuitive, and efficient user interface production pipeline.

GL Studio 5.1 gives UI Designers more control during the development stage with new project-specific workflows. The customisable project workflows, created by the Software Development team, facilitate UI development compliance with the target system. This new version also offers designers a simplified importing process with support for drag-and-drop of art assets into the GL Studio Designer. GL Studio instantly creates project-specific user interface elements based on the file type. The designer workflow and runtime maintain awareness of any art asset manipulations made after importing the content. If GL Studio senses source file changes, the user interface assets update while preserving all post-import manipulations, dramatically reducing the impact of art changes. UI Designers can also instantly preview and test UI functionality in any modern web browser with a one-click publishing step to WebGL.

GL Studio 5.1 enables Software Programmers to create custom controls and objects for the GL Studio Designer with project-specific scripts. The Programmer's customised controls and objects extend the out-of-the-box functionality of GL Studio, giving teams control of their project workflow. Once configured by the Software Development team, UI Designers use these objects and controls in a code-free manner to produce hardware-ready user interfaces. Developers also benefit from one-click publishing to their target hardware via the project-specific scripts, which significantly reduces design iteration cycles.

“Our new customizable workflow is a revolutionary UI development capability that clearly shows why GL Studio is the clear choice for producing high-fidelity displays,” Joe Swinski, President of The DiSTI Corporation, explained. “Our primary objective is to supply a fast, easy, and flexible user interface development environment, and GL Studio 5.1 delivers.”

Over 4,000 end users and 700 customers worldwide, including Boeing, Calsonic Kansei, Continental, ESG, Jaguar Land Rover, Honeywell, Lockheed Martin, NASA, Nissan, Raytheon, TATA, Thales, and Virgin Galactic, use DiSTI solutions to build safety-critical and non-safety-critical embedded HMI displays and maintenance trainers, and to create PC- and Internet-based courseware and simulators. As a full-service provider, DiSTI offers a complement of custom programming and development services, and is a recognised leader in training solutions for the global simulation and training community.

I/ITSEC 2015: Meggitt Training Systems Launches New Virtual Reality Solution

Meggitt Training Systems introduces the FATS®100e, an evolutionary step forward from previous virtual-reality solutions and an enhancement of its FATS100 system, at booth 1238. The advanced FATS100e features Crytek-based 3D lanes, automatic coaching and VBS3-based collective training.

Meggitt said the 3D lanes provide visually realistic and highly detailed terrains and targets, including weather and visual effects such as wind-blown environments, birds in flight, dirt splashes and explosions. It also provides weapon-handling and shot-placement analytics, coaching tools that automatically highlight trainee results and enhanced graphic capabilities.

"Meggitt's introduction of the FATS100e demonstrates our continued investment in virtual reality solutions," Phyllis Pearce, senior vice president of strategy, sales and marketing, Meggitt Training Systems, said. "Attendees at I/ITSEC will be the first to experience the next-generation FATS100e, and we're confident our virtual training solutions will continue to play a vital role in the mission success of defense forces around the world."

The new FATS100e system solution is an extension of the proven and popular FATS M100 and a major expansion in weapons training capability. Automatic coaching has never before been available in the small-arms training market, and collective training is new to the VBS3 system.

The new system provides an impressive array of functionality for both instructor and trainee, delivering solid weapon handling and shot placement analytics, coaching tools that automatically highlight trainee results for reinforcement or correction, and enhanced graphic capabilities for an all-encompassing immersive training platform. With the new features and expansions, FATS100e will offer unmatched training advancements delivered per US Army and US Marine Corps instructional requirements, including the latest technological developments to meet the needs of combat forces worldwide.

For more information please see the Meggitt Training Systems interview in MILITARY TECHNOLOGY #12/2015, available at the show on booth #453; and frequently check back for more NEWS FROM THE FLOOR.

I/ITSEC 2015: Simulation and Training Bosses (SATB) Series - Selected Industrial Views

On the occasion of I/ITSEC, MILITARY TECHNOLOGY publishes the annual Simulation and Training Bosses (SATB) Series, which conveys the thoughts and messages of the world's defence simulation and training leaders in answer to the question: “What drives your business to develop new innovations for national defence and regional security obligations as well as for global export?”

----------------------------------------------------------------

CAE

Gene Colabatistto, CAE Group President, Defence & Security
What drives CAE can really be found in our company vision – to be the global training partner of choice in helping our customers enhance safety, efficiency, and readiness. For the defence market, it is really the “readiness” component that is most critical and defence forces around the world are constantly challenged to cost-effectively maintain readiness.

One of the ways to be safer, more efficient and more cost-effective is to increasingly leverage simulation-based training. We are already seeing the balance of live and virtual training shifting more toward virtual training, and you can see this evidenced in the strategy and direction of many militaries, such as the Royal Canadian Air Force's Simulation Strategy 2025. This is not to say live training is going away, but we do see defence forces taking a very close look at the overall training enterprise and asking themselves whether their current training approach allows them to most efficiently and effectively accomplish their objectives. As a company specialising in training and serving as a training systems integrator, we can be a good partner to defence forces looking holistically at their training enterprise.

We are also driven to continually invest in new technologies and innovations that will enable intelligent training and help our defence customers accomplish more of their training in a virtual world. To do this, simulation-based training needs to be more networked, more interoperable, and more immersive.

Defence customers have been consistently saying that they expect the use of simulation to grow in what many call distributed mission operations. It is simply cost-prohibitive and incredibly time-consuming to conduct massive live training exercises, so we certainly expect “virtual Red Flag”-type exercises to become more commonplace. A great example was the Coalition Virtual Flag exercise that took place this past summer between the US, UK and Australia in parallel with Exercise Red Flag at Nellis AFB. CAE supported the Royal Australian Air Force in linking and networking simulators in Australia to the live exercise taking place in the United States – a true example of live-virtual-constructive training systems integration.

CAE is investing in R&D that makes virtual environments more integrated and immersive so that these kinds of distributed mission training exercises deliver the expected benefits. In fact, most of the simulators and training devices we deliver are fully capable of being networked; now it is up to the defence forces to leverage these capabilities in their overall training enterprise. CAE is actively promoting standard architectures like the Common Database (CDB) that help provide the foundation for making networked, distributed mission training in simulation more routine.

Coalition Virtual Flag exercise involving the Royal Australian Air Force. (Photo: CAE)
------------------------------------------------------------------

Cubic 

Bill Toti, Cubic Global Defense President (Photo: Cubic)
What drives our business? In one word: the warfighter.

I remember back to my early days in the US Navy submarine force. Unless there was another submarine for us to train against, the quality of our anti-submarine warfare training was pretty poor. However, by the time I was a submarine captain, simulation training had improved to the point where I could go to the sonar room and inject a simulated target into the picture, and my crew could not tell whether it was a real submarine or a simulated one. Thanks to such advancements, the quality of our training greatly improved, while the cost significantly declined. In other words, you no longer need a real submarine target to have quality training!

At Cubic, we do not do submarine training systems (yet). Our current focus is on ground and air combat forces training, and I realised immediately that great opportunity lies in bringing ground and air combat forces up to the same level of simulation fidelity that we enjoyed in the American submarine force.

This kind of insight drives us to create new solutions for our global market. We take a holistic view of our portfolio and look to design and field solutions that will benefit the widest customer base. Our existing global footprint also affords us the opportunity to develop close relationships with our customers through our regional businesses and field offices. These relationships yield many insights into the best and most pressing innovative products that we can offer.

Our holistic view guides the management of our innovation portfolio, which is distributed across near-term investments and longer-term “game-changing” technologies. Our near-term investments have a shorter time horizon, often one to two years, and allow us to field upgrades and new capabilities quickly to our current customers. We are constantly looking for ways to offer greater usability of our systems and often co-create with customers to bring these capabilities to bear.

This is not just “good business” for us – but it directly benefits our customers since many of them are required to interoperate at international training events and field exercises.

Cubic’s embedded instrumentation provides an air combat training system for the F-35 LIGHTNING II, for which the company has developed a special internally mounted version of the P5CTS/TCTS designed for the unique F-35 environment. (Photo: Cubic)
------------------------------------------------------------------

DiSTI

Joe Swinski, President DiSTI (Photo: DiSTI)
The age of millennials is upon us. Children born and raised in a world where they have always used computers are now entering the services or the workforce, replacing baby boomers at an ever-increasing rate. Millennials are digital natives who have used computers for education and entertainment since they were first capable of holding a mouse. Using computers and technology to learn and to expand their knowledge domain is second nature to them. At DiSTI, we strive to expand the technology envelope to deliver an immersive and engaging training experience for this tech-savvy generation. This includes developing compelling, immersive, and interactive 3D training solutions that will teach a future generation of maintainers.

As our weapons systems grow in complexity and sophistication, our training methods for delivering lesson materials need to continue to improve. For the past decade, The DiSTI Corporation’s focus has been on how to create these 3D virtual maintenance environments in an efficient semi-autonomous fashion. Going beyond making a pretty picture, efficient virtual environment creation means providing support for project requirements analysis, source data management, and automated software builds and regression testing. DiSTI delivers this capability in our new development toolkit, VE STUDIO.

To date, this tool chain has produced dozens of environments for maintenance trainers across a variety of platforms, including jet fighters, cargo aircraft, surveillance aircraft, attack and cargo helicopters, unmanned aircraft, naval vessels, submersibles, and tactical vehicles. Among the many benefits of these training devices is the ability for maintenance personnel to work in teams, just as they will on the flight line. The environments allow the maintainer to work with fully autonomous teammates or networked with other live maintainers. The success of this approach has proven itself time and again over the past decade, most notably in the F-35 Aircraft Systems Maintenance Trainer (ASMT), where maintainers were learning how to repair the aircraft before the first squadrons were even fielded.

Staff Sgt. Guin Duprey I, of the 31st Test and Evaluation Squadron, Edwards AFB, FL, familiarises himself with the F-35 using the desktop virtual-reality aircraft systems maintenance trainer, as well as a laptop loaded with joint technical data that is used for flight line operations. (Photo: USAF/Maj. Karen Roganov)
------------------------------------------------------------------

Elbit Systems

Bezhalel (Butzi) Machlis, Elbit Systems President and CEO (Photo: Elbit Systems)
In recent years we have been witnessing a proliferation of simulation-based training worldwide. This includes a shift from the longstanding concept that separates individual training from command and control training, to a combined approach that increases force interoperability through the use of Mission Training Centers (MTCs).

Elbit Systems' comprehensive scope of MTCs answers that need.

At the heart of this is Sky Breaker – a networked training centre for tactical training of combat formations. The centre is considered a world-leading solution for virtual-constructive (VC) mission training.

Other Elbit MTCs include the GMTC (Ground Mission Training Center) for advanced tactical training of ground forces, and the ICTT (Incident Command Team Training) for training of homeland security personnel.

MTCs have also been delivered to clients across the globe.

What is an MTC (Mission Training Center) and what are its key advantages over other training systems?
The MTC’s strength is its ability to facilitate a large-scale arena with a high level of interaction between all training elements, while also ensuring an intense and sustained individualised training session for each trainee. In addition, the MTC enables forces to perform mission rehearsals with much greater realism, allowing them to enhance operational readiness. Another important advantage is its operational flexibility, as it can easily be adjusted to evolving challenges.

Our MTCs are able to deliver these capabilities through the use of cutting-edge simulation and training technologies, which include:

  • Powerful computer generated forces (CGF), based on Artificial Intelligence (AI) technology – dramatically reducing the operating costs of systems and platforms. 
  • Integrated C4I systems and real radio, capable of creating and delivering a realistic representation of the entire C2 process.
  • Custom-fit trainee stations which enable flexible and customised training of all combat teams.
  • Site management systems for organizing and monitoring large-scale training sites and their resources.
  • Advanced editors capable of changing the training settings to meet rapidly evolving mission requirements.
  • Data gathering tools for carrying out effective after-training analyses, at both individual and collective levels.


Incorporated into Elbit's “One Sim” infrastructure, these technological capabilities have proven themselves and continue to push the boundaries of simulation-based mission training. MTC facilities help us continue to develop industry-leading solutions for both present and future projects.

(Photo: Elbit Systems)
------------------------------------------------------------------

Lockheed Martin 

Lockheed Martin Mission Systems and Training (Photo: Lockheed Martin)

Amplifying Simulation to Redefine Military Training

As militaries continue to face tough budget decisions while maintaining readiness in an evolving global security environment, the training industry simply must advance technology to support more realistic and robust training. Lockheed Martin innovates solutions to address our customers’ most challenging problems because we know that simulation saves money and training saves lives.

Training System Integration

As a systems integrator, Lockheed Martin develops training programmes that provide the shortest path to learning. Through our Human Performance Engineering methodology, we apply a “family of systems” approach where we match technology to each learning objective with a focus on continually improving student outcomes. This philosophy underpins our F-35 Training System, the world’s most advanced simulation-based training environment. Through next-generation programmes like the T-X, we see increased opportunities for our holistic approach that combines leading-edge technologies into agile, low-risk systems for mission readiness.

Turn-key Training 

Taking training system integration to the next level, our turn-key training programmes provide performance-based training solutions delivered as a service. We partner with our customers and bring all components of the training system together including simulators, facilities, instructors, curriculum, live training platforms and financing. This approach establishes a predictable training cost over decades with built-in modernisation. The result is increased graduate skill sets, shorter training times and more affordable training, all while exceeding performance requirements.

Integrated Live-Virtual-Constructive Training 

We see tremendous potential to provide affordable mission readiness by increasing ground-based training and decreasing live training, including through the use of integrated Live, Virtual and Constructive (LVC) environments. Preparing military personnel for the complex challenges and threats on the horizon requires the ability to train virtually on scenarios that can’t be affordably replicated in live environments. Such training requires integrating LVC elements into one realistic combat experience. For instance, LVC enables pilots to fly within existing, limited physical airspace constraints while simulating challenges outside of that physical airspace for a broader training envelope.

Customers frequently tell us about the importance of effective training in today’s complex operational environment. Often, their stories tell us how our training technologies helped save lives or accomplish their mission flawlessly. That’s why we are intently focused on amplifying the power of simulation to redefine next-generation training.

The UK’s Military Flying Training System delivers a modern, streamlined flight training solution for the British RAF, Royal Navy and Army Air Corps. This turn-key training programme is a partnership between the UK MoD and Ascent, a joint venture of Babcock International and Lockheed Martin. (Photo: Lockheed Martin)
------------------------------------------------------------------

Meggitt Training Systems 

Ron Vadas, President Meggitt Training Systems (Photo: Meggitt Training Systems)

For more than 90 years, Meggitt Training Systems has led the global training arena through its legacy companies: FATS®, for military and law enforcement virtual weapons training, and Caswell International, for military and law enforcement live-fire systems.

Our customers must complete diverse missions under unprecedented budgetary challenges. They need cutting-edge solutions to prepare them for the field. To train military and defence forces around the world, Meggitt offers the FATS M100 small-arms trainer, an open-architecture system that provides for efficient integration of evolving simulation technologies and software products, allowing for customisation and a wide array of options, including enhancements with multiple through-sight devices and multiple image generators; support for up to 20 weapon simulators; a shot analysis display that identifies weapon aim points, shot location and rounds fired; night-vision training; and after-action review.

The newly announced FATS 100e system solution is an extension of the FATS M100, as well as a major expansion in weapons training capabilities. The next-generation system’s new product enhancements include Crytek-based 3D marksmanship, automatic coaching tools and VBS3 collective training.

We’re always looking toward the future, and we have our eye on several innovations for upcoming solutions, including home-station training, trainee immersion and virtual/live-fire integrated training.

There is a big push to keep units proficient and ready without the expense of pre-deployment training. Home-station virtual training provides a cost-effective complement to live training, especially as an alternative to deploying a unit to a distant location.

For trainee immersion, the intent is to use various elements to increase immersion in the virtual training experience to suspend disbelief and create physical responses similar to those in actual events. A full suite of immersive tools would be used to engage all senses during the virtual training event.

In the Middle East there are requests to fuse training situations, including virtual scenarios and live-fire weapons in live-fire shoot houses, while in Asia, we were the single provider of a comprehensive training complex with a multi-floor, multi-range concept.

These are the kinds of innovations that will help ensure Meggitt remains a world leader in the global training arena.

The FATS® M100 simulator series supports multiple, simultaneous simulation and training modes using a flexible architecture that allows customisation of Meggitt Training Systems' courseware and systems. This allows the easy integration of evolving simulation technologies and software products. (Photo: Meggitt Training Systems)
------------------------------------------------------------------

Raytheon 


Bob Williams, Raytheon Global Training Solutions Vice President (Photo: Raytheon)
Our company’s vision is to create trusted, innovative solutions to make the world a safer place. We take pride in preparing people for the world’s most important missions. We do this by developing and providing the most effective training solutions to ensure our customers’ success because failure is not an option.

In his recent presentation at the AUSA annual meeting, Army Secretary John McHugh urged more efforts to educate the US Congress about real-world challenges the US Army faces. He also encouraged a stable, predictable budget so that they can strategically plan for training to meet these challenges.

In this age of budget uncertainty, and facing adversaries with technology comparable to ours, training will continue to give our warfighters a decisive advantage on the battlefields of tomorrow. As weapons systems become more complex, warfighters have to learn how to use this technology not only as effectively as possible, but also in new ways. Raytheon does this by fully leveraging immersive training environments. By blending live, virtual, constructive and gaming domains in ways that also optimise scarce training resources, we not only provide the most realistic and comprehensive training environments possible, we also contain training costs. Since 2008, we have trained virtually every US Army soldier while saving more than $400 million in training sustainment costs.

We also leverage technology to add verisimilitude to training environments, using battlefield special effects that teach soldiers how to best leverage technology to fight and succeed. For example, Raytheon has refreshed a 32-acre Combat Town in central Louisiana with more than 700 sensors, cameras and special effects to better prepare soldiers to win in urban environments.

We have also helped develop a mobile, linked training system, called the Joint Pacific Multinational Readiness Capability, which allows units to train virtually and constructively together, even when geographically separated by hundreds of miles. Raytheon has also created a video game-based trainer to help Army PATRIOT crews learn how to reload missile batteries.

By leveraging technology and working to take cost out of training, Raytheon can help ensure those who protect us get the best training possible.

------------------------------------------------------------------

Rheinmetall

Ulrich Sasse, Managing Director of Rheinmetall Defence Electronics, President, Simulation and Training Division (Photo: Rheinmetall)
Simulation and training have been an integral part of Rheinmetall’s business for more than 40 years. During this long period of time the Group has never stopped innovating, enabling it to meet ever-more demanding customer requirements while simultaneously embracing the latest technology. This culture of innovation has made Rheinmetall one of the world’s leading suppliers of military simulation and training solutions.

With military systems becoming increasingly complex, customers expect cost-effective training solutions during the entire product lifecycle. Rheinmetall offers customised solutions ranging from basic e-learning and part-task training right through to sophisticated simulators and complex training centres. This enables the Group to adapt flexibly to customer budgets while maintaining the high quality standards associated with the words “Made in Germany.”

An important driver here is the changing global security landscape. When it comes to training, armed forces all over the world are focusing more and more on mobility, foreign deployments and joint operations. Rheinmetall already anticipated this trend several years ago, launching products and services that meet these requirements, including a mobile combat training centre solution in its LEGATUS product line, a deployable set of joint tactical training cubicles known as ANTARES and the Group’s total ship training concept.

Technology is advancing in other industries, too. Rheinmetall engineers continuously monitor and evaluate all the latest trends, while the Group’s sales force discusses their possible application in future products with our customers. This drives the Rheinmetall innovation process, enabling the Group to combine developments such as virtual reality or popular gaming databases with superior simulation models based on unsurpassed mathematical precision and realism. This way, customers benefit from high-end military simulation and state-of-the-art technology.

Rheinmetall understands that its future strategy needs to be shaped in close co-operation with its customers. In an uncertain world, customers need a reliable partner able to deliver training solutions that assure excellent performance by soldiers, airmen and sailors when it matters most. At the same time, Rheinmetall is committed to thinking ahead, anticipating future security scenarios and enabling customers to contend effectively with emerging threats and mission requirements.

IDZ-ES with LIVE simulation equipment. (Photo: Rheinmetall)

------------------------------------------------------------------

RUAG

Oliver Meyer, Senior Vice President Simulation & Training, RUAG Defence  (Photo: RUAG) 
We are driven by the constant changes and uncertainty surrounding the global security situation. Forces today are required to be more agile and prepared to respond across a wider spectrum of conflicts, ranging from internal conflicts through hybrid warfare to a conventional “show of force.” We see our role as supporting both our domestic and international customers in facing this changing world, by providing innovative new technology and solutions to make our customers’ training even more effective.

Besides state-of-the-art equipment, soldiers’ skills and experience are crucial to a force’s ability to respond. However, training must now cover more requirements with lower budgets and less available time – effectiveness and efficiency are how we respond to this challenge. Modern training systems have to prepare all levels of the command chain for all possible scenarios. Additionally, training has to accompany soldiers and command staff permanently throughout the mission preparation cycle and the mission itself. To do this, a customised mixture of Virtual, Live, Constructive and Embedded solutions is key to sustainable, flexible and effective training. The speed of modern operations, especially “military operations other than war” (MOOTW), forces our partners to adopt a comprehensive approach to training and a robust system to promptly integrate lessons learned into their training doctrines – and get their people trained quickly. An effective combination of realistic training and accessibility for all users remains a challenge for many forces, and so an area where we are looking to innovate.

RUAG Defence training systems provide a high level of fidelity and quick accessibility; the premise for our development is the idea that “the soldier should not come to the training solution – it should come to him.” Mobility, interoperability and usability are the key words that drive our innovations for the future. These features are found throughout our whole portfolio, from our mobile virtual shooting range SITTAL, through our Mobile Live Training and Constructive Training capabilities, to our Embedded Training Solutions. RUAG Defence is the international partner for armed forces seeking effective training in a fast-changing world, improving soldiers’ performance and skills in every environment and for every mission.

GLADIATOR is a new high-tech Swiss training system developed for the realistic training of special units, police and combat troops, from group to brigade level. It can be extended in three modular stages, starting from the Basic variant, a cost-effective harness for day-to-day training deployments. It makes it possible to practise firing and movement within the widest possible variety of training scenarios in open and built-up terrain, without involving live ammunition. (Photo: RUAG)
------------------------------------------------------------------

TRU Simulation 

Ian Walsh, TRU Simulation + Training President and CEO (Photo: TRU Simulation)

As the world has grown more complex, the balance between peace and conflict has become more precarious. With strained defence budgets and the need for our warfighters to be better trained than ever before, simulation has an impactful role to play. The challenge is how to equip our service men and women to do more with less and, at the same time, ensure their safety and preparedness – and that of our great nation and allies.

According to NTSA reporting, almost 60% of military tasks trained have Transfer Effectiveness Ratios greater than 0.33. So for every three hours spent in a simulator, one hour of actual flight time training for those tasks could be eliminated.  With a cost per actual flight hour of $5,000—for an F-16, for example—and a cost of $500 per simulated hour, the value of simulation goes straight to the military budget bottom line.
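The arithmetic behind that claim can be sketched in a few lines. This is an illustrative calculation only, using the figures quoted above; the helper function and its parameters are hypothetical, not an NTSA formula:

```python
def simulator_savings(live_cost_per_hr, sim_cost_per_hr, ter, sim_hours):
    """Estimate net savings when simulator hours substitute for live flight hours.

    A Transfer Effectiveness Ratio (TER) of 0.33 means roughly one live
    flight hour is displaced by every three simulator hours.
    """
    live_hours_displaced = ter * sim_hours
    savings = live_hours_displaced * live_cost_per_hr - sim_hours * sim_cost_per_hr
    return live_hours_displaced, savings

# The article's illustrative F-16 figures: $5,000 per live hour, $500 per simulated hour.
displaced, net = simulator_savings(5000, 500, ter=0.33, sim_hours=3)
# Three simulator hours displace roughly one live hour, at a fraction of the cost.
```

Even at these rough numbers, the simulator session costs $1,500 against roughly $5,000 of live flying displaced, which is why the value "goes straight to the military budget bottom line."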

Then, there is the fact that pilot training is only one aspect of preparedness.

Maintenance technician training is equally important to overall mission success, and simulation here, too, offers cost benefits. Simulated training involves breaking the aircraft into lower-cost, non-flightworthy training devices. Training this way increases student throughput, lowering per-student training costs and risk to aircraft.

Civil aviation offers a useful comparison: ten years ago, global civil helicopter accidents were increasing by 2.5% annually, with many occurring in conjunction with pilot training. This led to efforts to gain greater acceptance of simulation for pilot training. Since 2006, the number of helicopter accidents worldwide has decreased by 2% annually.

Considering the nature and scope of defence operations, the magnitude of risk and cost of not engaging in effective simulated training is staggering. The question becomes: what makes for effective simulated training?

TRU Simulation + Training – and the industry – are using innovation to answer this question. Full-motion flight simulators, maintenance training devices, state-of-the-art courseware and training programmes that deliver an affordable, more true-to-life experience significantly improve aircrew safety and proficiency. Upset Prevention and Recovery Training now prepares pilots for worst-case scenarios. Live Virtual Constructive Training can now effectively simulate all possible weather conditions, emergency and aircraft procedures, air traffic and crew communications, mission sets, and realities to produce the best flight teams in the world – hard to place a price tag on that.

TRU ODYSSEY H Visual Bowl Interior. (Photo: TRU Simulation)
------------------------------------------------------------------

VT MÄK 

Dan Schimmel, CEO, VT MÄK (Photo: VT MÄK)
The current global military climate features a dizzying array of complex challenges for our industry’s supply chain. Changing alliances, new threats, demanding requirements, and enticing technologies all need to be balanced in a daunting spending environment where customers want both better and cheaper solutions. Innovation is critical to meeting that challenge, and innovation is equally vital to VT MÄK’s business. It has been a driving force of our success since our founding 25 years ago as a simulation industry pioneer.

VT MÄK’s core business is to supply most of the world’s defence contractors, training and simulation companies, and system integrators with a broad and compelling suite of COTS solutions for simulation, visualisation and networking applications. Because of the breadth of our offerings, we enjoy a valuable industry perspective on what new military and security innovations are required nationally, regionally and globally.

This breadth has several important dimensions. First, with 50% of our business coming from outside North America, geographic breadth contributes to our wide perspective on changing global customer requirements. Second, force and domain breadth plays a key role. VR-Forces, MÄK’s flagship simulation software, spans all military domains – land, air, naval, unmanned systems and joint operations – covering a wide swath of strategic and tactical needs in live, virtual and constructive training. As budgets and customer requirements shift among these domains, our product roadmap matches them in lockstep. Recent innovations illustrating this include more electronic warfare features, game-like visual realism, more UAV sensor enhancements, web and mobile access, and more solutions for command staff and homeland security training. All these improvements are delivered in an open, flexible platform. Finally, MÄK’s programmatic breadth greatly assists us in developing new innovations. We frequently work with customers in the delicate embryonic stages of their planning and prototyping. We help market leaders test the viability of new systems, subsystems, weapons, vehicles, aircraft, UAVs and the like, long before they are sold, built and deployed. At this early stage of the product lifecycle, the key innovation skills revolve around our people – what we have always called “the engineer down the hall.” MÄK’s engineers must listen, brainstorm, take criticism and collaborate closely with customers’ technical teams to meet every hard innovation challenge.


------------------------------------------------------------------

For more information please see MILITARY TECHNOLOGY #12/2015, available at the show on booth #453; and frequently check back for more NEWS FROM THE FLOOR.

29 November 2015

I/ITSEC 2015: Visualisation Systems – Current Technology and Applications Under Review

Modern visualisation systems can produce almost anything that is required; the trick is how to display it. Ian Strachan looks at the technology and how it contributes to military training.


In the past, for visualising the battlefield or preparing for detailed action, there were maps, models, photographs, and live rehearsals carried out over similar terrain. Perhaps their most significant use was in June 1944, when hundreds of thousands of soldiers, sailors and airmen were trained and briefed before the D-Day landings in Normandy. How would we use visualisation systems in such military operations today? In addition to maps and photography, the key word is simulation, which has almost entirely replaced the construction of models and is more versatile. Critical factors such as different weather conditions and likely enemy action can now be simulated before the operation, and images can easily be switched from day to night.

First-generation simulator visual systems consisted of “model boards,” over which small cameras travelled at heights and speeds appropriate to the vehicle being simulated. When models were detailed enough and high-resolution colour cameras were used, these were very effective, and such systems had their devotees well into the era of computer-generated imagery. In the Soviet era, targets for NATO’s long-range ground attack aircraft were far into Warsaw Pact territory, and some very large model boards were constructed so that low-level “war routes” could be practised. But in the light of computing developments, model boards for simulator visuals have largely fallen out of use.

The first computer-generated imagery (CGI) used in simulators of the 1970s was crude and of little use for tactical training. However, during the 1980s the application of Moore’s Law resulted in imagery that was of genuine tactical use, and now we see CGI that approaches the real world in fidelity. One of the breakthroughs that made this possible was the use of “texture” within the polygons that make up the computer-generated scene. The first textures allowed a reticulated pattern to be applied to each polygon instead of an otherwise plain surface. This reduced the need for more and more polygons, particularly in a rapidly moving scene such as in an aircraft simulator. The extra points of contrast provided by texture patterns increased the magnitude of the “picture flow” or “change of perspective” cue that allows the user to sense both height and speed. Increases in computing power allowed more polygons in the scene, and texture technology quickly improved with more realistic patterns, for instance simulating grassland, cornfields and the like. Finally, “photographic texture” was developed. This allowed small but real images to be inserted into each polygon without the need to re-process the image each time, the “texture map” simply being inserted into a polygon as a single element after it had been created the first time. A large, visually repetitive area such as field patterns, woodland, rough sea, or a skyscraper can be created using very few polygons and one or two photo texture maps that are repeated over and over again.

Turning to night imagery, vision devices include light intensifiers such as NVGs, which work in the near infrared at a wavelength of about one micron, and the more expensive passive FLIR, which depends only on thermal contrast within the scene and works on the blackest of nights. Both NVGs and FLIR are easy to simulate by simply adding monochrome colours and appropriate texture to each polygon. In the case of NVGs, monochrome green is normally used. With FLIR, the thermal image can be presented as either “white hot” or “black hot”: white-hot may be the obvious presentation, but black-hot may produce a picture that the viewer considers more realistic. In either case, the picture can be changed from white- to black-hot at the touch of a switch.
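The polygon-saving effect of repeated texture maps described above can be illustrated with a toy example. This sketch (numpy assumed; the tile and scene sizes are made up) shows how one small stored patch yields a large detailed-looking surface:

```python
import numpy as np

# A single small "photo texture" tile, e.g. an 8x8 patch of grassland,
# created once and then reused across the scene.
tile = np.random.default_rng(0).random((8, 8))

# Tiling it over a large ground polygon gives a detailed-looking surface
# from one stored patch, instead of modelling the detail with many polygons.
ground = np.tile(tile, (64, 64))   # a 512x512 textured surface from one 8x8 tile

# Stored data versus rendered detail: 64 texels stored, 262,144 texels displayed.
stored_texels = tile.size
rendered_texels = ground.size
```

The same principle lets field patterns, woodland or rough sea be covered with a handful of texture maps rather than thousands of individually modelled polygons.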

Techniques like these have enabled the incredibly realistic computer-generated scenes we see today. Both real and artificial worlds can be created, but for military training there is no substitute for imagery of the real world. Since the corners of each polygon are defined as three-dimensional (x, y, z) co-ordinates, the stored database is three-dimensional and can be called up to display scenes viewed from any angle. Up-to-date photographic and mapping data can be semi-automatically transformed into computer-generated imagery, and in areas without direct or aerial photography, imagery and terrain data from satellites can be used. The results today are stunning. To the real-world visual scene can be added other images needed for training, such as vehicles, artillery, ships, aircraft, and personnel. It is also possible to combine virtual and real worlds in what is called mixed- or hybrid-reality, where physical and virtual objects co-exist and interact in real time. So in terms of generating scenes for visualisation, systems are now in place to produce whatever is required. The trick is how to display the imagery.
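Rendering a stored (x, y, z) database from any angle comes down to translating each vertex relative to the chosen eye-point and applying a perspective divide. A minimal pinhole-projection sketch, with illustrative coordinates:

```python
# Minimal sketch of viewing a 3D vertex database from an arbitrary
# eye-point: translate relative to the eye, then divide by depth.

def project(vertex, eye, focal=1.0):
    """Project a 3D vertex to 2D screen coordinates for a given eye-point.
    The eye looks along the +z axis; focal is the image-plane distance."""
    x, y, z = (v - e for v, e in zip(vertex, eye))
    if z <= 0:
        return None  # behind the viewer -- culled
    return (focal * x / z, focal * y / z)

# The same polygon corner viewed from two different eye-points:
corner = (10.0, 5.0, 50.0)
print(project(corner, eye=(0.0, 0.0, 0.0)))   # (0.2, 0.1)
print(project(corner, eye=(5.0, 0.0, 25.0)))  # (0.2, 0.2)
```

The same stored corner lands at different screen positions for different eye-points, which is exactly what lets one database serve every viewer in an exercise.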


State of the Art 3D Models for Real-Time Visual Systems

The demand for Serious Gaming-based visual system databases and photorealistic 3D content for professional training simulators requires a competent and reliable partner. RUAG Defence Simulation and Training, with its extensive knowledge in simulator construction, is now offering its know-how in creating 3D content and databases as a separate service and integrative solution for simulators.


Realistic, high-quality 3D real-time visualisation has become an important element of significant value in professional training and education. The learning effect for trainees and their adaptation to the virtual environment improve when they are not distracted by seemingly unrealistic influences. Integrating serious gaming technology into existing simulators does not necessarily require replacing the entire visual system: RUAG 3D content models can be cost-effectively integrated into many of the common 3D real-time visual systems, such as VEGA PRIME, VBS2, VR VANTAGE, or OpenSceneGraph.

RUAG Defence creates terrain databases covering all common terrain types, such as mountains, hillsides, flat ground, and urban or desert areas. Depending on the training purpose and situation, the database can be built from an existing terrain or designed to the customer’s requirements. The requirements for a database, and its size, can change significantly depending on whether it is needed for a flight, driving, infantry or ATC simulator.

Static 3D content such as bridges, tunnels and streets, including traffic signs, as well as buildings and vegetation, are integral parts of the terrain databases. Any 3D content within a terrain database is optimised to guarantee real-time performance of the visual system.


Display Systems

There is enormous variation in how imagery can be displayed, and the saying “you pays your money and you takes your choice” applies. The choice includes TV monitors, projected displays, head-mounted systems, and distant-focus or “collimated” displays. Projected displays vary from small areas using one projector to partial and full domes with many projectors. There are also so-called Cave Automatic Virtual Environment (CAVE) displays, in which the subject is surrounded by large screens left, right, centre and above, giving close to total immersion in the visual scene. The subject or subjects stand and move in the CAVE and use trackers and sensors to manipulate the visual scene. In large-area displays, most projectors are used for terrain and objects on it, but extra “target projectors” can be used for specific aircraft or ground targets. Both forward- and back-projection can be used, and a dome can have an array of between 10 and 20 projectors. An example of back-projection is the SimuSphere display system by Link USA, in which a pilot’s cockpit is surrounded by a number of flat “facets” on which outside-world imagery is back-projected. The field of view depends on the number of facets and can be up to 360°, the equivalent of a dome.


Systems like this work well for single-pilot simulators, such as those for fighter aircraft, because the visual perspective can be optimised for one pilot’s eye-point. However, where two crew are seated side-by-side, a directly projected display cannot present the correct perspective to both. If the eye-point for the visual display is chosen to give the correct perspective for one crew member, the other will see some objects in the scene at incorrect angles. For game-type simulation this may not matter, but if the simulator is used for critical tasks such as landing an aircraft or tracking a target, such errors need to be corrected.

To eliminate these errors in simulators for large transport aircraft and multi-crew helicopters, the Cross-Cockpit Collimated Display (CCCD) system was developed. Here, the two pilots view the outside-world imagery in a large curved mirror rather than on a screen. A screen carrying the outside-world imagery sits above the pilots’ compartment and its image is reflected in the mirror, which is what the pilots see. The screen and mirror are of wide horizontal extent, typically between 150 and 220 degrees. The secret that enables an undistorted view from both crew seats is a small vertical curvature in the mirror, which places the image at a distant focus. The mirror may be 2-3m in front of the crew, but the perceived focus of the image can be 100m or more, depending on the amount of vertical curvature. This allows both crew to see the scene with the correct perspective, with distant objects at the correct angles from both pilots’ seats. The word “collimated” is used for such a display, derived from “co-linear,” implying parallel lines or infinity focus.

Mirror surfaces in CCCDs use lightweight materials such as mylar instead of heavy glass, and must be rigid enough to be compatible with the 6-axis motion systems required in civil Full Flight Simulators (FFS). Military transport aircraft and multi-pilot helicopters usually use a similar simulator design, but because of their more complex roles compared with civil airliners, training on the aircraft backs up the FFS. Multi-pilot helicopter simulators may also have, in addition to the main display, lower “chin windows” that give the downward view needed for hovering. It is developments such as these, combined with high-resolution imagery and well-matched motion, that have allowed virtually all civil airline training worldwide to take place in an FFS rather than on the aircraft itself, with immense financial savings and less wear and tear on these expensive aircraft.
The next time you are in an airliner, it is possible that the landing may be the first on that type by the pilot. However, he or she will have just completed intensive training on a FFS and will be supervised by a Training Captain in the other seat for several passenger flights until allowed to carry on without supervision.
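The trick behind the collimated display can be illustrated with the standard mirror equation, 1/f = 1/d_o + 1/d_i. The numbers below are illustrative, not taken from any particular simulator:

```python
# Illustrative use of the mirror equation 1/f = 1/d_o + 1/d_i to show how a
# slightly curved mirror a few metres from the crew can place the perceived
# image at a distant focus.

def image_distance(focal_length, object_distance):
    """Solve the mirror equation for the image distance d_i (metres).
    A negative result means a virtual image behind the mirror."""
    return 1.0 / (1.0 / focal_length - 1.0 / object_distance)

# Screen (the object) 2.5m from a mirror of 2.6m focal length:
d_i = image_distance(2.6, 2.5)
print(round(d_i))  # -65: a virtual image perceived 65m away
```

With the screen just inside the focal length, a tiny change in curvature moves the perceived image from a few metres to effectively infinity, which is why both pilots see distant objects at the correct angles.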


Augmented Reality Update – A Significant Impact in the Battlespace

Augmented Reality (AR) involves projecting computer generated information onto the user's view of the real world. The last year has witnessed a number of significant developments in the military's exploration of the technology, both for training and operational uses.

AR has been routinely used in military aircraft for decades in Head-Up Displays (HUDs) and more recently in Helmet-Mounted Displays (HMDs). From weapon aiming symbols to artificial horizons to velocity vector bugs and obstacle warnings, computer generated information overlaid on the real world is a proven means of making military aircraft safer and more effective in combat. Inevitably, AR is moving into ships and ground vehicles, while powerful handheld and wearable computers combined with innovations in lightweight wearable displays are making inroads into the world of the dismounted soldier, which can be the most stressful, confusing and lethal of all. In all cases, both operational and training focused systems are under development. In all of these environments, the AR system must enhance situational awareness (SA) and avoid information overload.
AR's effectiveness depends on many factors, including the timeliness of the information, robustness of communications networks that provide input from multiple external sources, clear, comprehensible symbols accurately registered with the real world view, zero or near zero latency, accurate tracking of the user's movements and the quality of the display.


Training Maintainers, JTACs

The US Navy is continuing to evaluate AR for training and assisting maintainers aboard ship through an initiative led by Lt. Josh Steinman that secured US$100,000 from the Chief of Naval Operations Rapid Innovation Cell (CRIC). The team used Google Glass AR glasses and developed smartphone applications for equipment maintenance that incorporated manuals and videos of real maintenance procedures. The system is not tied to the Google hardware, which is being discontinued.

Due for completion later this year, the five-year Augmented Immersive Team Trainer (AITT) programme, conducted under the auspices of the US Office of Naval Research (ONR), is intended to provide a “live simulated” training tool for ground-based fire support teams such as artillery observers, Joint Terminal Attack Controllers (JTAC) and Forward Air Controllers (FAC): a system that can turn any environment into a training range. AITT took a step closer to fruition on 21 May, when Marines used it on a Quantico golf course on which only they could see computer-generated tanks, mortar fire and battlefield smoke.

Tracking users' head movements is more challenging in outdoor environments, particularly ones without pre-surveyed datum points. According to ONR, advanced software algorithms and multiple sensors enable AITT to determine the user's viewpoint accurately, while virtual aircraft, targets and munitions effects are inserted into the real view via the head-worn display. An enhanced instructor station drives training content, while performance assessment, scenario generation and scenario adaptation strategies are rooted in the latest scientific research, says ONR.

Combined inputs from video cameras, inertial measurement units, GPS receivers, magnetometers and air pressure sensors track the user's head movements. Virtual elements are then added to the real-world scene viewed through the headset and through simulated tactical equipment, including binoculars and the Vector 21B laser rangefinder.
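One common way such sensor inputs are combined is a complementary filter: a fast-but-drifting rate sensor is blended with a slow-but-stable absolute reference. This is a minimal sketch of the general technique, not the AITT algorithm, and all values are illustrative:

```python
# Minimal complementary filter sketch: blend an integrated gyro rate
# (responsive, but drifts) with a magnetometer heading (noisy, but stable).

def complementary_heading(heading, gyro_rate, mag_heading, dt, alpha=0.98):
    """One filter step. heading/mag_heading in degrees, gyro_rate in deg/s."""
    predicted = heading + gyro_rate * dt          # fast gyro prediction
    return alpha * predicted + (1.0 - alpha) * mag_heading  # slow correction

heading = 90.0
# Simulate 1 second: gyro biased +0.5 deg/s, magnetometer steady at 90 deg.
for _ in range(100):
    heading = complementary_heading(heading, 0.5, 90.0, dt=0.01)
print(round(heading, 2))  # the gyro bias is bounded, not accumulating
```

Pure gyro integration would have drifted by the full bias over time; the magnetometer term continually pulls the estimate back, which is the essence of multi-sensor head tracking.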

A large scale demonstration at Quantico late this year is set to bring the programme to an end so that it can make the transition to the Marine Corps Program Manager for Training Systems for further testing and development.


BAE Systems, Augmenti Team for Vehicle AR

There is a more operational focus to an agreement announced on 15 February by BAE Systems Hägglunds and Norwegian AR specialists Augmenti. Their LoI covers technical cooperation on the development and implementation of AR into the CV90 and BvS10 vehicles, for upgrades to in-service platforms and implementation in new production vehicles for future projects. The LoI also covers the development and implementation of AR in a future Intelligent SA System (ISAS) for combat vehicles.

ISAS is intended to give vehicle crews better all-round vision by day and night, enhance their SA and combat effectiveness through the integration of information overlays, improve platform survivability and reduce crew workload.

The proposed ISAS solution includes multiple HD video and IR cameras with overlapping fields of view positioned around the vehicle, an HMD for the vehicle commander and driver, with peripheral devices such as tablets for the rest of the crew showing camera imagery and AR overlays. According to Michael Karlsson, an AR researcher at Sweden's Umea University, ISAS represents a particularly demanding kind of solution as the system has to track the commander's and driver's head movements independently, present the appropriate sections of the camera imagery to their HMDs and present appropriate, geo-registered symbology to each crew member, who are likely to have different priorities.

Augmenti is building a track record in military AR, having integrated it into Kongsberg's Protector Nordic, a variant of the market leading RWS developed for the Norwegian and Swedish armed forces. Video of a test conducted in March shows a laser rangefinder used to point out targets that are then sent to a BMS for intelligence about them to be added, after which AR symbols appear on the RWS operator's screen. The symbols used are NATO standard ones, first for an unknown contact and then for hostile infantry.

Hardware-Agnostic ARC4

Applied Research Associates (ARA) seems to have a well-developed product in its ARC4 software and is looking to partner with see-through display and mobile computing manufacturers to develop the system for a variety of military, government and commercial applications. ARC4 emerged from DARPA's Ultra-Vis programme, an effort to develop head-up AR for dismounted soldiers on which ARA was prime contractor, and has been tested with BAE Systems' Q-WARRIOR optical waveguide display and devices from Lumus and Vuzix as well as the Exelis (now Harris) Enhanced Night Vision Goggle (ENVG).

The core of the visual interface is a ring set low in the field of view that shows users their position and heading and the relative positions of objects of interest around them. Those objects also have icons overlaid on their real-world positions, which the soldier can interrogate for further information by looking at them. He or she can also add markers to new objects, which can then be shared with team mates' ARC4 displays over the network.
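Placing an object on such a heading ring comes down to computing its bearing relative to the direction the user is facing. The sketch below is an illustrative calculation, not ARA's implementation, using a flat local east/north coordinate frame:

```python
import math

# Illustrative calculation of where an object-of-interest icon sits on a
# heading ring: its bearing relative to the direction the user is facing.

def relative_bearing(user_xy, user_heading_deg, object_xy):
    """Bearing to object, in degrees clockwise from the user's heading.
    Coordinates are local (east, north) metres; heading 0 = north."""
    dx = object_xy[0] - user_xy[0]  # east offset
    dy = object_xy[1] - user_xy[1]  # north offset
    absolute = math.degrees(math.atan2(dx, dy)) % 360.0
    return (absolute - user_heading_deg) % 360.0

# User faces north; an object due east sits at 90 degrees on the ring:
print(relative_bearing((0.0, 0.0), 0.0, (100.0, 0.0)))
# Same object with the user facing east sits dead ahead (0 degrees):
print(relative_bearing((0.0, 0.0), 90.0, (100.0, 0.0)))
```

Note the `atan2(dx, dy)` argument order: with east as the first argument, zero bearing points north and angles grow clockwise, matching compass convention rather than the mathematical one.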

With the kudos of a successful DARPA programme behind it, slick performance and the ability to work with any mobile computing and display platforms, ARC4 seems set to have a significant impact in the land warfare domain.

With progress in computing, stored databases can now be very large, the appropriate element being called up for display at any one time and then returned to store when the trainee’s eye-point has moved on. Where large-area imagery has to be produced, such as for fast-jet flight simulators, a technique called “level of detail scheduling” is used to avoid processing unnecessary data for display. Here, distant objects and terrain are deliberately extracted from store and displayed at low resolution, the resolution automatically increasing as they get closer to the subject.
If this is done properly, the differing levels of detail in the overall scene are not discernible to the user, but with simpler systems discontinuities such as “feature popping” can occur, where features suddenly appear in the scene as they get closer instead of growing gradually in size.

Because each polygon corner is plotted as (x, y, z) co-ordinates, the stored data is three-dimensional, and this can be exploited in a number of ways. In a simulator, the crew is presented with imagery from an “eye-point” from which the scene is displayed with the correct perspective. However, at the instructor operating station (IOS) or exercise control (ExCon), different eye-points can be selected during the exercise and afterwards for debrief. The view from more than one eye-point can be shown, such as those from the various entities in the exercise. A so-called “God’s Eye View” is one in which the whole database is viewed from above, so that instructors and umpires can visualise tactical activities as they develop and introduce opposing forces, electronic warfare and so forth as required. Nor are exercises limited to one site: wide-area network (WAN) links can span many thousands of kilometres, and multi-Service and multi-national exercises can be carried out after appropriate preparation. Visual presentations at ExCon can include video from crew stations, maps, montages of the tactical situation, in fact anything that might be required during and after the exercise for analysis and debrief.
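The level-of-detail scheduling described above can be sketched as a simple distance-threshold lookup. Thresholds and level names here are illustrative:

```python
# Simple sketch of level-of-detail (LOD) scheduling: the renderer fetches a
# lower-resolution version of an object the further it is from the eye-point.

LOD_THRESHOLDS = [            # (max distance in metres, detail level)
    (500.0, "high"),
    (2000.0, "medium"),
    (10000.0, "low"),
]

def select_lod(distance_m):
    """Return the detail level to fetch from store for a given distance."""
    for max_dist, level in LOD_THRESHOLDS:
        if distance_m <= max_dist:
            return level
    return "cull"  # beyond the far clip -- not processed at all

print([select_lod(d) for d in (100.0, 1500.0, 9000.0, 50000.0)])
# ['high', 'medium', 'low', 'cull']
```

Real systems blend between levels as an object crosses a threshold; a hard switch like this one is exactly what produces the "feature popping" visible in simpler systems.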

The versatility of computer-generated imagery is truly amazing. In fighter aircraft simulators, when the simulator computer senses that the pilot is pulling high G, the field of view of the outside-world display can be automatically contracted to show the “tunnel vision” that in the real world precedes the highly dangerous G-induced loss of consciousness (G-LOC), so that fighter pilots can be prepared for it beforehand.

So, visualisation systems used in modern simulators are very effective, and in the military there has been a major change in attitude to training by simulation. For example, in the last UK defence review one overall aim was to achieve about a 50:50 balance between training by simulation and training using the real equipment. This 50:50 figure is not untypical of training today in many areas of the military, particularly in aircraft.

3D Visualisation Systems

Geoscience visualisation is a fast growing area, and visualisation systems enable geoscientists to communicate with each other and with end users from diverse disciplines to better understand complex and varied datasets.

3D visualisation provides a mechanism for communication. For example, BGS uses the latest software and hardware to visualise geoscience data in 3D and so communicate BGS science effectively. By using dedicated 3D visualisation facilities to run software such as GeoVisionary, geological understanding and risk/confidence are more easily conveyed. The 3DVS team has been involved in a number of high-profile projects, ranging from communicating the geological confidence of storing radioactive waste to visualising shale gas/oil rocks and their proximity to aquifers.

BGS has developed GeoVisionary in partnership with Virtalis. GeoVisionary is a geoscientific information system for visualisation and interpretation of geoscience datasets in a virtual reality environment.

GeoVisionary offers the ability to visualise all possible elements together in a single, immersive 3D stereoscopic environment, as well as on desktop PCs and laptops. Its powerful graphics rendering engine gives seamless, real-time access to the entire data resource. BGS has created an add-in for ArcGIS that links the GIS with GeoVisionary, connecting traditional GIS with the 3D virtual landscape. GeoVisionary also provides simultaneous high-resolution 3D visualisation of city models and geoscientific models.

Virtalis MaxExchange software (a plug-in for Autodesk 3DS Max) allows CAD models to be easily imported into GeoVisionary.

For added realism in GeoVisionary projects, BGS can incorporate simple animations created in 3DS Max, such as flying aircraft or vehicles moving along roads.

Visualisation Systems in the Real World

Turning now to real military hardware, visualisation systems are now providing vital additional information. For instance, imagery on an aircraft Head Up Display (HUD) is a form of visualisation, where basic features such as attitude, airspeed and altitude can be added to weapon state and target data, including the optimum flight path to engage a target or evade a threat. First-generation symbology was basic, but now almost anything can be added, including outside-world pictures such as from night vision devices. Night low flying can now take place without the need for complex and expensive terrain-following radar (TFR), which may give away aircraft position to an enemy. Miniaturisation has led to Helmet-Mounted Display (HMD) systems that can enable the helmet of an aircraft pilot, tank commander, or soldier to be just as capable as a separate display unit, the display still being visible when the user scans left, right, and upwards.

Symbology and imagery can match the role and the challenge is to filter information so that what is displayed is relevant to the task rather than saturating the display with non-essential data. As the phase of an operation changes, the displayed data can be changed, then changed again, to what is relevant. The BAE Systems STRIKER series of HMDs are examples, fitted to some Eurofighter TYPHOONs and Saab GRIPENs. The STRIKER II includes night vision cameras so that separate NVGs do not have to be worn. The challenge with HMD systems is to lower weight and rotational inertia, to reduce loads on the pilot’s neck under high G loadings or when scanning rapidly. BAE Systems is currently working with the University of Birmingham in the UK to develop lightweight systems. Certainly, some sort of light eyeglasses may be possible and some people are even forecasting contact-lenses with an imaging capability.

A major problem in aircraft operation is returning to base to find low cloud, poor visibility, or both. To a certain extent FLIR helps, because a sensor working in the far-IR at wavelengths near 10µm can penetrate poor visibility and even a small amount of cloud. However, a much better picture can be produced for pilots by combining GPS position with stored synthetic imagery of the local terrain, called up when required and matched to the GPS latitude and longitude as they change with time. Accurate aircraft altitude is required for such a system to be safe; a combination of pressure and GPS altitude plus a good model of local terrain and obstructions can provide it. Synthetic imagery can be displayed either on a HUD or on an HMD. Such systems have been trialled, but there are obvious problems in certification for live use in landings, particularly in the Commercial Air Transport sector. GPS co-ordinates must be as accurate as possible, so a Satellite-Based Augmentation System (SBAS) for the area should be used, such as BeiDou 1 (China), EGNOS (Europe), GAGAN (India), MSAS and QZSS (Japan), or WAAS (North America). Clearly such synthetic visualisation systems can be used for an approach down to altitudes of 100, even 50m, but what if the real runway still does not appear through the synthetic picture? Will they ever be cleared for the landing itself, or for taxying in fog? With such a system, taxying is likely to be more hazardous than landing, because Air Traffic Control will (probably) be able to guarantee that the landing runway is clear of other aircraft, but taxiways and dispersal areas are another matter entirely!
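The pressure/GPS altitude combination mentioned above can be sketched as a simple blend: barometric altitude is smooth but can carry a stale pressure-setting offset, while GPS/SBAS altitude is absolute but noisier. The weighting and numbers below are illustrative, not from any certified system:

```python
# Hedged sketch of blending pressure altitude (smooth, but offset by a
# stale altimeter setting) with GPS/SBAS altitude (absolute, but noisier).

def blend_altitude(baro_alt_m, gps_alt_m, gps_weight=0.1):
    """One update step: nudge the barometric estimate toward the GPS value."""
    return (1.0 - gps_weight) * baro_alt_m + gps_weight * gps_alt_m

# Baro reads 30m high because of a stale pressure setting; repeated
# updates pull the estimate toward the GPS/SBAS altitude of 500m:
estimate = 530.0
for _ in range(50):
    estimate = blend_altitude(estimate, 500.0)
print(round(estimate, 1))  # converges toward 500m
```

A real avionics implementation would weight the sources by their estimated error and cross-check against the terrain model, but the principle of correcting a drifting reference with an absolute one is the same.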

This update on visualisation systems has covered some of their technology and current applications. CGI is now available at near real-world resolution, but that is only the start: the question is not so much the imagery as how it is displayed. Displays vary from simple TV screens, through forward- and back-projected displays, to those at a distant focus. The distant-focus systems apply to large and capable FFS and, at a smaller size, to aircraft HUDs and HMDs. Then there is the use of visualisation systems in real vehicles rather than simulators, for night vision, targeting, or as a landing aid in poor visibility. Overall, modern visualisation systems are in widespread use, are very capable, and contribute in a major way to both civil and military training.

Ian W. Strachan is an expert on simulation and training and a regular contributor to MT.
Peter Donaldson, with 25 years of experience as a journalist and writer covering aerospace and defence technology and operations, is a regular contributor to MT.