Friday, June 29, 2012

Researchers Accomplish GPS Spoofing of Domestic UAS

It appears the theory that the Iranians used GPS spoofing techniques to crash and acquire a U.S. RQ-170 Sentinel UAS (Rawnsley, 2011) just received some additional support. A team of students from the University of Texas at Austin, with the cooperation of the Department of Homeland Security, successfully gained control of a small rotorcraft UAS using spoofing (Lecher, 2012). This coordination between researchers and the Government exposes an operational design flaw that will need to be resolved before the widespread integration of UAS into the domestic airspace.

Update 30 June 2012: BAE has announced the development of a new navigation system, Navigation via Signals of Opportunity (NAVSOP), a radio positioning approach intended to supplement existing GPS and mitigate potential spoofing efforts (Holloway, 2012). NAVSOP will rely on existing radio infrastructure and be usable in locations where GPS currently is not (e.g., polar Earth positions and inside buildings; Holloway, 2012).

Update 6 July 2012: This experiment is reported to be the first demonstration that a UAS can be spoofed into changing its position through its automated reactions (Franceschi-Bicchierai, 2012).

REFERENCES
Franceschi-Bicchierai, L. (2012, July). Drone hijacking? That's just the start of GPS troubles. Wired. Retrieved from http://www.wired.com/dangerroom/2012/07/drone-hijacking/

Holloway, J. (2012, June). BAE takes on GPS with NAVSOP radio positioning system. Gizmag.com. Retrieved from http://www.gizmag.com/bae-navsop-radio-positioning/23137/

Lecher, C. (2012, June). Texas students hijack a U.S. Government drone in midair. Popular Science. Retrieved from http://www.popsci.com/technology/article/2012-06/researchers-hack-government-drone-1000-parts

Rawnsley, A. (2011, December). Iran's alleged drone hack: Tough, but possible. Wired. Retrieved from http://www.wired.com/dangerroom/2011/12/iran-drone-hack-gps/

Thursday, June 28, 2012

Emergence of Degrees In Unmanned Aircraft Specialization

The appearance of new degree specializations in unmanned aircraft systems corresponds with recently announced growth forecasts for unmanned markets (see related blog post; Sanford, 2012; Zaloga & Rockwell, 2011). Liberty University's School of Aeronautics just joined the ranks of Kansas State University Salina, the University of North Dakota, Embry-Riddle Aeronautical University, and Northwestern Michigan College (Williams, 2012). The development of these programs by universities with ties to industry bodes well for future research and career opportunities.

My alma mater, Embry-Riddle Aeronautical University, started its unmanned program as a minor in the fall of 2010 (Embry-Riddle Aeronautical University, 2010, 2012b) and is preparing to offer its first related B.S. major in Unmanned Aircraft Systems Science this fall (Embry-Riddle Aeronautical University, 2012a). The program will feature two specialization concentrations: a pilot and a non-pilot track (Embry-Riddle Aeronautical University, 2012a). The pilot track will focus on training students to meet Federal Aviation Administration (FAA) operational requirements for public and restricted airspace (Embry-Riddle Aeronautical University, 2012a). The non-pilot track will focus on support roles, such as sensor or payload operation (Embry-Riddle Aeronautical University, 2012a). At this time, the degree is only available to U.S. citizens to meet International Traffic in Arms Regulation (ITAR) rulings (Embry-Riddle Aeronautical University, 2012a).

18 July Update: Hinds Community College in Raymond, MS, has announced four courses for UAS pilot training starting this August (College to offer unmanned aerial vehicles training, 2012).

REFERENCES

College to offer unmanned aerial vehicles training. (2012, July). Fox40tv.com. Retrieved from http://www.fox40tv.com/news/local/story/College-to-offer-unmanned-aerial-vehicles-training/fS7Sz9AAT0OY11_PvXP5Cw.cspx

Embry-Riddle Aeronautical University. (2010, February). Embry-Riddle to train unmanned aircraft pilots [News release]. Daytona Beach, FL: Embry-Riddle Aeronautical University. Retrieved from http://www.erau.edu/er/newsmedia/newsreleases/2010/uasminor.html

Embry-Riddle Aeronautical University. (2012a). Bachelor of Science in Unmanned Aircraft Systems Science [Program Description]. Daytona Beach, FL: Embry-Riddle Aeronautical University. Retrieved from http://daytonabeach.erau.edu/coa/aeronautical-science/undergraduate-degree/unmanned-aircraft-systems-science/index.html

Embry-Riddle Aeronautical University. (2012b). Minor in Unmanned Aircraft Systems Applications [Program description]. Daytona Beach, FL: Embry-Riddle Aeronautical University. Retrieved from http://daytonabeach.erau.edu/degrees/minors/unmanned-aircraft-systems/index.html

Sanford, B. (2012, June). Unmanned aircraft systems market to hit $51 billion by 2018 says report. ReportBuyer. Retrieved from http://meshpress.com/unmanned-aircraft-systems-market-to-hit-51-billion-by-2018-says-report/123854.html

Williams, B.J. (2012, June). New unmanned aerial systems specialization jumpstarting careers. Lynchburg, VA: Liberty University. Retrieved from http://www.liberty.edu/news/index.cfm?PID=18495&MID=58003

Zaloga, S., & Rockwell, D. (2011). UAV market set for 10 years of growth. Earth Imaging Journal. Retrieved from http://eijournal.com/uncategorized/uav-market-set-for-10-years

Wednesday, June 27, 2012

Continued Growth of the Unmanned Aircraft Market



A recently released report, "Unmanned Aerial Systems (UAS): Market Shares, Strategy, and Forecasts, Worldwide, 2012 to 2018," indicates continued growth of the unmanned aircraft market through 2018 (Sanford, 2012). The market is expected to reach $51 billion, which aligns with past forecasts that annual U.S. market spending will grow over the next decade from $5.9 billion to $11.3 billion based on military needs (Zaloga & Rockwell, 2011). The 910-page report can be purchased from Global Information, Inc. (2012) for $3,700 (single-user license) or $7,400 (website posting license), here. The report covers unmanned aerial systems (UAS) market descriptions, market dynamics, market shares, market forecasts, product descriptions, technology, and company profiles (WinterGreen Research, Inc., 2012). This forecast is exciting news for those interested in unmanned research and development. [Image retrieved from http://www.fas.org/man/dod-101/sys/ac/ucav_a1.jpg, copyright Federation of American Scientists, 2011].

REFERENCES
Federation of American Scientists. (2011). X-45 Unmanned Combat Air Vehicle (UCAV): Overview [Program description]. Retrieved from http://www.fas.org/programs/ssp/man/uswpns/air/attack/x-45_ucav.html 

Global Information, Inc. (2012). Market Research Report. Retrieved from http://www.giiresearch.com/report/wg245633-unmanned-aerial-systems-uas-market-shares-strategy.html
 
Sanford, B. (2012, June). Unmanned aircraft systems market to hit $51 billion by 2018 says report. ReportBuyer. Retrieved from http://meshpress.com/unmanned-aircraft-systems-market-to-hit-51-billion-by-2018-says-report/123854.html

WinterGreen Research, Inc. (2012). Unmanned Aerial Systems (UAS): Market Shares, Strategy, and Forecasts, Worldwide, 2012 to 2018.

Zaloga, S., & Rockwell, D. (2011). UAV market set for 10 years of growth. Earth Imaging Journal. Retrieved from http://eijournal.com/uncategorized/uav-market-set-for-10-years

Tuesday, June 26, 2012

Dogfighting SUAVs

I found an interesting article, Drone dogfights by 2015? U.S. Navy preps for futuristic combat, describing the use of small unmanned aerial vehicles (SUAVs), in a swarming mode, to determine effective methods of multiplying military advantage through combat interactions (Terdiman, 2012). The Naval Postgraduate School's Advanced Robotics Systems Engineering Lab (ARSENL), led by Timothy Chung, will be working to coordinate control and interaction of their Unicorn UAVs (Procerus Technologies, 2012). I'm curious to see how a platform like this R/C F-35 Lightning II 360-degree thrust-vectoring jet (due out in late August; Hobby-Lobby, 2012b) would stack up in aerial dogfighting. I suspect that such a system could provide significant performance gains over the Unicorn aircraft. Take a look at this video featuring the R/C JSF in flight (specifically at the 0:35 and 1:18 marks; Hobby-Lobby, 2012a). Thrust vectoring could provide additional benefits outside of dogfighting, including simplified handling/control, low-speed target tracking/following, and increased environmental reaction capability.

REFERENCES
Hobby-Lobby (Producer). (2012a). F-35 jet with thrust vectoring! [Video]. Available from http://www.youtube.com/watch?v=a-4ZA5J3ZGs&feature=player_embedded

Hobby-Lobby. (2012b). R/C F-35 Lightning II 360 degree thrust vectoring jet [Product description]. Retrieved from http://www.hobby-lobby.com/f_35_lightning_360_degree_thrust_vectoring_jet.htm

Procerus Technologies. (2012). UAV test platform [Product description]. Retrieved from http://www.procerusuav.com/productsZagiTestAirframe.php

Terdiman, D. (2012, June). Drone dogfights by 2015? U.S. Navy preps for futuristic combat. CNET.com. Retrieved from http://news.cnet.com/8301-13576_3-57457501-315/drone-dogfights-by-2015-u.s-navy-preps-for-futuristic-combat/

Monday, June 25, 2012

First Academic Citation of My Work

I stumbled upon the first academic citation of my work; it was made while I was performing data collection for my Ph.D. dissertation research. The author, Jefferson Romney, a student at Embry-Riddle Aeronautical University, contacted me for my professional opinion on using unmanned aircraft in an air superiority role. Specifically, he asked whether situational awareness was a major factor inhibiting the advancement of unmanned capabilities, and about the possibility of creating an unmanned vehicle capable of air superiority, platform effectiveness, identification of critical technology, the role of autonomy, and methods to make the ground control station effective for air-to-air operations. My responses and a reference to my dissertation work were featured in his report, Feasibility of using unmanned aerial vehicles as an air superiority platform (Romney, 2011). See the individual citations (Terwilliger, 2011) and the interview transcript at the end of the report for additional information.

REFERENCES
Romney, J. (2011). Feasibility of using unmanned aerial vehicles as an air superiority platform. Daytona Beach, FL: Embry-Riddle Aeronautical University. Retrieved from http://www.cosmicscribbler.com/eraU/Resources/FeasibilityUAV.pdf

Terwilliger, B. (2011). Examining effects of visual interaction methods on unmanned aircraft operator situational awareness [Unpublished doctoral dissertation proposal]. Prescott Valley, AZ: Northcentral University. Retrieved from https://sites.google.com/site/etprepository/repository/TerwilligerB_DP_2.pdf


Friday, June 22, 2012

Leap Forward in Battery Technology

The recent announcement that the intentional growth of "tin whiskers" (normally an unwanted, spontaneously occurring phenomenon; Sampson, Leidecker, Brusse, & Kim, 2009) in lithium-ion batteries could triple capacity represents a major step forward in battery technology (Coxworth, 2012). The implications of this announcement extend well beyond increasing the operational duration of our handheld electronics. The associated developmental growth could lead to the feasibility of commercial electric flight, significantly expanded operational range or duration for electric/hybrid unmanned systems, expanded clean energy harvesting, and efficient electric mobility enhancement devices (a primary limitation of powered exoskeleton research has been power sourcing).

REFERENCES
Coxworth, B. (2012, June). "Tin whiskers" could triple the capacity of lithium-ion batteries. gizmag.com. Retrieved from http://www.gizmag.com/tin-whisker-battery-anode/22905/

Sampson, M., Leidecker, H., Brusse, J., & Kim, J. (2009). Basic info on Tin Whiskers. National Aeronautics and Space Administration. Retrieved from http://nepp.nasa.gov/whisker/background/index.htm

Thursday, June 21, 2012

About Situational Awareness

“Situational Awareness (SA) refers to the degree of accuracy by which one's perception of his current environment mirrors reality” (Naval Aviation Schools Command, 2012).

SA is a critical component of the effective decision-making process, containing the data necessary to comprehend an environment (Endsley, 1988). SA is composed of three components (i.e., levels): (a) perception, (b) comprehension, and (c) projection (Endsley, 1988). SA originates with the perception of environmental elements, using displays or personal senses (Endsley & Connors, 2008). If either of the first two components is diminished (i.e., perception or comprehension), the accuracy of the following component (i.e., comprehension or projection) will also be diminished.

SA reflects the amount of data a user has and is able to interpret at any given moment in time. Improving a user’s SA while operating a vehicle results in an overall improvement in performance, potentially decreases risk to the operational hardware, and increases operational efficiency. Limiting an operator’s SA through diminished sensory input results in a less accurate model of the operating environment. A less accurate environment model reflects reduced or incorrect SA, which increases the potential for mishaps or accidents (Hing, Sevcik, & Oh, 2010; Menda et al., 2011).

REFERENCES
Endsley, M.R. (1988). Situational awareness global assessment technique (SAGAT). Proceedings of the IEEE 1988 National Aerospace and Electronics Conference, Dayton, OH, 3, 789-795. doi:10.1109/NAECON.1988.195097

Endsley, M.R., & Connors, E.S. (2008, August 12). Situation awareness: State of the art. 2008 IEEE Power and Energy Society General Meeting - Conversion and Delivery of Electrical Energy in the 21st Century, Pittsburgh, PA, 1-4. doi:10.1109/PES.2008.4596937

Hing, J., Sevcik, K., & Oh, P. (2010). Development and evaluation of a chase view for UAV operations in cluttered environments. Journal of Intelligent and Robotic Systems, 57, 485-503.

Menda, J., Hing, J.T., Ayaz, H., Shewokis, P.A., Izzetoglu, K., Onaral, B., & Oh, P. (2011). Optical brain imaging to enhance UAV operator training, evaluation, and interface development. Journal of Intelligent and Robotic Systems, 61, 423-443. doi:10.1007/s10846-010-9507-7

Naval Aviation Schools Command. (2012). Situational awareness. Retrieved from https://www.netc.navy.mil/nascweb/crm/standmat/seven_skills/SA.htm



Tuesday, June 19, 2012

Unmanned Aircraft Testing in Florida

It appears Florida might be the next area for unmanned development and testing growth, with the announcement that the state intends to pursue selection in December as one of six potential testing sites. Read more in the following articles:

Dean, J. (2012, June). Florida hopes to fill its skies with unmanned aircraft. USA Today. Retrieved from http://www.freep.com/usatoday/article/55783066?odyssey=mod|newswell|text|FRONTPAGE|p

Tracy, D. (2012, June). Drones could soon be flying in Florida skies. OrlandoSentinel.com. Retrieved from http://articles.orlandosentinel.com/2012-06-03/news/os-unmanned-drones-florida-future-20120603_1_drone-industry-faa-law-space-florida

Examples of Custom Developed Applications/Systems

In support of my research activities I have developed several applications and systems using commercial off-the-shelf (COTS) and custom-developed components. For a *.pdf version of this page, click here...

The following is a brief overview of each of these  applications and systems.

Simulated Unmanned Operator Test Station 
The simulated operator station was a custom developed system for experimental testing to capture the SA of each participant. The appearance and function of this system approximated the look, feel, and interaction of a generic GCS.

The system hardware was composed of several components that facilitated test participant use of the experimental visual interaction methods (i.e., static eyepoint, analog joystick, head tracker, uninterrupted hat/POV switch, and incremental hat/POV switch). The cost of the system hardware was $875.


The simulated operator station software was composed of several components that facilitated test participant use of the experimental visual interaction methods (i.e., static eyepoint, analog joystick, head tracker, uninterrupted hat/POV switch, and incremental hat/POV switch). The primary software components were the head tracker/virtual joystick application and the visual SA test application, described below.



Head Tracker/Virtual Joystick Application -
This application was designed to capture and process inputs from a custom-built single IR LED head tracker. The software captured the head tracker input values and converted them to values that correlated to the analog X and Y axes of a joystick. The converted values were then packaged and output as a virtual joystick, recognizable for use by external programs. To simulate a virtual joystick, this component interfaced with the PPJoy application, which had developmental virtual COM joystick interfacing capability. The head tracker/virtual joystick application was an interface between the head tracker and the visual SA test application. This component also featured several capabilities not applicable to the experimental testing: the ability to display a heads-up display (HUD) and an interface for manual control.


The head tracker/virtual joystick application was designed to run on a Windows XP system, using the DirectX 9.0c, Wiimote, and PPJoy support libraries to accept input from a Nintendo Wii remote device over a Bluetooth connection established through the Toshiba Bluetooth stack and a USB Bluetooth adapter. Once the Wii remote was configured for operation and interfaced to the Windows PC, it was able to track the position of an IR LED within the field of view (FOV) of the Wii remote device camera (i.e., head tracker camera). The captured IR LED position was mapped to a two-dimensional area (X and Y axes) using the Wiimote library components and translated into a pitch (rotation about the head’s Y axis) and yaw (rotation about the head’s Z axis) value. The application had several unique features:
  • two-dimensional rotational angle calculations (using Pythagorean geometry)
  • tuning parameters (factor, response curve, and initial calibration)
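As a rough illustration of the angle calculation and tuning parameters described above, the following Python sketch maps a captured IR LED pixel position to head yaw/pitch angles. The 1024 x 768 sensor resolution matches the Wii remote's IR camera; the field-of-view values, function names, and tuning defaults are assumptions for illustration, not the original implementation.

```python
import math

# Wii remote IR camera reports positions in a 1024 x 768 space; the
# horizontal/vertical fields of view are assumed to be ~33 deg x ~23 deg.
SENSOR_W, SENSOR_H = 1024, 768
FOV_X_DEG, FOV_Y_DEG = 33.0, 23.0

def led_to_head_angles(px, py, cal_x=SENSOR_W / 2, cal_y=SENSOR_H / 2,
                       factor=1.0, curve=1.0):
    """Convert an IR LED pixel position to head yaw/pitch in degrees.

    cal_x/cal_y: calibration center captured at startup.
    factor:      linear sensitivity multiplier.
    curve:       response-curve exponent (>1 softens small motions).
    """
    # Normalized offsets from the calibrated center, roughly -1..1.
    nx = (px - cal_x) / (SENSOR_W / 2)
    ny = (py - cal_y) / (SENSOR_H / 2)
    # Apply the response curve while preserving sign, then scale to angles.
    yaw = math.copysign(abs(nx) ** curve, nx) * (FOV_X_DEG / 2) * factor
    pitch = math.copysign(abs(ny) ** curve, ny) * (FOV_Y_DEG / 2) * factor
    return yaw, pitch

# An LED at the calibrated center produces no head rotation.
print(led_to_head_angles(512, 384))   # (0.0, 0.0)
```

A real version would feed these angles into the virtual joystick axes exposed through PPJoy.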

Visual Situational Awareness (SA) Test Application -
This application was designed to capture and process user inputs for the control of the visual eyepoint and to display the egocentric visual environment, dynamic depiction of geometric objects, and movement of the visible screen area within the visible environment area. The inputs included the head tracker and joystick. There were two primary components to the visual SA test application: the controls/settings window and the interaction controls. Not all of the controls contained in the GUI were necessary for the experimental testing; additional controls were needed for debugging and component evaluation during application development.


The visual SA test application was designed to run on a Windows XP system with a video resolution of 800 x 600, using the DirectX 9.0c support library to accept input from joystick devices or through user interaction with the application GUI. The software application used the DirectX 9.0c library to connect to and read the state of the joystick, including axes position and button depression. The software was designed to run the SA experimental test and the associated alignment and familiarization activities.


Tactile Aural Simulated Kinetics (TASK) Rover and Ground Control Station (GCS)
The predecessor of my Ph.D. research to explore the impact of dynamic eyepoint manipulation originated from development and refinement of an earlier concept to replicate the operating environment of a custom built remotely operated vehicle (ROV). This conceptual research was termed Tactile Aural Simulated Kinetics (TASK). Eventually the lessons learned from the development, implementation, and use of the TASK research formed the basis of the recent research, including:
  • Mitigating the effects of spatial disorientation
  • Identification of dynamic eyepoint interaction techniques
  • Effect of visual interaction on operator SA
The purpose of TASK was to mitigate the factors that reduce situational awareness through the use of simulated motion/force feedback and aural/sound cues, increasing an operator’s performance and abilities. The incorporation of tactile and aural data was believed to provide an operator with additional means to interpret data through methods they are already familiar with and can react to faster. The TASK concept provided a means to effectively capture and provide the ROV operator with “live” simulated tactile feedback and aural data.



TIGER Vehicle - The Tactile Information Gathering Environmental Remote (TIGER) vehicle was controlled by the ground control system (GCS) and relayed captured data (visual, aural, and motion) back.


TASK GCS - User interface that simulated the “Look, feel and sound” of the remote vehicle operating environment. Provided control instructions and data presentation for the TIGER including visual orientation, aural playback (simulated sound) and vibration simulation.

For additional information regarding the TASK system, click this link.

Teleoperation Development Kit (TDK)
The TDK is a customizable/reconfigurable remote servo control system (hardware and software) that replaces the need to design and build the software/hardware of an unmanned vehicle control layer. TDK provides a control solution to unmanned vehicle developers or remote control (R/C) hobbyists with a control radius requirement of 5 miles or less.








Example platforms include:
  • Micro Unmanned Aerial Vehicle (MUAV)
  • Small Unmanned Aerial Vehicle (SUAV)
  • Unmanned Ground Vehicle (UGV)
  • R/C aircraft
  • R/C cars, trucks, and tanks
  • R/C boats
The TDK replaces the need to design and build the software and hardware control layer, providing a user with additional time and resources to focus on the development of the actual vehicle platform.
 
Use of the TDK system would:
  • allow individuals unfamiliar with software or hardware development to create an unmanned research, development, or test platform
  • facilitate rapid development of complex vehicle controls that provide higher fidelity/increased capabilities
  • provide flexibility through reconfigurable design (reuse for other platforms, activities, or requirements)
  • increase the accuracy of first person view (FPV) remote control systems
  • increase the fidelity of simulated R/C aircraft controls (i.e., use of actual flight sticks/yokes, throttle controls, pedals, and custom-developed user interfaces)
For additional information regarding the use and configuration of the TDK software, click this link.

Generic Rotorcraft Flight Dynamics Model (FDM)
This application was developed to create a generic rotorcraft FDM for comparative analysis purposes. It uses a USB joystick or on-screen (i.e., embedded) flight controls (cyclic, collective, pedals, and a three-stage throttle/engine control unit (ECU)) as user inputs to the model. Available airframes include the AS350 Squirrel, SA330 Puma, AH-64/D, EH101 Merlin, and CH-47 Chinook. However, new configurations/airframes can be added by editing a *.xml data file and adding values for the number of rotors, main rotor diameter, tail rotor diameter, total engine horsepower, maximum operating altitude, and the airframe weight. In addition, sling loads can be added to the model and their characteristics (i.e., weight and cable length) modified during operation. The flight dynamics calculations of the FDM take ground effect region, weights, rotor sizing, engine output, and sling loading into consideration. The following are the outputs of the FDM, available through a shared memory interface:
  • Vertical Speed
  • Altitude
  • Speed
  • Rotor Torque
  • Airframe Acceleration
  • Airframe Velocities
  • Airframe Rotations
  • Latitude/longitude position
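Adding a new airframe by editing the *.xml data file could look like the following Python sketch. The element names, parsing code, and AS350 values are all illustrative assumptions; the FDM's actual schema is not published here.

```python
import xml.etree.ElementTree as ET

# Hypothetical airframe definition; element names and values are assumed
# for illustration and do not reflect the FDM's actual *.xml schema.
AIRFRAME_XML = """
<airframes>
  <airframe name="AS350 Squirrel">
    <rotors>1</rotors>
    <main_rotor_diameter_ft>35.1</main_rotor_diameter_ft>
    <tail_rotor_diameter_ft>6.1</tail_rotor_diameter_ft>
    <engine_hp>847</engine_hp>
    <max_altitude_ft>15000</max_altitude_ft>
    <weight_lb>2588</weight_lb>
  </airframe>
</airframes>
"""

def load_airframes(xml_text):
    """Parse airframe records into dicts of numeric parameters, by name."""
    root = ET.fromstring(xml_text)
    frames = {}
    for af in root.findall("airframe"):
        frames[af.get("name")] = {child.tag: float(child.text) for child in af}
    return frames

frames = load_airframes(AIRFRAME_XML)
print(frames["AS350 Squirrel"]["engine_hp"])   # 847.0
```

The FDM would read parameters like these at startup and feed them into its rotor and engine calculations.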


Directory Reader/Writer
This simple application was developed to provide a search tool to locate specific files or file types in a folder/directory and write the information into a user specified log file (*.txt format).
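A minimal version of such a search tool can be sketched in Python; the function name and log format are assumptions for illustration, not the original implementation.

```python
import os

def find_files(root, extensions, log_path):
    """Search a directory tree for matching file types and log the results.

    extensions: e.g. (".txt", ".log"); matching paths are written one per
    line to the user-specified *.txt log file.
    """
    wanted = tuple(ext.lower() for ext in extensions)
    matches = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if name.lower().endswith(wanted):
                matches.append(os.path.join(dirpath, name))
    with open(log_path, "w") as log:
        log.write("\n".join(matches))
    return matches
```

Calling `find_files("C:/data", (".txt",), "C:/data/log.txt")` would record every text file under that folder in the log.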








FlightGear UDP Generic Interface
This application was developed to interface with the FlightGear simulation software using user datagram protocol (UDP). It featured a shared memory interface that could link to the Generic Rotorcraft FDM application to pass information through, allowing the FlightGear application to be used as an image generator (IG).
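A minimal sketch of pushing FDM state to FlightGear over UDP might look like the following. FlightGear's "generic" protocol expects fields in the order defined by a protocol XML file; the field order, record format, and port below are assumptions that would have to match that configuration.

```python
import socket

# Assumed FlightGear generic-protocol input endpoint; the actual port and
# field layout depend on the protocol XML file FlightGear is started with.
FG_ADDR = ("127.0.0.1", 5500)

def send_state(sock, lat_deg, lon_deg, alt_ft, roll_deg, pitch_deg, hdg_deg,
               addr=FG_ADDR):
    """Pack one comma-separated state record and send it to FlightGear."""
    record = "%.8f,%.8f,%.1f,%.2f,%.2f,%.2f\n" % (
        lat_deg, lon_deg, alt_ft, roll_deg, pitch_deg, hdg_deg)
    sock.sendto(record.encode("ascii"), addr)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# Illustrative state: a position over Central Florida at 1,200 ft.
send_state(sock, 28.468, -81.316, 1200.0, 0.0, 2.5, 90.0)
sock.close()
```

In the real application, each record would be filled from the shared memory interface exposed by the Generic Rotorcraft FDM.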

On Screen Dialog (OSD)
This application was developed to display an OSD/heads-up display (HUD) reticle that is editable by the user. The editable features include transparency level, color, and ON/OFF state. This application was originally developed to provide a reticle overlay for camera data captured during the operation of an unmanned vehicle (i.e., an FPV camera).





Proposed Followup Unmanned SA Research

After the completion of my initial investigation into the effects of dynamic visual interaction on Situational Awareness (SA) for stationary egocentric environments, I determined that additional research would be required. My investigation was limited to a stationary position, such as a taxiway hold, which does not cover all aspects of unmanned aircraft operations. To determine whether the effect continues across all aspects of operation, dynamic positioning (i.e., taxiing, takeoff, landing, flight, target engagement, etc.) would require examination using the same methods.

Why should follow-up research be performed?
Pilots/operators have identified multiple problems (preferences and performance issues) with current GCS designs:
• control interfaces of unmanned aircraft are significantly different from those of manned aircraft (lack of familiarity)
• a desire for control systems that mirror an F-16 or F-15, which allow the deployment of a weapon with a single switch as opposed to the multiple mouse clicks associated with drop-down menus
• descriptions of existing designs as employing an “arcane and exhausting pilot-machine interface”








Researchers have identified issues requiring a redesign of existing GCSs:
• current designs employ obsolete technology/poor equipment design (preventing optimal support)
• existing interfaces create limitations for flight and combat operations
• the USAF has received a recommendation to evaluate and optimize the human-system interface of unmanned aircraft ground control stations
• variation between existing GCSs also represents a human factors concern requiring examination to generate regulation for domestic operations
• the need for this research originated from a necessity to reduce the constraints of human operators and of the contemporary technology used in the system







Benefits of performing followup research
The use of currently captured research data in conjunction with appropriate follow-on
research would facilitate the following:
• create opportunities to redesign/upgrade existing GCSs
• develop next-generation GCS using latest generation communication, display, avionics, and simulation technology
• develop new GCS design standard/provide input towards the creation of a standard
• expand the market for further application of embedded training (in actual GCS)
• create opportunities for partnerships in unmanned development (sensor developers, system integrators, academia, government agencies, etc…)

Phased research approach
Based on observation and analysis of the recent research data, it is clear there is a need for further follow-up research to examine the effect a dynamic eyepoint would have on operator SA in a dynamic environment (i.e., taxiing, takeoff, waypoint following, terrain following, target identification/engagement, enemy fire avoidance, and landing) while subject to attentional allocation. A definitive understanding of unmanned SA could be accomplished through a multiple phase research approach:

Phase 1 – Stationary Location (recently completed dissertation work) – Identified a relationship between increased SA and dynamic eyepoint using unskilled participants
Phase 2 – Dynamic Location – Determine whether the correlations identified in Phase 1 still hold true for dynamic movement (i.e., takeoff, landing, flight, etc.) using unskilled operators
Phase 3 – Piloted Operation – Determine whether prior correlations remain true for a trained operator under the attentional load associated with typical operation
Phase n – Further Resolution – Perform any necessary finer-resolution research


Proposed Activity Durations
The following are the proposed activities and durations for the research:
Phase 1 (COMPLETE) – Initial investigation into use of dynamic eyepoint versus static for stationary position
Phase 2 (dynamic location) – approximately 12-month duration
• develop test system (custom software and integration)
• develop and test framework (SAGAT queries, simulated database environment)
• perform experimental testing activities (unskilled participants)
• develop technical whitepaper(s) detailing findings
Phase 3 (piloted operation) – approximately 12-month duration
• update test system (incorporation of simulated pilot duties/responsibilities)
• update of test framework (SAGAT queries, simulated database environment)
• perform experimental testing activities (skilled participants)
• develop technical whitepaper(s) detailing findings
Phase n (further resolution) – as needed


Implementation
Follow-up research activities could be carried out at a minimum equipment cost through use of low-cost COTS equipment:
• PC Image Generator (PCIG)
• PCIG Software (runtime and ATP applications)
• PCIG Database(s)
• Windows XP/7 PC
• Head tracker
• Hands-on-throttle-and stick (HOTAS) joystick
• USB gamepad/button interface







Advanced Technology Demonstration System
After the follow-on research is performed, the captured data could be used to develop an advanced technology demonstration system:
• incorporation of new visual interaction techniques (if data shows correlation)
• incorporation of simulation technology (augmented reality-image generation, aural reproduction, tactile reproduction, high-fidelity displays, etc…)
• incorporation of latest avionics and communication products (synthetic vision, data links/delivery systems, autopilots, controls, etc…)
• design changes consistent with identified pilot/operator preferences and user requirements


Available Resources/Partners
I am positioned to take advantage of close geographic proximity to a large group of potential academic and governmental collaborators in the Central Florida area:
Embry-Riddle Aeronautical University
National Aeronautics and Space Administration (NASA) – Kennedy Space Center
University of Central Florida (including Institute of Simulation and Training)
Interservice/Industry Training, Simulation, and Education Conference (I/ITSEC)
U.S. Army Program Executive Office for Simulation, Training, and Instrumentation (PEO STRI) (including Simulation and Training Technology Center)
Naval Air Systems Command (NAVAIR)
Air Force Agency for Modeling and Simulation (AF AMS)
National Center for Simulation (NCS)
Team Orlando (Improving Human Performance Through Simulation)


If you would like to read more, including background development work and recent findings (i.e. my Ph.D. dissertation work), click the following links:

Research Proposal Presentation
Proposal to Study Effects of Visual Interaction on Unmanned Aircraft Operator Situational Awareness in a Dynamic Simulated Environment [presentation, PDF format]

Research Proposal Paper
Proposal to Study Effects of Visual Interaction on Unmanned Aircraft Operator Situational Awareness in a Dynamic Simulated Environment, [paper] in PDF format

Monday, June 18, 2012

Visual Interaction Situational Awareness Modifiers - Unmanned Operation

I recently completed a Ph.D. research study examining and comparing the effects of dynamic visual interaction with conventional static interaction on unmanned operator Situational Awareness (SA). The study was based on the premise that the amount of visual information available can affect an operator's SA, and that increasing visual information through dynamic eyepoint manipulation (i.e., a movable camera) may result in higher SA than static visualization (i.e., a body-fixed camera). Using a simple environment simulation and experimentation, four experimental dynamic visual interaction methods (analog joystick, head tracker, uninterrupted hat/point of view switch, and incremental hat/point of view switch) were examined and compared to a single static method (the control treatment), as described below:

Unmanned Operator Station (Simulation) - The simulated operator station was a custom-developed system (hardware and software) used in experimental testing to capture the SA of each participant. The appearance and function of this system approximated the look, feel, and interaction of a generic ground control station (GCS).


Static Visual Interaction - This method replicated current unmanned aircraft visual interaction functionality: the eyepoint remained fixed within the simulated environment, equal to the field of view (FOV) of the simulated camera (i.e., the participant could only observe the center of the simulated environment).


Analog Joystick Dynamic Visual Interaction - This method controlled the simulated eyepoint (camera) of the visual display using custom-developed software and a USB joystick device, which captured and translated analog X- and Y-axis input into eyepoint movement in the visual simulation.
 
Head Tracker Dynamic Visual Interaction - This method controlled the simulated eyepoint of the visual display using a custom-developed head tracker system (hardware and software), which captured and translated rotational head movements (pitch/yaw) into eyepoint movement in the visual simulation.

Incremental Hat/Point of View (POV) Switch Dynamic Visual Interaction - This method controlled the simulated eyepoint of the visual display using the eight-directional hat/POV switch of a USB joystick and custom-developed software, which captured and translated user switch input into incremental (up, down, left, and/or right) changes in eyepoint position, based on the previous position and a predetermined increment rate (i.e., 50 pixels per second).

Uninterrupted Hat/POV Switch Dynamic Visual Interaction - This method controlled the simulated eyepoint of the visual display using the eight-directional hat/POV switch of a USB joystick and custom-developed software, which captured and translated user switch input into sweeping (i.e., uninterrupted) changes in eyepoint position.
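
The experiment software was custom-built, but the incremental hat/POV logic described above can be sketched in outline. The sketch below is purely illustrative: the function name, viewport limits, and direction mapping are my assumptions, not the actual implementation; only the 50 pixels-per-second increment rate comes from the description.

```python
# Illustrative sketch (NOT the actual experiment code) of the incremental
# hat/POV eyepoint update: each frame, switch input nudges the eyepoint at a
# fixed rate, clamped to the bounds of the wider simulated scene.

INCREMENT_RATE = 50.0  # pixels per second, per the rate cited above

# Eight-way hat/POV directions mapped to (dx, dy) unit offsets
HAT_DIRECTIONS = {
    "up": (0, -1), "down": (0, 1), "left": (-1, 0), "right": (1, 0),
    "up-left": (-1, -1), "up-right": (1, -1),
    "down-left": (-1, 1), "down-right": (1, 1),
}

def update_eyepoint(x, y, hat, dt, x_limit=640, y_limit=480):
    """Advance the eyepoint given a hat direction and elapsed frame time dt."""
    dx, dy = HAT_DIRECTIONS.get(hat, (0, 0))  # centered hat -> no movement
    x = min(max(x + dx * INCREMENT_RATE * dt, 0), x_limit)
    y = min(max(y + dy * INCREMENT_RATE * dt, 0), y_limit)
    return x, y

# Example: holding "right" for one second moves the eyepoint 50 px right.
x, y = update_eyepoint(100, 100, "right", dt=1.0)
```

The uninterrupted variant would differ mainly in sampling the held switch continuously for a sweeping motion rather than stepping by discrete increments.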


These five methods were used in experimental testing with 150 participants (N = 150; n = 30 per treatment), which determined that the use of a dynamic eyepoint significantly increased the SA score (0 to 100%) of a user within a stationary egocentric environment (see the following graph), indicating that employing dynamic control would reduce the occurrence or consequences of the soda straw effect.


I am currently exploring options to perform follow-up research to determine whether the effects of dynamic eyepoint manipulation hold in a dynamic setting (i.e., aircraft in flight, landing, takeoff, or target engagement).

If you are interested in potential collaboration or have any questions, please feel free to read this post, download my dissertation defense presentation or follow-on research proposal, or contact me.

Ph.D. Dissertation Abstract - Examining Effects of Visual Interaction Methods on Unmanned Aircraft Operator Situational Awareness

The following is the abstract from my dissertation, Examining Effects of Visual Interaction Methods on Unmanned Aircraft Operator Situational Awareness (soon to be available on ProQuest). If you would like to learn more, please feel free to contact me.

The limited field of view of static egocentric visual displays employed in unmanned aircraft controls introduces the soda straw effect on operators, which significantly affects their ability to capture and maintain situational awareness by not depicting peripheral visual data. The problem with insufficient operator situational awareness is the resulting increased potential for error and oversight during operation of unmanned aircraft, leading to accidents and mishaps costing United States taxpayers between $4 million and $54 million per year. The purpose of this quantitative, experimental, completely randomized design study was to examine and compare the use of dynamic eyepoint versus static visual interaction in a simulated stationary egocentric environment to determine which, if any, resulted in higher situational awareness. The theoretical framework for the study established the premise that the amount of available visual information could affect the situational awareness of an operator, and that increasing visual information through dynamic eyepoint manipulation may result in higher situational awareness than static visualization. Four experimental dynamic interaction methods were examined (analog joystick, head tracker, uninterrupted hat/point of view switch, and incremental hat/point of view switch) and compared to a single static method (the control treatment). The five methods were used in experimental testing with 150 participants to determine a mean situational awareness score for each method. One-way analysis of variance (ANOVA) and a post hoc Scheffé test were used to determine the existence of statistical significance and the ranking of results. The results indicated that the mean situational awareness scores associated with all four of the dynamic visual interaction methods were significantly higher than the static method, F(4, 145).
The primary difference between the four dynamic visual interaction methods was their unique manipulation approaches to control the orientation of the simulated eyepoint. The use of a dynamic eyepoint significantly increased the situational awareness of a user within a stationary egocentric environment, indicating that employing dynamic control would reduce the occurrence or consequences of the soda straw effect. Further research is necessary to determine if the results of this study are also true for a moving egocentric environment, which is subject to differing dependencies and external variables.
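
The design described above (five treatment groups, one-way ANOVA) can be reproduced in outline with standard tools. The sketch below uses synthetic, randomly generated scores as placeholders, not the study's data, and the group means are hypothetical; it simply demonstrates that five groups of 30 yield the F(4, 145) degrees of freedom reported in the abstract.

```python
import numpy as np
from scipy import stats

# Synthetic illustration of the analysis design: five treatment groups
# (one static control, four dynamic methods), n = 30 each, N = 150.
# Scores are randomly generated stand-ins, NOT the study's results.
rng = np.random.default_rng(42)
groups = [rng.normal(loc=mean, scale=10.0, size=30)
          for mean in (55, 70, 72, 68, 71)]  # hypothetical mean SA scores

# One-way ANOVA across the five groups
f_stat, p_value = stats.f_oneway(*groups)

# Degrees of freedom for a one-way ANOVA with k groups and N observations
k, N = len(groups), sum(len(g) for g in groups)
df_between, df_within = k - 1, N - k  # 4 and 145, matching F(4, 145)
```

A post hoc Scheffé test, as used in the study, would then identify which pairwise group differences drive a significant omnibus F.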

I successfully defended this dissertation and was conferred the title of Ph.D. in Business Administration (with specialization in Aviation) on Friday, 27 April 2012.

Here is a copy (in PDF format) of my dissertation defense presentation.

Potential Interests for Aviation Transportation Research

I have recently begun thinking about several areas of future research concerning transportation:

1) Examining the viability of electric or hybrid propulsion systems in General Aviation (GA) aircraft and identifying the necessary support infrastructure. Specifically, comparing and determining the scenarios/operating conditions that result in maximum effectiveness (i.e., short-local (city/county), medium-regional (state/region), long-continental (country/domestic), and extended-transcontinental), given the state of current technology. This concept could provide a baseline for scalability up to commercial operations. Additionally, a list of positive and negative attributes could be identified to support further decision making, including:
  • Positive (reduced noise, common fuel/power standard, reduced emissions, and reduced mechanical complexity)
  • Negative (disposal of greater consumables such as depleted batteries, new infrastructure investment, and performance of extensive safety analyses on technology)

2) Comparison of various transportation loading methodologies (multi-modal: trains, aircraft, ships, buses, etc.) throughout the world and identification of optimized method(s) to meet anticipated future needs (low-stress passenger experience, security, increased use, and multi-modal interchanges).

3) Economic examination of passenger support/experiential enhancement business locations within transportation hubs (placement before versus after security, maximum passenger exposure versus limited competition, etc.).

4) Use of unmanned systems for commercial cargo delivery (fixed routes and delivery schedules) within the FAA's planned 2012 to 2025 Next Generation (NextGen) Air Transportation System, including:
  • Specific use of Automatic Dependent Surveillance Broadcast (ADS-B)/Global Positioning System (GPS) for real-time route-planning/correction, location, separation, and traffic data.
  • Development of new/improved decision making algorithms using System Wide Information Management (SWIM) and NextGen Network Enabled Weather (NNEW) technology.
  • Identification of components necessary for future incorporation into European and Asian airspace. 
5) Re-examination of my graduate research project, Cost and Performance Analysis of Internal Combustion (IC) Engines Versus Electric Motors for use as Unmanned Aerial Vehicle (UAV) Propulsion Systems, with a focus on the latest generation of electric motors (brushless), power management, and batteries (lithium-polymer/ion).


If you are interested in potential collaboration or have any questions, please feel free to contact me.

My Resume/Curriculum Vitae

For simplicity, I have opted to include a link to my LinkedIn Profile, which contains all the details of my experience, responsibilities, roles, projects, education, awards, and activities. Please see the following for my list of published and presented works: http://brentterwilliger.blogspot.com/2015/02/publications-presentations-videos-and.html


Initial Post

As my initial post, I would like to take this opportunity to describe how I plan to use this blog:

1) Advertise my experience, education, and credentials in pursuit of potential research collaboration or personal growth opportunities. As such, I will add a dedicated post containing the details of my resume/curriculum vitae (CV).

2) Identify specific research concepts or areas of interest.

3) Provide reviews of others' work or ideas that I feel are closely aligned with my own areas of interest.

4) Promote a forum for the free exchange of ideas.

-Brent A. Terwilliger, Ph.D.