SOLICITATION NOTICE
49 -- Electro-Optical Infrared Direct Injection (EOIRDI) KEOS/SKIP scene generation systems.
- Notice Date
- 2/1/2022 11:58:35 AM
- Notice Type
- Presolicitation
- NAICS
- 334111
— Electronic Computer Manufacturing
- Contracting Office
- NAVAL AIR WARFARE CENTER AIR DIV PATUXENT RIVER MD 20670-1545 USA
- ZIP Code
- 20670-1545
- Solicitation Number
- EOIR-DI
- Response Due
- 2/16/2022 12:00:00 PM
- Point of Contact
- Christin J. Simpson, Basirat Shonekan-Umaru
- E-Mail Address
-
christin.simpson@navy.mil, basirat.umaru@navy.mil
- Description
- INTRODUCTION
The Integrated Battlespace Modeling and Simulation Department at the Naval Air Warfare Center - Aircraft Division (NAWCAD), Patuxent River, MD anticipates entering into a sole-source Basic Ordering Agreement (BOA) with Amherst Systems, a Northrop Grumman subsidiary, herein NG Amherst (CAGE 1L4J7), of 1740 Wehrle Drive, Buffalo, New York 14221, for the procurement, integration, and engineering support of Electro-Optical Infrared Direct Injection (EOIRDI) KEOS/SKIP scene generation systems. This BOA and all resultant orders will be issued using the statutory authority at 10 U.S.C. 2304(c)(1): only one responsible source and no other supplies or services will satisfy agency requirements.
DISCLAIMER
THIS PRE-SOLICITATION NOTICE IS NOT A REQUEST FOR PROPOSAL. IT DOES NOT CONSTITUTE A SOLICITATION AND SHALL NOT BE CONSTRUED AS A COMMITMENT BY THE GOVERNMENT. RESPONSES IN ANY FORM ARE NOT OFFERS, AND THE GOVERNMENT IS UNDER NO OBLIGATION TO AWARD A CONTRACT AS A RESULT OF THIS PRE-SOLICITATION NOTICE. NO FUNDS ARE AVAILABLE TO PAY FOR PREPARATION OF RESPONSES TO THIS ANNOUNCEMENT. ANY INFORMATION SUBMITTED BY RESPONDENTS TO THIS PRE-SOLICITATION NOTICE IS STRICTLY VOLUNTARY.
PROGRAM BACKGROUND
The NAWCAD IBST Department, in collaboration with the USAF and the Army's Redstone Test Center, has a requirement for scene generator systems to support test and evaluation of the fixed-wing (FW) F-35 Distributed Aperture System (DAS) missile warning system (MWS) and the DAIRCM, LAIRCM, CIRCM, and LIMWS rotary-wing (RW) sensor systems. The requirement covers acquisition; systems integration into Government test and evaluation facilities and live, virtual, constructive environment test system architectures; modeling customization to support specific systems under test; technical troubleshooting; and upgrade and repair of delivered systems.
REQUIRED CAPABILITIES
The Government's requirement for the system will include, but is not limited to, the following:
1 General Requirements
1.a Possess a Secret Facility Security clearance with safeguarding up to Secret
1.b Possess and maintain Defense Contract Management Agency (DCMA) approved business systems, particularly Accounting, Property Management, and Purchasing, throughout contract performance
1.c Demonstrated direct scene injection capability on military aircraft weapons sensor systems
System Requirements
2 Operation Modes
2.a Multi-mode operation for system initialization, modeling, display (debug), local and external operational modes
2.b Capability to operate in a System Initialization Mode
2.c Capability to operate in a Modeling Mode
2.d Capability to operate in a local standalone Image Generation Mode
2.e Capability to operate in External Control Image Generation Mode
2.f Capability to operate in only one mode at a time
2.g Configurable capability option for simultaneous operation of all sensors in a MWS
3 Model Building
3.a Capability to build radiometrically correct radiance maps for the terrain database
3.b Provide a Graphical User Interface (GUI) to facilitate construction of the atmospheric database elements, employing the full potential capability of the MODTRAN code
3.c Provide a Graphical User Interface (GUI) to create and modify both static and dynamic, point and extended, source models
4 Scenario Development and Builder
4.a Shall possess a scenario configuration file editing capability for XML with a GUI interface
4.b Shall generate and deploy 3D terrain databases
4.c Shall support real-time image generation of WGS-84 format terrains
4.d Shall generate and deploy atmospheric databases
4.e Shall generate and deploy target and threat definitions
4.f Shall import and host dynamic and static player models within the synthetic environment
4.g Shall schedule the scenario events
4.h Shall view and debug the scenario configuration file
4.i Shall use validated digital models for scenario development
4.j Shall perform scenario validation on new or updated scenario configuration files
4.k Shall provide visual 60 Hz display of locally controlled scenario trajectory pre-scripting
4.l Shall provide a reconfigurable frame size (maximum of 2048 by 2048)
5 Simulation Control
5.a Shall provide the user with a GUI for accessing Simulation Control functions
5.b Shall provide both local and external control options
5.c Shall perform waypoint positioning and modeling
5.d Shall generate Unit Under Test (UUT) definitions to include configurable spectral responsivity, angular field of view, frame rate, pixel dimensions, sensor locations and orientations, and platform vibration
5.e Unit Under Test (UUT) horizontal and vertical full-angle FOV size shall range from 1 degree to 120 degrees corresponding with the sensor's look-angle and FOV perspective
5.f Horizontal and vertical full-angle FOV shall have a resolution of 2π/2^24 radians (24 bits) or better
5.g Shall simultaneously generate, configure, and control the System Under Test (SUT) definition for at least 8, and up to 30, sensors
5.h Shall support dynamic Field of View (FOV) switching within a sensor frame time
5.i Shall provide a scenario display that monitors an active run scenario in real-time and non-real-time debug mode
5.j Image generation channel shall accommodate separately defined frame sizes
5.k Shall provide a scenario display that monitors a run scenario in real-time and non-real-time
5.l Shall provide integrated modeling for external control using XML with a schema validation definition file (see note following this section)
5.m Shall provide synchronization control for multiple sensors using an external interface for real-time position and event updates and triggered frame synchronization signals
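Note (illustrative only): items 4.a, 4.j, and 5.l above call for XML scenario configuration files checked against a schema definition file. The following minimal sketch is not part of the requirement; it assumes placeholder file names (scenario.xml, scenario.xsd) and the open-source lxml library, and shows one common way such a validation step can be performed.

    # Illustrative sketch only: validate a placeholder scenario configuration
    # file against a placeholder XML Schema definition using lxml.
    from lxml import etree

    schema = etree.XMLSchema(etree.parse("scenario.xsd"))   # schema validation definition file
    scenario = etree.parse("scenario.xml")                   # scenario configuration file

    if schema.validate(scenario):
        print("scenario configuration passes schema validation")
    else:
        for error in schema.error_log:                       # report each violation with its line number
            print(f"line {error.line}: {error.message}")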
6 Scenario Development
6.a Shall be able to accept common atmospheric, digital elevation, material attributed terrain, and threat models with the capability to perform integrated modeling using XML with schema definition files
6.b Shall perform radiance computation using the spectral responsivity per sensor (see note following Section 7)
6.c Target signatures and atmospheric effects shall be computed at a user-defined spectral sample resolution to account for spectral variations across the waveband of the sensor
6.d Targets shall have the correct radiance level contrast against the sky background for operational test fidelity
6.e Shall optically mask parts of extended or point sources when scenario players or player features at closer range cause overlap in a SUT FOV
6.f The point source model shall compute the radiant intensity of a threat's contribution to a pixel's radiance level as a function of altitude, Mach and aspect angle, sensor-to-threat range, azimuth and elevation, and wavelength
6.g The point source model shall compute the radiant intensity as affected by the atmospheric effects of transmission loss and path radiance consistent with MODTRAN
6.h In external control mode, the dynamic player models shall be controllable by trajectory pre-scripting and reactive scripting of position, orientation, and discrete signature state changes as time-varying parameters
6.i The real-time image generation shall support application of a High Frequency Temporal (HFT) scaling factor to the point source
6.j Image generation performance shall demonstrate fidelity throughout the complete FOV
6.k Shall receive scripted control, as well as real-time entity control, from Government-furnished (GFE) trajectory models such as BLUEMAX, TRAP, DISAMS, MOSAIC, TMAP, and ESAMS; this shall be defined in the ICD
6.l Shall create and modify both static and dynamic extended source, facet-based models
6.m Shall create 3D terrains as Open Scene Graph Binary (OSGB) files using the OpenSceneGraph terrain database-building tool, "VirtualPlanetBuilder," for creating OSG-compatible, large-scale, paged databases from validated geospatial imagery and digital elevation maps
6.n Shall possess the capability to simulate flare-blinding of the sensors
7 Real-Time Image Rendering via the GPU
7.a Shall support interpolation of scenario players' continuous states
7.b Shall support the extrapolation of scenario players' states up to 10 frames into the future
7.c Static emissive and static radiant modeling of the synthetic environment is required
7.d Shall simulate point source target models to render sub-pixel-sized targets without the aliasing effects that occur when rendering sub-pixel-sized, facet-based models
7.e Shall dynamically render a multi-spectral synthetic environment at the framerate of the unit under test (UUT)
7.f Each SG image generation channel shall meet or exceed 300 megapixels throughput per second with each output pixel having 16 bits of dynamic range
7.g Shall support local and external positional updates at the SUT-required frame rates
7.h Shall possess the capability to render at least eight simultaneous threats
7.i Shall maintain target positions out to 500 miles with an accuracy of at least 2 feet rms
7.j Shall provide a capability to switch from a point source model to a facet-based threat model at the system-calculated transitional range to provide a realistic optical transition
7.k Shall accommodate dynamic signature insertion for player models using the scripted trajectories
7.l Controller shall use multichannel synchronization for simultaneous operation of the sensors
7.m Shall provide two-color synchronization for all two-color sensors
7.n Shall provide an interface that is capable of external IRIG time standard synchronization
7.o Shall use an external synchronization interface for configuration and distributed executive control for up to 2 kHz framerate, with jitter less than 0.1 microseconds
7.p Shall limit latencies to <10 ms to the master system clock
7.q Shall provide a SCRAMNet interface to accommodate high-performance External Control
7.r Shall provide an Ethernet network interface for command and control from an external controller
7.s Shall provide an Interface Control Document (ICD) for external control with command and control messages
7.t Shall provide a method to set the radiance gain and offset of synthetic imagery at desired radiance levels
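Note (illustrative only): items 6.b and 6.c above require radiance computation using each sensor's spectral responsivity at a user-defined spectral sample resolution. The following minimal sketch uses notional values that are not drawn from this requirement; it shows the underlying band integral (spectral radiance times atmospheric transmission times relative responsivity, integrated over the sensor waveband).

    # Illustrative sketch only: band-integrated radiance weighted by a sensor's
    # relative spectral responsivity and atmospheric transmission (notional values).
    import numpy as np

    wavelengths_um = np.linspace(3.0, 5.0, 201)               # MWIR band, 0.01 um sampling
    spectral_radiance = np.full_like(wavelengths_um, 1.0e-4)  # W/(cm^2 sr um), notional source
    transmission = np.exp(-0.2 * (wavelengths_um - 3.0))      # notional atmospheric transmission
    responsivity = np.where((wavelengths_um >= 3.4) & (wavelengths_um <= 4.8), 1.0, 0.0)  # notional in-band response

    # L_band = integral of L(lambda) * tau(lambda) * R(lambda) d(lambda)
    band_radiance = np.trapz(spectral_radiance * transmission * responsivity, wavelengths_um)
    print(f"band-integrated radiance: {band_radiance:.3e} W/(cm^2 sr)")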
8 Sensor Modeling
8.a Sensor modeling tool shall convolve the imagery, emulating the distortion, focus, and modulation transfer function, to provide a variable-size, asymmetric kernel that allows mapping kernel variability across the sensor's field of view (FOV)
8.b Sensor model shall perform per-pixel gain, offset, and bad pixel removal (see note following this list)
8.c Sensor modeling tool shall process the image's pixels for noise modeling, non-uniform response modeling, and A/D conversion
8.d Sensor model shall emulate image blurring effects from sensor motion and integration
8.e Shall possess the capability to insert pseudo-INS rates into the sensor header data
9 System Digital Interface
9.a Shall provide the method to directly configure graphics card output in projector mode by selecting one of a list of user-specified graphics card configurations
9.b Shall provide the method to automatically configure graphics card output format in direct injection mode through software computation of graphics card configurations defined by the UUT frame size and frame rate specifications
9.c In direct inject mode, the SG shall match the digital video interface protocol
9.d Shall supply a digital image over the required digital output interface and a means to configure the output to match the SUT-specified data rate and sensor format
10 Data Collection and Post Processing
10.a Shall record received commands and output commands with time tags for all dynamic scenario runs
10.b Shall log data collection without affecting the scene generator's framerate or data image fidelity
10.c Shall log system parameters including IRIG time, target attributes, clutter attributes, modifications made when using override commands, and more
10.d Each log file shall be organized and stored in a navigable manner
10.e Shall have a playback capability from the Data Acquisition Unit (DAU) of the captured multi-channel data for post-test forensic data investigation, problem-solving analysis, and effective system and scenario performance evaluation
11 Security and Privacy Requirements
11.a Shall have duplicate removable computer hard drives
11.b Shall possess and employ a non-destructive procedure for the erasure and zeroization of all memory devices
12 System Environment Requirements
12.a Computers shall execute hard-deadline, real-time processing
12.b Operating system and all software shall be developed to meet the latest DoD information assurance requirements
12.c Shall be housed in Electronic Industries Alliance (EIA) industry-standard equipment racks
12.d The equipment rack, when loaded, shall not exceed the weight and power limits of the test facilities
12.e The system shall have the ability to be remotely turned on and off
12.f Shall be designed to operate using a 115 VAC, 60 Hz power source
13 Deliverable Documentation
13.a Scene Generator Manual
13.b External Control ICD
13.c System Test Reports
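Note (illustrative only): items 8.b and 8.c above call for per-pixel gain, offset, and bad pixel removal in the sensor model. The following minimal sketch uses notional array sizes and a simple neighborhood average that is not specified by this requirement; it shows the general form of such a correction.

    # Illustrative sketch only: per-pixel gain/offset correction and bad-pixel
    # replacement for a notional focal plane array (all values are placeholders).
    import numpy as np

    rng = np.random.default_rng(0)
    raw = rng.uniform(0.0, 1.0, size=(512, 512))    # notional raw frame
    gain = rng.normal(1.0, 0.02, size=raw.shape)    # per-pixel gain map
    offset = rng.normal(0.0, 0.01, size=raw.shape)  # per-pixel offset map
    bad = np.zeros(raw.shape, dtype=bool)
    bad[100, 200] = True                            # example dead pixel

    corrected = gain * raw + offset                 # per-pixel gain and offset correction

    # replace each bad pixel with the mean of its valid neighbors in a 3x3 window
    for r, c in zip(*np.nonzero(bad)):
        window = corrected[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2]
        valid = ~bad[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2]
        corrected[r, c] = window[valid].mean()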
ELIGIBILITY
The anticipated North American Industry Classification System (NAICS) code for this requirement is 334111, with a Small Business Size Standard of 1,250 employees. The anticipated product service code (PSC) is 4935.
SUBMISSION DETAILS
Interested vendors should submit a capabilities statement in a document of no more than 8 pages, in no smaller than 12-point font. This document shall specifically address, and demonstrate the vendor's capability to meet, the required capabilities noted in this posting. Additionally, all responses shall include Company Name, CAGE Code, Address, Business Size, and Points of Contact (POCs) including name, phone number, fax number, and mailing address. All responses to this notice shall be submitted electronically to Basirat Shonekan-Umaru @ basirat.d.shonekan-umaru.civ@us.navy.mil
- Web Link
-
SAM.gov Permalink
(https://sam.gov/opp/af81154be39047bfb4230d231edb33f3/view)
- Place of Performance
- Address: Lexington Park, MD 20653, USA
- Zip Code: 20653
- Country: USA
- Record
- SN06230134-F 20220203/220201230113 (samdaily.us)
- Source
-
SAM.gov Link to This Notice
(may not be valid after Archive Date)