SOLICITATION NOTICE
D -- Conflation Technology
- Notice Date
- 4/22/2004
- Notice Type
- Solicitation Notice
- NAICS
- 541519
- Other Computer Related Services
- Contracting Office
- Other Defense Agencies, National Imagery and Mapping Agency, St Louis Contracting Center (ACSS), Attn: ACSS Mail Stop L-13 3838 Vogel Road, Arnold, MO, 63010-6238
- ZIP Code
- 63010-6238
- Solicitation Number
- Reference-Number-ESEAandNORTHROPGRUMMAN
- Response Due
- 4/28/2004
- Archive Date
- 5/13/2004
- Point of Contact
- Karen Edgar, Purchasing Agent, Phone (314) 263-4211 x105, Fax (314) 263-8024; Phyllis Barr, Contracting Officer, Phone (314) 263-4211 x117, Fax (314) 263-8024
- E-Mail Address
- edgark@nga.mil, barrp@nga.mil
- Description
- Based on previous market research conducted by the National Geospatial-Intelligence Agency (NGA), the Agency proposes to award sole-source contracts to ESEA and Northrop Grumman to develop conflation technology, using their commercial off-the-shelf software as a baseline, to better suit the specific production needs of NGA. The deliverable product should conflate mapping, charting, and geodesy data from one segment with the geospatial intelligence feature data from another segment. This is a technology that merges data to attain a "best of breed." In addition to providing conflation software, the contractor will provide, as appropriate, licenses, training, technical support, and travel.

To satisfy NGA's minimum needs, the following salient characteristics are identified. The current set of requirements is directed at functionality that is essential for NGA's current and emerging conflation needs. The requirements below are numbered for reference; the numbers do not indicate priority.

Overall requirements

1. Ease of use is paramount. The software should be intuitive and easy to learn.
2. A lightweight footprint is preferred, to minimize software and hardware cost.
3. Ease of integration into production is critical. The software should be flexible in its ingest and should have open interfaces.
4. Processing speed is important. Exact amounts of data and processing speeds are not established.
5. Scalability needs to be defined. Memory requirements for handling larger datasets should be well defined so the software does not fail during operational use.
6. Defaults are needed when the software includes tolerances or choices of algorithms. The vendor should provide defaults and should document the impact of changing the tolerances or algorithms used. It should be possible to launch a default process with virtually no choices other than inputs.
7. Source metacontent must be available to the user for viewing, so the user can evaluate the currency, accuracy, resolution, etc. of both input sources at any point during processing.
8. Conflation of all feature geometries to all other geometries is required. It is acceptable for different geometries to be conflated in different software runs, but the vendor should define workflows, since the sequence of conflation tends to affect the outcome.
9. Dynamic segmentation is required to match and deconflict portions of line features.
10. Conflation of more than two datasets is required (for example: use an aero source for all aero geometry and most attributes, add aero attributes from VMap2, pull in all other VMap2 cultural features, pull in the DNC coastline and anything in the water). However, it is acceptable to successively conflate pairs of datasets. If this approach is used, the vendor should provide guidance on the effect of the processing sequence and should recommend one; for example, start with the datasets containing the best geometries, add the datasets with the best attributes, then continue to lower-quality datasets.
11. Ingest and output of VPF is required.*
12. Dirty data must not be a barrier to the conflation process.*
13. Population of metacontent must be possible at the dataset/session, feature, and attribute levels. The precise metacontent requirements are not defined, so the software should exhibit the capability of automatically generating and populating metacontent at these three levels.
14. Output of a log file is required. The log file should describe the input datasets, commands, and output datasets. The precise content requirements are not defined, so the software should exhibit the capability of automatically generating a log file of the information the vendor deems most useful.

Feature linking requirements

1. Powerful editing tools are needed for manual feature matching, since automated feature matching will seldom be 100% correct or complete.
2. Undo capability is required. When a user performs interactive edits, it should be possible to undo at least 10 of the previous edits.
3. Iterative processing of automated and manual feature matching would be useful. Automated feature matching would never override manual edits, but would improve in each successive run because manual matches are removed from consideration.
4. Quality assurance tools should allow an entire dataset, not just the matched features, to be reviewed by the user.
5. Consideration of attribute values is required to improve feature matches. It must be possible to define what would constitute a match (i.e., which attributes, and which values of those attributes). It must also be possible to define what would NOT constitute a mismatch (i.e., if a value is NULL or unknown, it is neither a match nor a mismatch).
6. Consideration of elevation values is required to improve feature matches.

Deconfliction requirements

1. Use of source attributes in deconfliction rules is required. For example, which source to use for geometry or attribute values could be based on source characteristics for different feature types in the same conflated result. Source characteristics could include surveyed, stereo, mono, or reported.
2. Flexibility in populating the geometry of the conflated result is required. It should be possible to include geometry from either source based on feature type, based on attributes (e.g., choose the more recent geometry), and based on whether it was matched or not. It should also be possible to include connectivity vectors or not.
3. Quality assurance tools should automatically detect gaps, cross-overs, and overshoots. It should also be possible to specify rules to check the context of conflation results (i.e., did any building objects get repositioned into a body of water, or does a railroad now run through a mountain without benefit of a tunnel).
4. Creation of connectivity vectors to ensure a continuous network is required. These should be optional.
5. Flexibility in defining the attributes of the conflated result is required. For example, the output schema could be a union of the input schemas, an intersection, a mixture of the two schemas, or a new schema altogether (with rules that map the source attributes to the target attributes).
6. Flexibility in populating the attributes of the conflated result is required. For example, an output attribute could be populated by one source or the other, by a function of either source value, or by a rule (e.g., if source A's attribute is null, use source B's attribute).
7. Manual tools for populating attributes would be useful. Some conflations are so complex that rules would be too unwieldy to develop. For example, if a bridge represented by three spans on one source is conflated to a point feature on the output source, a user is best qualified to carry over the attributes from the multiple spans to the point feature.
8. Flexibility in populating the elevations of the conflated result is required. It must be possible to define the output geometry as a combination of the lat/long from one source and the elevation from the other. Because portions of segments can match, this implies that the deconfliction software must be able to interpolate a Z value mid-segment.
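The feature-linking requirement that NULL or unknown attribute values count as neither a match nor a mismatch can be sketched as a simple scoring function. This is an illustrative assumption, not NGA's specification: the function name, the +1/-1 scoring scheme, and the sample road attributes are all hypothetical.

```python
# Sketch of feature-linking requirement 5: attribute values improve match
# scores, but a NULL/unknown value is neither a match nor a mismatch.
# The scoring scheme and all names here are assumptions for illustration.

UNKNOWN = {None, "unknown"}

def attribute_score(attrs_a: dict, attrs_b: dict, keys) -> int:
    """+1 per agreeing attribute, -1 per disagreement, 0 when either
    value is NULL/unknown (neutral, per the requirement)."""
    score = 0
    for key in keys:
        a, b = attrs_a.get(key), attrs_b.get(key)
        if a in UNKNOWN or b in UNKNOWN:
            continue  # neither a match nor a mismatch
        score += 1 if a == b else -1
    return score

road_a = {"surface": "paved", "lanes": 2}
road_b = {"surface": "paved", "lanes": None}
print(attribute_score(road_a, road_b, ["surface", "lanes"]))  # 1
```

Here the matching "surface" values contribute +1, while the NULL "lanes" value is simply skipped rather than penalized.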
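Deconfliction requirement 6's rule-based attribute population ("if source A's attribute is null, use source B's attribute") amounts to a per-attribute fallback. A minimal sketch, with hypothetical names and a hypothetical `prefer` parameter:

```python
# Illustrative sketch of deconfliction requirement 6: an output attribute
# populated by one source or the other under a fallback rule. The function
# and parameter names are assumptions, not part of the solicitation.

def populate_attribute(value_a, value_b, prefer: str = "A"):
    """Take the preferred source's value; fall back to the other
    source when the preferred value is missing (None)."""
    primary, secondary = (value_a, value_b) if prefer == "A" else (value_b, value_a)
    return primary if primary is not None else secondary

print(populate_attribute(None, "paved"))      # falls back to source B: paved
print(populate_attribute("gravel", "paved"))  # keeps source A: gravel
```

More elaborate rules (e.g., "choose the more recent value") could replace the simple null check with a comparison function.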
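Deconfliction requirement 8 asks the software to interpolate a Z value mid-segment when lat/long comes from one source and elevation from the other. Assuming simple linear interpolation along the matched segment (the solicitation does not specify a method), a sketch:

```python
# Hypothetical sketch of deconfliction requirement 8: interpolating an
# elevation (Z) partway along a matched segment. Linear interpolation is
# an assumption; the requirement does not name a method.

def interpolate_z(z_start: float, z_end: float, fraction: float) -> float:
    """Linearly interpolate an elevation at a fractional position
    (0.0 = segment start, 1.0 = segment end) along a segment."""
    if not 0.0 <= fraction <= 1.0:
        raise ValueError("fraction must lie within the segment")
    return z_start + fraction * (z_end - z_start)

# A point matched 40% of the way along a segment rising from 100 m to 150 m:
print(interpolate_z(100.0, 150.0, 0.4))  # 120.0
```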
- Place of Performance
- Address: Bethesda, Maryland
- Zip Code: 20816
- Record
- SN00571332-W 20040424/040422212558 (fbodaily.com)
- Source
- FedBizOpps.gov