Loren Data's SAM Daily™

FBO DAILY - FEDBIZOPPS ISSUE OF JANUARY 08, 2017 FBO #5525
MODIFICATION

99 -- NRM Automated Testing Tool Solutions (RFI) - Responses to Questions

Notice Date
1/6/2017
 
Notice Type
Modification/Amendment
 
NAICS
541990 — All Other Professional, Scientific, and Technical Services
 
Contracting Office
Department of Agriculture, Forest Service, WO-AQM IT Support, Albuquerque Service Center, Pan American Bldg, Suite 200, 101 B Sun Ave NE, Albuquerque, New Mexico, 87109, United States
 
ZIP Code
87109
 
Solicitation Number
AG-7604-S-17-0001
 
Archive Date
1/25/2017
 
Point of Contact
David Watson, Phone: 505-563-7535; Robert Hinton, Phone: 505-264-9819
 
E-Mail Address
davidwatson2@fs.fed.us, rhinton@fs.fed.us
 
Small Business Set-Aside
N/A
 
Description
Q&A: NRM Automated Testing Tool Solutions for the United States Forest Service, Natural Resource Manager (NRM)

This Request for Information (RFI) is for information purposes only and shall not be construed as a commitment or a promise of a contract by the Government. This is not a solicitation. This notice does not constitute an Invitation for Bid (IFB), Request for Quote (RFQ), or Request for Proposal (RFP), nor does it restrict the Government as to the ultimate acquisition approach.

1. PURPOSE: The US Forest Service (FS) Natural Resource Manager (NRM) is conducting market analysis with this RFI to obtain feedback and to further refine and identify the "best-fit" automated testing tool/software. The primary focus is identified below. The information collected through this RFI will be evaluated and will assist the FS in determining the appropriate acquisition approach/strategy going forward. Responding to this RFI does not automatically place the responder on any bidders list, if such a list is developed. Depending on the quantity and quality of the responses received, the criteria may serve to generate the initial pool of potential contractors to be invited to respond to a resulting solicitation.

2. INTRODUCTION AND BACKGROUND: NRM supports 14,000+ Forest Service users with 65+ custom applications. These applications are used for upward reporting to Congress and senior management, and for information gathering and analysis. Currently, the business knowledge for functional testing is not harvested and reused. The testing team performs manual regression and sanity testing for monthly releases of the 65+ NRM applications. During each round of testing for a release, test cases are executed manually and are documented only if a bug is discovered or an error occurs. In addition, a code fix for one bug or issue may break functionality that had passed in prior testing or prior releases. Because environments are regularly set up or changed, upgraded, patched, and cloned, NRM is looking for the most suitable and effective alternatives and solutions for obtaining and fulfilling the needs associated with automated testing software. Redesign/modernization efforts have also added a significant volume of testing, since these efforts implement entire applications rather than maintenance changes. With limited testing resources, an automation tool can assist in completing repetitive, time-consuming tasks and meeting deadlines.

3. OVERVIEW OF CAPABILITIES: Projected/required capabilities for automated testing software include, by priority, the following:

Functionality and Usability Requirements:
1. Regression, load, and performance testing of NRM applications, both tabular and geospatial, including validation of geospatial output by image comparison, metadata analysis, or another appropriate method (an illustrative sketch appears after the SUMMARY section below)
2. Ease of recording or scripting testing scenarios
3. Intuitive playback and test suite implementation
4. Powerful reporting, preferably with a graphical interface
5. Ease of maintaining and upgrading the automation suite, including graphical baselines for geospatial applications under test
6. Test traceability - linking requirements to functional testing
7. Test result log for error messages, warnings, etc.
8. Bug reporting - integration with TeamForge
9. Exporting test results to various formats (Excel, XML, HTML, etc.)
10. Effective accessibility and Section 508 compliance (optional)
11. Free trial period and available onboarding support

Technical Requirements:
1. Address interoperability of the testing tool with authentication (eAuth (SiteMinder) and Oracle Internet Directory (OID))
2. Modular integration with existing dependent systems:
   a. Cross-browser support (Internet Explorer, Chrome, or Mozilla Firefox)
   b. Natural Resource systems: i. Java; ii. .Net; iii. Oracle Forms (optional); iv. Oracle Reports; v. Jasper Reports; vi. Cognos Reports; vii. Geographic Information System interface (ESRI)
3. Flexible handling of multiple devices (desktop, tablet, mobile)
4. Multiple environments (local, Citrix, cloud)
5. Multiple platforms (Windows, Linux, Mac, Android, iOS)
6. Hardware configuration requirements (system configurations required for server and client machines). Note: hardware requirements will be provided by the vendor for their testing software. The tool must work with NRM applications, which use an Oracle database and Oracle Fusion Middleware (application/web tier), run on Solaris servers, and are integrated with SiteMinder (eAuth) and the OID directory; these applications are accessed from desktops, Citrix blades, and ESRI client tools.
7. Network requirements (bandwidth requirements for both server and client machines). Note: the testing software should have minimal network bandwidth overhead.

This request intends to gather information from potential solution providers on the opportunity to utilize and/or leverage existing solutions, best practices, and recommended approaches. We appreciate your response. This is not a request for proposals. Responses will be considered in the context of your choosing and will help inform decisions about the future approach. Respondents are encouraged to address the following elements in their response:
1. Essential features that should be considered in the user interface
2. Core elements that should be considered for geospatial capabilities
3. Authentication/security features relevant to Federal requirements
4. Best practices for developing/enabling mobile use
5. Any risk factors, including financial, lifecycle, quality, and cost/future-cost factors, and any other strategies to consider in implementing an automated testing tool
6. Any best practices for acquiring an automated testing tool

4. QUESTIONS: Questions regarding this notice shall be submitted in writing, via email only. Please submit your responses to David Watson (davidwatson2@fs.fed.us) and Robert Hinton (rhinton@fs.fed.us) by 4:00 pm Mountain Standard Time (MST) on 10 January 2017. Questions will be answered by posting the answers to the FBO website. The Government does not guarantee that questions received after the time/date specified above will be answered. Note: please clearly mark any information that is proprietary. The Government will not reimburse respondents for any costs incurred in the preparation of a response to this notice.

5. SUMMARY: This is a Request for Information (RFI) only, issued to identify sources that can provide the information stated above. The information provided in the RFI is subject to change and is not binding on the Government. The FS has not made a commitment to procure or to take any recommended approach. None of the items discussed, nor the release of this RFI, should be construed as such a commitment or as authorization to incur costs for which reimbursement would be required or sought. The FS is not seeking elaborate replies, but rather concise, meaningful responses from vendors with the required expertise. Responses are limited to 15 single-sided 8.5" x 11" pages (a title page or table of contents does not count against the page limit). All submissions become Government property and will not be returned.
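For illustration only (not part of the RFI text): a minimal sketch of the kind of cross-browser playback and geospatial image-comparison check described in the capability and technical requirement lists above. It assumes Selenium WebDriver and the Pillow imaging library purely as stand-ins for whatever tool a vendor might propose; the application URL, baseline file, and difference tolerance are hypothetical placeholders.

"""Illustrative sketch only -- not part of the RFI. Shows cross-browser
playback with image-compare validation of geospatial output, using
Selenium WebDriver and Pillow as stand-ins for a vendor tool.
All URLs, file names, and thresholds below are hypothetical."""
from selenium import webdriver
from PIL import Image, ImageChops

BASELINE = "baseline_map.png"   # approved graphical baseline for the map output
CAPTURE = "current_map.png"     # screenshot captured during this test run
APP_URL = "https://example.invalid/nrm-app/map"  # placeholder application URL

def capture_map(driver):
    """Open the map page and screenshot the rendered geospatial output."""
    driver.get(APP_URL)
    driver.save_screenshot(CAPTURE)

def images_match(baseline_path, capture_path, tolerance=0.01):
    """Compare the capture against the baseline; pass if the fraction of
    differing pixels is below the tolerance."""
    baseline = Image.open(baseline_path).convert("RGB")
    capture = Image.open(capture_path).convert("RGB")
    if baseline.size != capture.size:
        return False
    diff = ImageChops.difference(baseline, capture)
    changed = sum(1 for px in diff.getdata() if px != (0, 0, 0))
    return changed / (diff.width * diff.height) <= tolerance

if __name__ == "__main__":
    # Run the same scenario in each supported browser (Technical Requirement 2a above).
    for browser in (webdriver.Firefox, webdriver.Chrome):
        driver = browser()
        try:
            capture_map(driver)
            assert images_match(BASELINE, CAPTURE), "geospatial output drifted from baseline"
        finally:
            driver.quit()

A vendor tool would typically replace the hand-written comparison above with its own baseline management, recording, and reporting features.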
Automated Testing Tool Solution(s) for the United States Forest Service, Natural Resource Manager (NRM)

This Request for Information (RFI) is for information purposes only and shall not be construed as a commitment or a promise of a contract by the Government. This is not a solicitation. This notice does not constitute an Invitation for Bid (IFB), Request for Quote (RFQ), or Request for Proposal (RFP), nor does it restrict the Government as to the ultimate acquisition approach.

1. PURPOSE: The US Forest Service (FS) Natural Resource Manager (NRM) is conducting market analysis with this RFI to obtain feedback and to further refine and identify "best-fit" automated testing tool/software solution(s). One area of emphasis is testing modernized and redesigned applications and, if possible, legacy applications as well. Additional information is identified below. The information collected through this RFI will be evaluated and will assist the FS in determining the appropriate acquisition approach/strategy going forward. Responding to this RFI does not automatically place the responder on any bidders list, if such a list is developed. Depending on the quantity and quality of the responses received, the criteria may serve to generate the initial pool of potential contractors to be invited to respond to a resulting future solicitation.

2. INTRODUCTION AND BACKGROUND: NRM supports 14,000+ Forest Service users with 65+ custom applications. These applications are used for upward reporting to Congress and senior management, and for information gathering and analysis. Currently, the business knowledge for functional testing is not harvested and reused. The testing team performs manual regression and sanity testing for monthly releases of these 65+ NRM applications. During each round of testing for a release, test cases are executed manually and are documented only if a bug is discovered or an error occurs. Additionally, at times a code fix for one bug or issue may break functionality that had passed in prior testing or prior releases. Because environments are regularly set up or changed, upgraded, patched, and cloned, NRM is seeking the most suitable and effective alternatives and solutions for obtaining and fulfilling the needs associated with automated testing software. Redesign/modernization efforts have also added a significant volume of testing, since these efforts implement entire applications rather than maintenance changes. With limited testing resources, an automation tool will assist in completing repetitive, time-consuming tasks and meeting deadlines. The software should include some type of feature that allows non-programmers (the testing team) to generate test cases without having to write scripts.

3. OVERVIEW OF CAPABILITIES: Projected/required capabilities for automated testing software include, by priority, the following:

Functionality and Usability Requirements:
1. Regression, load, and performance testing of NRM applications, both tabular and geospatial, including validation of geospatial output by image comparison, metadata analysis, or another appropriate method
2. Ease of recording or scripting testing scenarios
3. Intuitive playback and test suite implementation
4. Powerful reporting, preferably with a graphical interface
5. Ease of maintaining and upgrading the automation suite, including graphical baselines for geospatial applications under test
6. Test traceability - linking requirements to functional testing
7. Test result log for error messages, warnings, etc.
8. Bug reporting - integration with TeamForge
9. Exporting test results to various formats (Excel, XML, HTML, etc.; an illustrative sketch follows the QUESTIONS section below)
10. Effective accessibility and Section 508 compliance (optional)
11. Free trial period and available onboarding support
12. Test case generation for non-programmers
13. Third-party web site authentication

Technical Requirements:
1. Address interoperability of the testing tool with authentication (eAuth (SiteMinder) and Oracle Internet Directory (OID))
2. Modular integration with existing dependent systems:
   a. Cross-browser support (Internet Explorer, Chrome, or Mozilla Firefox)
   b. Natural Resource systems: i. Java; ii. .Net; iii. Oracle Forms (optional); iv. Oracle Reports; v. Jasper Reports; vi. Cognos Reports; vii. Geographic Information System interface (ESRI)
3. Flexible handling of multiple devices (desktop, tablet, mobile)
4. Multiple environments (local, Citrix, cloud)
5. Multiple platforms (Windows, Linux, Mac, Android, iOS)
6. Hardware configuration requirements (system configurations required for server and client machines). Note: hardware requirements will be provided by the vendor for their testing software. The tool must work with NRM applications, which use an Oracle database and Oracle Fusion Middleware (application/web tier), run on Solaris servers, and are integrated with SiteMinder (eAuth) and the OID directory; these applications are accessed from desktops, Citrix blades, and ESRI client tools.
7. Network requirements (bandwidth requirements for both server and client machines). Note: the testing software should have minimal network bandwidth overhead.

This request intends to gather information from potential solution providers on the opportunity to utilize and/or leverage existing solutions, best practices, and recommended approaches. We appreciate your response. This is not a request for proposals. Responses will be considered in the context of your choosing and will help inform decisions about the future approach. Respondents are encouraged to address the following elements in their response:
1. Essential features that should be considered in the user interface
2. Core elements that should be considered for geospatial capabilities
3. Authentication/security features relevant to Federal requirements
4. Best practices for developing/enabling mobile use
5. Any risk factors, including financial, lifecycle, quality, and cost/future-cost factors, and any other strategies to consider in implementing an automated testing tool
6. Any best practices for acquiring an automated testing tool

4. QUESTIONS: Questions regarding this notice shall be submitted in writing, via email only. Please submit your responses to David Watson (davidwatson2@fs.fed.us) and Robert Hinton (rhinton@fs.fed.us) by 4:00 pm Mountain Standard Time (MST) on 6 January 2017. Questions will be answered by posting the answers to the FBO website. The Government does not guarantee that questions received after the time/date specified above will be answered. Note: please clearly mark any information that is proprietary. The Government will not reimburse respondents for any costs incurred in the preparation of a response to this notice.
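For illustration only (not part of the RFI text): a minimal sketch of the kind of multi-format result export described in Functionality and Usability Requirement 9 above (Excel, XML, HTML, etc.), here writing a JUnit-style XML file and a simple HTML summary. The result records and file names are hypothetical, and a vendor tool would normally provide these exports natively.

"""Illustrative sketch only -- not part of the RFI. Shows exporting
test results to XML and HTML; the result records and file names
below are hypothetical."""
import xml.etree.ElementTree as ET
from html import escape

# Hypothetical results collected by a playback engine.
results = [
    {"name": "login_via_eauth", "status": "pass", "message": ""},
    {"name": "map_extent_regression", "status": "fail", "message": "output drifted from baseline"},
]

def export_xml(results, path="results.xml"):
    """Write results as a JUnit-style <testsuite> so dashboards can ingest them."""
    suite = ET.Element("testsuite", name="nrm-regression", tests=str(len(results)))
    for r in results:
        case = ET.SubElement(suite, "testcase", name=r["name"])
        if r["status"] == "fail":
            ET.SubElement(case, "failure", message=r["message"])
    ET.ElementTree(suite).write(path, encoding="utf-8", xml_declaration=True)

def export_html(results, path="results.html"):
    """Write a minimal HTML table summarizing pass/fail per test case."""
    rows = "".join(
        f"<tr><td>{escape(r['name'])}</td><td>{r['status']}</td>"
        f"<td>{escape(r['message'])}</td></tr>"
        for r in results
    )
    with open(path, "w", encoding="utf-8") as fh:
        fh.write(f"<table><tr><th>Test</th><th>Status</th><th>Detail</th></tr>{rows}</table>")

if __name__ == "__main__":
    export_xml(results)
    export_html(results)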
5. SUMMARY: This is a Request for Information (RFI) only, issued to identify sources that can provide information regarding automated testing tool solution(s). The information provided in the RFI is subject to change and is not binding on the Government. The FS has not committed to adopting a solution, to procuring, or to taking any recommended approach without further analysis. None of the items discussed, nor the release of this RFI, should be construed as such a commitment or as authorization to incur costs for which reimbursement would be required or sought. The FS is not seeking elaborate replies, but rather concise, meaningful responses from vendors with the required expertise. Responses are limited to 15 single-sided 8.5" x 11" pages (a title page or table of contents does not count against the page limit). All submissions become Government property and will not be returned. Please provide the following general information:
1. Company information (name, address, and DUNS number)
2. The automated testing tool solution that is recommended
3. Feedback addressing the information stated above
4. Current contract vehicles that might be a good fit, such as GSA Schedules, BPAs, and GWACs (NASA SEWP, NIH NITAAC), of scope, size, and complexity similar to contract vehicles you currently hold or have held within the last three years; describe the capabilities, features, and benefits of those similar contracts and your experience, and the pricing approaches that have been used in your federal contracts
5. An indication of whether you are or are not a small business, a small disadvantaged business, an 8(a) firm, HUBZone certified, woman-owned, or in any other socioeconomic category
6. If possible, an indication of the performance requirement standards in the contracts your company currently holds
7. Any additional information that may be helpful to the FS
 
Web Link
FBO.gov Permalink
(https://www.fbo.gov/spg/USDA/FS/WOAQMITS/AG-7604-S-17-0001/listing.html)
 
Record
SN04365760-W 20170108/170106234237-0f587dc3f5ba0013689a04be2844bd8b (fbodaily.com)
 
Source
FedBizOpps Link to This Notice
(may not be valid after Archive Date)
