SOURCES SOUGHT
A -- Request for Information: Literature Search and Summarization Tools
- Notice Date
- 9/11/2024 6:30:18 AM
- Notice Type
- Sources Sought
- NAICS
- 541715
- Research and Development in the Physical, Engineering, and Life Sciences (except Nanotechnology and Biotechnology)
- Contracting Office
- FDA OFFICE OF ACQ GRANT SVCS Beltsville MD 20705 USA
- ZIP Code
- 20705
- Solicitation Number
- 9112024
- Response Due
- 10/28/2024 8:00:00 PM
- Archive Date
- 11/12/2024
- Point of Contact
- Ian Weiss, Phone: 3017965728
- E-Mail Address
-
Ian.Weiss@fda.hhs.gov
- Description
- Request for Information: Literature Search and Summarization Tools

Purpose

THIS IS A REQUEST FOR INFORMATION (RFI) ONLY. This RFI is being issued in accordance with Federal Acquisition Regulation Part 10, Market Research. It is for information, planning, and market research purposes only and shall not be construed as either a solicitation or an obligation on the part of the Food and Drug Administration or its Centers. The RFI process will consist of a 45-day information solicitation phase. The FDA will use the results of the market research to determine whether small business sources are capable of satisfying the FDA's contracting goals based on responses to the RFI. Responders to this RFI may include commercial or not-for-profit organizations. FDA will consider teaming arrangements, partnerships, or joint ventures. FDA will not award a contract based on responses, nor will it pay for the preparation of any information submitted or for FDA's use of such information. Responses to this notice are not offers and cannot be accepted by the Federal Government to form a binding contract. Information obtained as a result of this notice may be used by the FDA for program planning on a non-attribution basis. Eligibility to participate in a future acquisition does not depend upon a response to this notice. Responses will be reviewed only by FDA personnel and will be held in confidence.

The purpose of this RFI is to support the FDA in gaining insight into the landscape of available commercial tools for assisting in the identification and summarization of medical literature pertinent to drug safety research. Of primary interest are tools leveraging artificial intelligence, such as language models or natural language processing (NLP) methodologies, to classify documents according to terms and keywords defined by the user. Tools with the added ability to summarize and extract information based on specific user instructions are also of interest.
Additionally, the FDA seeks insight into the tools' capacity to integrate with existing data systems, adapt to specific IT infrastructure requirements, and undergo rigorous testing to ensure the workflow compatibility, validity, reliability, and relevance of the results.

Background

The FDA Adverse Event Reporting System (FAERS) serves as the Agency's primary safety reporting mechanism, where Individual Case Safety Reports (ICSRs) are collected and used to identify and assess adverse drug reactions not detected during the drug development phase, as well as to enhance the understanding of known adverse drug reactions. A major limitation of FAERS is that reports frequently contain missing information or incomplete narratives of the clinical presentation of events, which often makes evaluating the true associations between drugs and adverse events challenging. This issue, among others, requires that safety evaluators use a variety of drug information resources to support the investigation of potential new safety signals. Among these resources, medical literature is invaluable in providing updated insight into significant developments in clinical practice that may relate to undetected or emerging safety concerns. However, challenges arise when confronting the vast array of publication databases and ad hoc literature libraries available for safety evaluators to query. This task is further complicated by the time-consuming nature of manually screening and extracting data from abstracts and full-text publications. There is considerable interest in utilizing artificial intelligence, particularly NLP, to assist in identifying and prioritizing publications relevant to specific research areas. Beyond the initial classification of relevant sources, the ultimate goal is for the tool to appropriately summarize and extract meaningful information from the identified publications, effectively streamlining, standardizing, and accelerating this component of safety surveillance.
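The screening-and-prioritization step described above can be illustrated with a minimal sketch. The code below is not any product referenced in this notice; the abstracts and the safety keywords are invented for illustration, and real tools would use far richer NLP (semantic matching, language models) rather than raw keyword counts.

```python
import re
from collections import Counter

def relevance_score(abstract: str, keywords: list[str]) -> int:
    """Count occurrences of user-defined keywords in an abstract
    (case-insensitive, whole-word matching)."""
    tokens = Counter(re.findall(r"[a-z]+", abstract.lower()))
    return sum(tokens[k.lower()] for k in keywords)

def rank_abstracts(abstracts: dict[str, str], keywords: list[str]) -> list[tuple[str, int]]:
    """Return (article_id, score) pairs sorted by descending relevance."""
    scored = [(aid, relevance_score(text, keywords)) for aid, text in abstracts.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# Hypothetical abstracts keyed by identifier; keywords chosen by a safety evaluator.
abstracts = {
    "A1": "Hepatotoxicity was observed after drug X exposure; liver enzymes rose.",
    "A2": "A randomized trial of drug Y for hypertension showed no hepatotoxicity.",
    "A3": "Pharmacokinetics of drug Z in healthy volunteers.",
}
ranking = rank_abstracts(abstracts, ["hepatotoxicity", "liver"])
```

Even this toy version shows why user-defined criteria matter: the ranking changes entirely with the keyword list, which is the flexibility the questions below probe.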
Objectives

The FDA seeks information on tools that leverage artificial intelligence to assist in the screening and identification of publications relevant to drug safety research. Additionally, there is high interest in tools equipped with NLP functionalities capable of processing, summarizing, and extracting information based on criteria, sets of keywords, or generative prompts provided by end users. Lastly, a crucial feature of these tools should be their ability to be tested for workflow compatibility, validity, reliability, and relevance.

Information Requested

Interested parties should provide responses to the following questions:

Implementation - The government is seeking a browser-based solution that would run in the vendor's environment. Please indicate whether the vendor's product is Section 508 compliant. Information on Section 508 compliance is available at https://www.section508.gov/manage/section-508-assessment/

Interoperability - To what capacity (if any) does the tool support connectivity and data sharing with other enterprise information systems (e.g., safety databases, analytical platforms, dashboards, Microsoft Office)? Does the tool provide connectivity through an application programming interface (API)?

Primary Data Sources - What scientific, academic, or scholarly data sources are used to provide results to user queries (e.g., Medline, Embase)? Does the data originate from a centralized repository or a decentralized, cloud-hosted environment? Is it possible to expand the library from which the tool queries information?

Supplementary Data Sources - To what capacity (if any) does the tool complement its primary data source with additional auxiliary databases or external datasets (e.g., stand-alone text corpora, social media data, online drug information databases, independent publication libraries)?

Search Flexibility - To what extent can users influence and tailor the criteria for article selection? Does the tool offer agile and adaptable methods allowing users to pinpoint conceptual themes within the text, whether through exact word matches, semantic concept associations, or conversational prompts (similar to those used in applications like ChatGPT and Google Gemini)?

Automation - What capabilities does the tool offer for automating routine and scheduled query execution? How customizable are these features, particularly in conforming to organizational templates that specify search criteria, content filtering, and results delivery? Additionally, to what extent can the tool's interface be tailored to accommodate individual user workflow preferences?

Text Summarization - To what extent (if any) does the tool provide a text summarization feature, and how can users intuitively trace returned summaries back to their source documents? Furthermore, are users able to extract and download article citations in a range of formats (e.g., RIS, EndNote)?

Results - How effectively can users explore results using features like text highlighting, sorting, filtering, and data visualization? Is it possible for users to be directed to the specific location in the original document where the information was found? To what extent can users obtain the full manuscript text or results in structured form via direct download, traditional export options (e.g., .csv, .xlsx, .pdf), or integration with other information systems?

Validation and Development - To what extent, and by which method, has the tool's performance been validated? Which benchmarks have been employed to measure the tool's accuracy, reliability, and relevance in delivering valid search and summarization results? How is continuous quality improvement implemented given current and evolving challenges in the domain of artificial intelligence?
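Validation benchmarks of the kind asked about here are commonly reported as precision and recall of a tool's screening decisions against a human-adjudicated reference set. A minimal sketch follows; the article identifiers and labels are invented for illustration and do not reflect any actual evaluation.

```python
def precision_recall(predicted: set[str], reference: set[str]) -> tuple[float, float]:
    """Precision and recall of a tool's 'relevant' set against
    a human-adjudicated reference set of relevant articles."""
    true_pos = len(predicted & reference)
    precision = true_pos / len(predicted) if predicted else 0.0
    recall = true_pos / len(reference) if reference else 0.0
    return precision, recall

# Hypothetical article IDs: flagged by a tool vs. adjudicated as relevant by reviewers.
tool_flagged = {"A1", "A2", "A4"}
adjudicated = {"A1", "A2", "A3"}
p, r = precision_recall(tool_flagged, adjudicated)
```

For safety surveillance, recall (not missing a relevant publication) is typically weighted more heavily than precision, which is one reason customized, use-case-specific testing matters.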
Would your organization be willing to conduct a series of customized tests designed by the intended user base to assess how well the tool fits specific use cases within the agency, and to provide access to the tool on a 30-90 day trial basis during the period leading up to and throughout application testing?

Strengths - What are the strengths of the tool in comparison to other generative AI tools (e.g., Perplexity, Consensus, Elicit)? Specifically, how is the tool uniquely suited to conducting literature searches and creating summaries relevant to surveillance and epidemiological research?

Limitations - What are the limitations of the data source in ensuring search results are reliable and reflective of the comprehensive availability of scholarly information on the topic of interest? Additionally, what shortcomings does the language model have in tracing information back to source documents and explaining the rationale on which the results are based?

Business Information

Please provide the following business information:
- DUNS Number and Unique Entity Identifier (UEI)
- Company Name
- Company Address
- Company Point of Contact, phone number, and email address
- Socio-economic status (e.g., small, 8(a), VOSB, SDVOSB, HUBZone, SDB, WOSB, etc.)
- Type of company under the North American Industry Classification System (NAICS), as validated via SAM; additional information on NAICS codes can be found at www.sba.gov

Any potential government contractor must be registered in SAM, located at http://www.sam.gov/index.asp.

RFI Submission Instructions

Responses shall be submitted to Bernice Nelson, Contract Specialist, via email at bernice.nelson@fda.hhs.gov, and to Ian Weiss, Contracting Officer, at ian.weiss@fda.hhs.gov. Please include the RFI number in the response. Do not send information that requires a non-disclosure agreement or sensitive business information.
Telephone inquiries will not be accepted or acknowledged, and no feedback or evaluations will be provided to companies regarding their submissions. Interested companies must submit responses to the questions, limited to 10 pages, not including appendices. Other than the cover letter, the response document(s) shall contain no watermarks, header or footer notations, or other markings identifying the organization. General capabilities statements will not be accepted. Responses will not be returned and will not be accepted after the due date. Note that the FDA reserves the right to contact one or more respondents if additional information is required or to request demonstrations of available systems.
- Web Link
-
SAM.gov Permalink
(https://sam.gov/opp/1d7565bfa518478e83cf94bad894844f/view)
- Place of Performance
- Address: Silver Spring, MD 20993, USA
- Zip Code: 20993
- Country: USA
- Record
- SN07207820-F 20240913/240911230126 (samdaily.us)
- Source
-
SAM.gov Link to This Notice
(may not be valid after Archive Date)