Loren Data's SAM Daily™

fbodaily.com
SAMDAILY.US - ISSUE OF JULY 18, 2020 SAM #6806
SOURCES SOUGHT

R -- Department of State Digital Minimum Viable Product (MVP) Robotic Process Automation (RPA)

Notice Date
7/16/2020 11:33:14 AM
 
Notice Type
Sources Sought
 
NAICS
5415 — Computer Systems Design and Related Services
 
Contracting Office
ACQUISITIONS - AQM MOMENTUM WASHINGTON DC 20520 USA
 
ZIP Code
20520
 
Solicitation Number
19AQMM20N10001
 
Response Due
7/27/2020 7:00:00 AM
 
Archive Date
08/11/2020
 
Point of Contact
Katherine Lugo
 
E-Mail Address
weakley-lugokf@state.gov
 
Small Business Set-Aside
8AN 8(a) Sole Source (FAR 19.8)
 
Description
INSTRUCTIONS TO INTERESTED PARTIES

The Government is hoping to identify vendors that can apply mature digital services to federal agencies. As part of this process, this notice seeks information from vendors to determine which companies have the skills and interest to support this effort. The capabilities needed in support of this requirement are centered on: Agile Development, Design, API Development, Designer, and Front/Back End Development.

Please note, this is a Request for Information (RFI) and Sources Sought. Responses are for informational purposes only; this IS NOT a Request for Quotes (RFQ). Please see the Statement of Objectives for specifics regarding the purpose of this RFI. The Government may engage vendors that respond to this notice by seeking further information about capabilities, or even an 8(a) direct award.

Answers to Questions in Section 2 are due NLT July 7, 2020 at 11:00am, in the format provided in Section 2, to Katherine Lugo at weakley-lugokf@state.gov. All interested parties should be aware that contractor staff may review RFI responses; anything proprietary should be marked as such and sent in a separate document.

SECTION ONE: DRAFT STATEMENT OF OBJECTIVES

OBJECTIVE

The objective of this requirement is to use digital service techniques to identify and solve a core user need through the development and release to end users of a Digital Minimum Viable Product (MVP). This Statement of Objectives (SOO) is predicated on the fact that Department of State (DOS), Bureau of Information Resource Management (IRM), Operations (OPS), Office of Consolidated Customer Support (CCS) stakeholders understand User Experience (UX) design thinking and/or agile software development.
For example, necessary job responsibilities within DOS/IRM/OPS/CCS have been redefined according to agile and/or human-centered design roles, and stakeholders are comfortable enough with modern technology concepts to start delivery of a digital service product by developing an MVP. In order to achieve this goal, quality partners must be able to establish and execute a framework for user-centered design, usability research, and agile implementation.

OUTPUTS

The resulting contract will be considered successful when the following outputs have been delivered:
- Eight automated processes, based on the eight CCS user stories identified in this document, using UiPath software.
- Identification of eight new help desk user stories for future automation.
- Key help desk automation and usability metrics.
- Efficiency report based on success metrics.
- Develop and deploy an initial digital version of the product in the development and production environments, respectively.
- Conduct usability research with end users to determine whether the success criteria were met.
- Determine and document what is necessary to create the levels of efficiency through the solution(s), if successful, or a plan to pivot to a different solution if unsuccessful.

PROBLEM STATEMENT

CCS seeks ways to automate routine processes in a user-friendly and intuitive manner to reduce manual work performed by humans, thus eliminating redundancy and rework. In order to efficiently and effectively meet key business objectives in the DOS organization, CCS advocates for the design, co-creation, and implementation of automation that will provide value to stakeholders, customers, and users. CCS requires a project cost analysis, including cost avoidance, for each user story.

Four critical user stories have been identified to create process efficiencies with the goal of reducing the "time to resolve." These user stories are currently completed manually:
- Data Transfer Request
- Password Reset Request
- Data Access Request
- User Account Disabled Request

Four additional user stories include:
- Testing web-based systems to ensure they are running
- Follow-up emails for customer engagements
- Automation of account disablement processes and subprocesses
- Distribution List Modification

These user stories have been identified to increase help desk efficiency. Most activities of the help desk are completed manually.

BACKGROUND

CCS is modernizing and consolidating DOS help desks/service desks. Robotic Process Automation (RPA) is seen as a promising technology that can be implemented to gain efficiencies in existing processes and mitigate the need for additional labor as CCS consolidates 19 disparate help desks/service desks. Ideally, this MVP pilots the new technology with an Industry Partner versed in both RPA and help desk modernization. If successful, the user stories listed above, as well as new user stories, will be used to further RPA improvements as help desks are collapsed into a unitary service desk.

SCOPE

The MVP consists of four 30-day Sprints, consulting with CCS Help Desk Subject Matter Experts (SMEs), Pilot Users, and Testing and Deployment Engineers. These experts will be made available as needed to help the MVP succeed.

Functional Area 1: Discovery and Design

Objective: CCS SMEs and the Industry Partner will collaborate to determine how the problem will be solved through digitalization, utilizing user-centered design techniques and agile product management methods to create an approach to making the Help Desk more efficient, codified through metrics. The appropriate metrics will be measured at the beginning of the project and after the pilot users have used the resulting attended bots for an iteration. Any corrections to the approach will be documented and used to revise the user stories for the next iteration.

Tasks:
- Conduct user story discovery to determine a digital solution to the agency problem and end user goals for this activity.
- Create hypothesis statement(s) that can be tested by the release of the MVP.
- Create a backlog of user stories in addition to those provided by CCS and listed in the SOO.
- Codify the objectives in metrics used to measure improvements in Help Desk efficiency.

The successful solution, at a minimum, shall also include the following best practices as highlighted in the digital service playbook:

Understand what stakeholders need

The contractor shall begin this digital service project by exploring and pinpointing the needs of the process operators who will use the service, and the ways the service will fit into their daily routine. The vendor shall continually test the products with end users to ensure delivery is focused on meeting their needs. In delivery of this effort the contractor shall:
- Early in the project, engage with current and prospective users of the service (SMEs and the pilot team).
- Use a range of qualitative and quantitative research methods to determine people's goals, needs, and behaviors.
- Develop and test mockups or prototypes of solutions with real people, in the field if possible.
- Document the findings about user goals, needs, behaviors, and preferences.
- Share findings with the team and agency leadership.
- Create a prioritized list of tasks the user is trying to accomplish, also known as "user stories," to be utilized in the agile development process.
- During development, continuously user-test the MVP to ensure it meets user needs.
- Measure the resulting efficiencies with the pilot team and project those efficiencies onto the entire help desk to show the result of using bots.
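The cost analysis called for in the problem statement, cost avoidance per user story with pilot results projected onto the full help desk, can be illustrated with a short sketch. This is a minimal example, not the required deliverable: the story names come from the SOO, but every ticket count, handling time, the labor rate, and the pilot-share figure are hypothetical placeholders.

```python
# Illustrative sketch only: per-story cost avoidance and a help-desk-wide
# projection. All numeric inputs below are hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class StoryMetrics:
    name: str
    monthly_tickets: int        # ticket volume for this user story
    manual_minutes: float       # avg handling time before automation
    automated_minutes: float    # avg residual human time with the bot

HOURLY_LABOR_RATE = 45.00       # hypothetical fully loaded rate, USD

def cost_avoidance(story: StoryMetrics) -> float:
    """Monthly labor cost avoided by automating one user story."""
    saved_minutes = (story.manual_minutes - story.automated_minutes) * story.monthly_tickets
    return saved_minutes / 60 * HOURLY_LABOR_RATE

pilot_stories = [
    StoryMetrics("Password Reset Request", 1200, 12.0, 2.0),
    StoryMetrics("Data Transfer Request", 300, 25.0, 6.0),
    StoryMetrics("Data Access Request", 450, 18.0, 4.0),
    StoryMetrics("User Account Disabled Request", 600, 15.0, 3.0),
]

pilot_total = sum(cost_avoidance(s) for s in pilot_stories)

# Project pilot-team savings onto the whole help desk, assuming the pilot
# handles a known fraction of total ticket volume (placeholder: 10%).
PILOT_SHARE = 0.10
projected_total = pilot_total / PILOT_SHARE

for s in pilot_stories:
    print(f"{s.name}: ${cost_avoidance(s):,.2f}/month avoided")
print(f"Pilot total: ${pilot_total:,.2f}/month; projected help-desk-wide: ${projected_total:,.2f}/month")
```

In practice the before/after handling times would come from the baseline and post-iteration measurements described under Functional Area 1, not from assumed values.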
Functional Area 2: Development of Minimum Viable Product

Objective: Using agile software development methods, a minimum viable product will be developed and released to end users.

Tasks:
- Utilize an agile development method to write and manage epics, user stories, and acceptance criteria.
- Conduct iteration retrospectives, release planning, backlog grooming, and other common activities associated with iterative design and agile methodologies.
- Implement best-practice methods for automated testing and code reviews.
- Work with DOS/IRM/OPS/CCS leadership to develop or manage continuous integration, code management processes, security, 508 compliance, privacy, or any other agency policies that need to be incorporated in order to release the product into the production environment.
- Use the CCS-provided UiPath development environment in the cloud to build the solution up to the point where it integrates with other systems.
- Deploy the system to the OpenNet production environment for DOS/IRM/OPS/CCS testers and pilot users.
- Test the solution with end users and conduct user research activities.
- Create and track metrics to determine whether the delivered solution meets end users' needs and solves the agency problem statement to some degree.
- As a result of the feedback, provide recommendations on what should be modified in future versions of the product.

The successful solution, at a minimum, shall also include the following best practices as highlighted in the digital service playbook:

Address the whole experience, from start to finish

The contractor shall holistically account for the different ways people will interact with our services, including the actions they take online, through a mobile application, on a phone, or in person. Every encounter, whether it is online or offline, should move the user closer towards their goal. In delivery of this effort the contractor shall:
- Understand the different points at which people will interact with the service, both online and in person.
- Identify pain points in the current way users interact with the service and prioritize these according to user needs.
- Design the digital parts of the service so that they are integrated with the offline touch points people use to interact with the service.
- Develop metrics that will measure how well the service is meeting user needs at each step of the service.

Make it simple and intuitive

Successful delivery of this contract requires that the services and products delivered not be stressful, confusing, or daunting. Therefore, the contractor shall build and release a digital MVP that is simple and intuitive enough that users succeed the first time, unaided. In delivery of this effort the contractor shall:
- Use a simple and flexible design style guide for the service.
- Use UiPath best practices to build production-quality bots for the pilot team.
- Use the design style guide consistently for related digital services.
- Give users clear information about where they are in each step of the process.
- Follow accessibility best practices to ensure all people can use the service.
- Provide users with a way to exit and return later to complete the process.
- Use language that is familiar to the user and easy to understand.
- Use language and design consistently throughout the service, including online and offline touch points.

Use data to drive decisions

At every stage of the project, the contractor shall measure how well our service is working for our users. This includes measuring how well a system performs and how people are interacting with it in real time. These metrics shall be reported to the DOS/IRM/OPS/CCS Program Managers to find issues and identify which bug fixes and improvements should be prioritized.
Along with monitoring tools, a feedback mechanism should be in place for people to report issues directly. In delivery of this effort the contractor shall:
- Monitor system-level resource utilization in real time.
- Monitor system performance in real time (e.g., response time, latency, throughput, and error rates).
- Track concurrent users in real time, and monitor user behaviors in the aggregate to determine how well the service meets user needs.
- Provide metrics which may be published internally.
- Provide metrics which may be published externally.

Functional Area 3: Retrospective: What worked, what didn't, what to do next

Objective: Following each iteration, the DOS/IRM/OPS/CCS government and contractor team comes together to review the project and identify improvement actions.

Tasks:
- Conduct a project retrospective activity that analyzes data gathered during performance around goals, timeline, budget, major events, and successes or failures.
- Determine what roadblocks were mitigated and which ones still exist that need to be addressed.
- Provide or update the Product Backlog for scaling the MVP through continuous design and agile processes.
- Ensure all system documentation, user stories, acceptance criteria, and test scripts are finalized.

Example Iteration Plan:

Sprint 1 (30 days): Learn Environment and Business
- Meet with SMEs for the 4 existing user stories
- Identify Key Help Desk Automation Metrics
- Identify and Scope User Stories
- Develop the 4 existing user stories and have bots ready for test
- Learn Environment (FRAME and Production)
- Retrospective

Sprint 2 (30 days): Work Within Bot Development Lifecycle
- Meet with SMEs to identify 2 new User Stories
- Scope User Stories
- Test 4 user stories
- Receive Feedback from Test (Our Team) on 4 User Stories; provide any rework
- Develop and deliver 2 new bots
- Deploy 4 Bots
- Retrospective

Sprint 3 (30 days): Optimize the Business
- Identify 2 New User Stories to reduce the SDA human interaction by 10 percent, in conjunction with Pilot Team/SMEs
- Identify and Develop a Backlog of 11 high-value user stories
- Test 2 user stories
- Provide any Rework from Test Results
- Develop and deliver 2 bots
- Deploy 2 Bots
- Retrospective

Sprint 4: Wrap Up
- Identify 2 New User Stories to Continue Making the SDA's Job Easier
- Deliver Backlog of Remaining User Stories to Management
- Test user stories
- Deploy 2 Bots

Deliverables: Updated Concept of Operations, Lessons Learned Report, and Efficiency Report based on Help Desk Automation Metrics.
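The real-time performance metrics named under "Use data to drive decisions" (response time, latency, throughput, and error rates) could be aggregated along these lines. This is a hedged sketch only: the request-log record format and all values are hypothetical placeholders, and a production setup would use whatever monitoring tooling the contractor and CCS select.

```python
# Illustrative sketch only: aggregating response time, throughput, and
# error rate over a window of request records. Record format is assumed.
import math
from statistics import median

# Each record: (duration in milliseconds, success flag), e.g. from bot runs.
request_log = [
    (120, True), (95, True), (310, False), (88, True),
    (150, True), (60, True), (240, True), (500, False),
]
WINDOW_SECONDS = 60  # aggregation window used for throughput

durations = sorted(d for d, _ in request_log)
errors = sum(1 for _, ok in request_log if not ok)

# p95 via the nearest-rank method: smallest value covering 95% of samples.
p95_index = min(len(durations) - 1, math.ceil(0.95 * len(durations)) - 1)

metrics = {
    "median_response_ms": median(durations),
    "p95_response_ms": durations[p95_index],
    "throughput_per_s": len(request_log) / WINDOW_SECONDS,
    "error_rate": errors / len(request_log),
}
print(metrics)
```

Reported internally or externally, these aggregates would feed the prioritization of bug fixes and improvements that the notice assigns to the DOS/IRM/OPS/CCS Program Managers.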
 
Web Link
SAM.gov Permalink
(https://beta.sam.gov/opp/28db96ab45d54a108cf2cd8c18d13bda/view)
 
Place of Performance
Address: Washington, DC, USA
Country: USA
 
Record
SN05724154-F 20200718/200716230204 (samdaily.us)
 
Source
SAM.gov Link to This Notice
(may not be valid after Archive Date)
