Loren Data's SAM Daily™

fbodaily.com
FBO DAILY ISSUE OF AUGUST 25, 2010 FBO #3196
SOURCES SOUGHT

U -- Fully-Automated Oral Proficiency Interview (OPI) Administration and Evaluation System

Notice Date
8/23/2010
 
Notice Type
Sources Sought
 
NAICS
611710 — Educational Support Services
 
Contracting Office
USAR Contracting Center - West (POM), Bldg 4385, Suite 2041, 400 Gigling Road, Seaside, CA 93955
 
ZIP Code
93955
 
Solicitation Number
W9124N10SS0003
 
Response Due
9/20/2010
 
Archive Date
11/19/2010
 
Point of Contact
Todd Bales, 831-242-4078
 
E-Mail Address
USAR Contracting Center - West (POM)
(todd.s.bales@us.army.mil)
 
Small Business Set-Aside
N/A
 
Description
Fully-Automated Oral Proficiency Interview (OPI) Administration and Evaluation System

FOR INFORMATIONAL PURPOSES ONLY. THIS IS NOT A SOLICITATION ANNOUNCEMENT. This sources sought notice is for information and planning purposes only and is issued for the purpose of market research in accordance with FAR Part 10. The Department of the Army, Contracting Center West, Monterey, CA is currently requesting information to determine the existence of viable large and small businesses that are interested in and capable of performing the work described herein. Responses are welcome from all business sources; this sources sought notice is not limited to any socioeconomic program and/or business set-aside.

As a result of this notice, the Government may issue a Request for Proposal; however, there is no solicitation available at this time. This notice is for planning purposes only and does not constitute an Invitation for Bids, a Request for Proposals, a Solicitation, a Request for Quotes, or an indication that the Government will contract for the items contained herein. It is not to be construed as a commitment on the part of the Government to award a contract, nor does the Government intend to pay for any information submitted as a result of this notice. The Government will not reimburse respondents for any costs associated with submission of the requested information, nor for expenses incurred by interested parties in responding to this notice. Any responses received will not be used as a proposal. This is not a request for proposals and in no way obligates the Government to award any contract.
OBJECTIVES

As the volume of requests for spoken foreign language testing has increased dramatically in recent years, and as new tester training and refresher training are increasingly costly, there is growing interest in developing an automated system for evaluating foreign language (FL) oral proficiency using the Interagency Language Roundtable (ILR) language proficiency scale. In particular, recent discussions regarding foreign language requirement policy for some government organizations have indicated the possibility of an unprecedented demand for high-volume OPI testing of lower-proficiency candidates, such as Level 0+ to Level 1 on the ILR scale. To respond to this potential testing requirement, the Defense Language Institute Foreign Language Center (DLIFLC) is conducting market research into developing a Fully-Automated Oral Proficiency Interview (OPI) Administration and Evaluation System for very low proficiency foreign language speakers.

DLIFLC's objective is to develop a system that satisfies all conditions for a high-stakes FL oral proficiency test measuring Non-Native Speakers' (NNS) proficiency, or communicative competence, in a target language. The developed system should be clearly distinguished from systems that yield only an estimate of one's FL oral proficiency, which are usually designed as low-stakes tests for screening, placement, or monitoring purposes. In this context, an NNS is any speaker whose primary language of use is English and who is learning a foreign language. The languages of interest will be determined from among those critical to current DoD missions.
The following languages, currently taught at DLIFLC, are examples: Modern Standard Arabic (MSA), Arabic Dialects (Levantine and Iraqi), Chinese, Japanese, Korean, Tagalog, Thai, Russian, Spanish, Serbian-Croatian, French, Italian, Portuguese, Hebrew, Farsi, Pashto, Dari, Turkish, Kurdish dialects (Kurmanji and Sorani), Hindi, Urdu, Uzbek, and Indonesian.

The requirements are divided into the following four aspects: Test Design; Validation; Test Delivery and Security; and Evaluation (Scoring).

TECHNICAL REQUIREMENTS

Test Design

The fully-automated OPI administration and evaluation system will assess very low-range proficiency. DLIFLC expects the range of proficiency levels for the automated system to be from ILR Level 0 to ILR Level 1. Level 0+ and Level 1 speakers are described as having the following features and abilities (visit http://www.govtilr.org/Skills/ILRscale2.htm for the full description):

Level 0+: Speakers are able to satisfy immediate needs using rehearsed utterances and communicate with memorized material consisting of words and phrases.

Level 1: Speakers do not usually speak unless spoken to. They make numerous mistakes and have a heavy accent, but they can create sentences. With Level 1 proficiency, they can participate in simple short conversations on familiar topics, can satisfy basic survival needs (such as travel arrangements and food and clothing needs), and can ask and answer simple questions.

A test designed for a Fully-Automated OPI Administration and Evaluation System should consist of tasks clearly relevant for examinees to demonstrate their proficiency, or communicative competence. Task types proposed by offerors will be vetted by government personnel before development work begins. The preference is to preserve as many tasks from the traditional OPI as possible, as permitted by the technology.
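The level-targeted tasks described above can be represented as simple structured records. The sketch below is illustrative only; the class, field names, and example prompts are assumptions of this sketch, not part of the notice's requirements:

```python
from dataclasses import dataclass

# Illustrative sketch: one way to represent a level-targeted OPI task.
# Field names are hypothetical, not specified by the notice.
@dataclass
class OPITask:
    ilr_level: str   # target level: "0", "0+", or "1"
    task_type: str   # e.g. "questions about date", "role-play survival situation"
    prompt: str      # instructions presented to the examinee

tasks = [
    OPITask("0+", "questions about date", "When is your birthday?"),
    OPITask("1", "examinee asks questions",
            "Ask the tester five questions about the tester's hobby."),
]
```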
DLIFLC's intention is to obtain a fully-automated assessment system equivalent or close to the face-to-face OPI with regard to the constructs of assessing general FL oral proficiency. The table below presents the task composition and some sample questions of the traditional face-to-face OPI by level.

Table 1: Composition of Traditional Face-to-Face OPI Tasks and Sample Questions

ILR Level 0+ (~15 min)
  Task Composition: Warm-Up (2-3 min); Simple Short Conversation; Questions about Date; Questions about Basic Colors; Questions about Basic Objects; Questions about Months; Questions about Clothing; Wind-Down
  Sample Questions: Questions about family, hobby, education, travel, occupation, etc. (warm-up); When is your birthday? When is Christmas? What date is it today?; What are the colors in the American flag? What color is your shirt? What color is your hair? What color is your car?; What do you have in your bedroom?

ILR Level 1 (~20 min)
  Task Composition: Warm-Up (5-6 min); Simple Short Conversation; Past Narration; Role-Play Survival Situation; Description; Examinee Asks Questions; Wind-Down
  Sample Questions: Questions about family, hobby, education, travel, occupation, etc. (warm-up); Simple Short Conversation: Do you have a sister? How old is your sister? What does your sister do? Where does she live? What is her hobby?; Role-Play Survival Situation: Do a role-play regarding booking a travel arrangement; Examinee Asks Questions: Ask the tester five questions about the tester's hobby.

In order to enhance the reliability as well as the robustness of a fully-automated system, it may be necessary to modify some of the face-to-face OPI tasks to fit a framed format so that the expected response remains at a manageable level. For example, in one recently-developed, computer-delivered but human-scored OPI testing system, the traditional two-party role-play task has been modified to a one-person role-play task. An example of such a task is: You need to call a restaurant to make a reservation for a group.
Ask the receptionist five questions in making the arrangement. The examinee is then required to record the responses within the given time. In addition, it is acceptable to include innovative, novel tasks that utilize the multimedia capabilities inherent in computer-delivered testing. All such tasks, however, MUST allow foreign language speakers to directly demonstrate their oral proficiency, or communicative competence, in a target language.

Validation

The newly-developed test must prove its validity and its comparability with the traditional ILR OPI. The offeror will be responsible for a validation study conducted using empirically and statistically sound methodologies. DLIFLC will not assist with data collection in validating the comparability between the new form of the test and the traditional ILR OPI; it is the offeror's responsibility to find a sufficient number of samples for the validation.

Test Delivery and Security

The fully-automated Oral Proficiency Interview (OPI) Administration and Evaluation System may be equipped with an Interactive Intelligent Dialogue Model to simulate the face-to-face interaction of the OPI. However, if an Intelligent Dialogue Model is used, it must demonstrate high robustness. If this is not practical, then it is recommended that a manipulation of the task design be used, as addressed in the Test Design section above. The test delivery system may be equipped with an adaptive feature: the capability to prompt an appropriate level of task based on the quality of the previous responses. The OPI administration and evaluation system must be web-based. The offeror should be technically able to control and protect Sensitive but Unclassified (SBU) data and Privacy Act data, and comply with all DoD, Army, and DLIFLC authentication and authorization policies.
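The optional adaptive feature described above (prompting a task level based on the quality of previous responses) could follow a simple up/down rule. This is only an illustrative sketch over the notice's target range; the quality threshold and level ordering are assumptions, not requirements:

```python
# Illustrative adaptive rule over the notice's target range (ILR 0 to 1).
# The 0.5 quality threshold is an assumption made for this sketch.
LEVELS = ["0", "0+", "1"]

def next_task_level(current_level: str, response_quality: float) -> str:
    """Step up one level after a good response, down one after a poor one."""
    i = LEVELS.index(current_level)
    if response_quality >= 0.5:
        i = min(i + 1, len(LEVELS) - 1)   # cap at the highest target level
    else:
        i = max(i - 1, 0)                 # floor at the lowest level
    return LEVELS[i]
```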
Evaluation (Scoring)

Speech Recognition: The ideal evaluation system must be equipped with state-of-the-art speech recognition technology that works thoroughly with Non-Native Speakers' (NNS) spontaneous speech, especially that of very low proficiency NNSs. To satisfy the content validity requirement, the speech recognition system should be able to perform lexical segmentation and recognition.

Evaluation: The fully-automated evaluation system must return a value, i.e., a level on the ILR scale, which will be 0, 0+, or 1. This holistic evaluation must be computed from features representing the content construct, i.e., the OPI rating factors: lexical control, structural control, delivery, global tasks and functions, texts produced, and sociolinguistic competence. Features that do not directly represent one's oral proficiency (e.g., reaction time) are discouraged from being part of the evaluation, even if they show statistical significance in predicting a reliable evaluation. The desired fully-automated evaluation system should be able to demonstrate content validity in the process of computing the final ILR evaluation. Offerors should note that this objective of maintaining construct validity may require the selective use of analysis results from well-known pattern-finding methodologies, such as machine learning and data mining. The contributing feature set, as well as each feature's contribution to the final evaluation, should clearly support the construct validity of the fully-automated OPI evaluation as an authentic oral proficiency assessment measure. Offerors should note that distinctive challenges are present in evaluating very low proficiency candidates: the desired evaluation system should be able to capture the fine distinction between comprehensible and incomprehensible speech.
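One way to read the holistic-evaluation requirement is as a mapping from the six named OPI rating factors to a single ILR level. The averaging rule, per-factor sub-ratings, and cut scores below are assumptions made for illustration only; the notice does not specify how the factors are to be combined:

```python
# The six OPI rating factors named in the notice. Per-factor sub-ratings on
# the same 0 / 0+ / 1 scale, and the averaging rule, are assumptions of this
# sketch, not a specification.
FACTORS = ("lexical control", "structural control", "delivery",
           "global tasks and functions", "texts produced",
           "sociolinguistic competence")
LEVEL_VALUES = {"0": 0.0, "0+": 0.5, "1": 1.0}

def holistic_ilr(factor_ratings: dict[str, str]) -> str:
    """Collapse per-factor ratings into one holistic ILR level (0, 0+, or 1)."""
    vals = [LEVEL_VALUES[factor_ratings[f]] for f in FACTORS]
    mean = sum(vals) / len(vals)
    # Hypothetical cut scores: snap the mean to the nearest scale point.
    return min(LEVEL_VALUES, key=lambda lvl: abs(LEVEL_VALUES[lvl] - mean))
```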
According to the ILR descriptions, NNSs with very low proficiency are expected to produce speech samples with heavy accents and many grammatical errors. In traditional OPIs at very low proficiency levels, examinees are not penalized as long as the response is comprehensible and of appropriate content for the prompt, despite any heavy accent or grammatical mistakes. This means that the recognition and analysis of speech output should be designed to determine the comprehensibility of responses rather than native-likeness. In addition, offerors should note that speech delivery features such as pauses, fillers, and intonation should be considered with caution when evaluating very low-range proficiency speech. For example, memorized formulaic speech by a Level 0+ NNS may have better delivery quality than a non-formulaic sentence formed by a Level 1 NNS.

CALL FOR INFORMATION

Any interested vendor should submit a Capability Statement demonstrating the professional expertise and technology to achieve the requirements specified above. The statement may include, but is not limited to, the following information:

i) Strategies or methodologies to achieve the requirements addressed above;
ii) Previous experience with similar projects;
iii) Development work on tests or testing systems;
iv) Any patented technology (including technology that is patent pending).

The statement may also include your firm's background and relevant qualifications or industry-specific awards. Developing a Fully-Automated Oral Proficiency Interview (OPI) Administration and Evaluation System requires a wide range of expertise, including Foreign Language Assessment, Applied Linguistics, Second Language Acquisition, Psychometrics, Natural Language Processing (NLP), Artificial Intelligence, and Information Technology (IT). Not all interested vendors may currently have all the required expertise.
If a vendor works in partnership with another vendor, the statement must indicate which party is responsible for providing which specific types of expertise, and must also indicate any previous collaborative experience among or between the vendors. This request is for INFORMATIONAL PURPOSES ONLY and does not constitute an implied contract with any vendor.

Contact Information

To submit a Capability Statement as outlined above, send information to:

Todd S. Bales
Contract Specialist
400 Gigling Road, Suite 2202
Seaside, CA 93955

or e-mail to: todd.s.bales@us.army.mil (place 'DLI - OPI Administration System' in the subject line)
 
Web Link
FBO.gov Permalink
(https://www.fbo.gov/notices/520d3cb82f7027b1731f4b0fb7be8391)
 
Place of Performance
Address: USAR Contracting Center - West (POM), Bldg 4385, Suite 2041, 400 Gigling Road, Seaside, CA
Zip Code: 93955
 
Record
SN02250376-W 20100825/100823234554-520d3cb82f7027b1731f4b0fb7be8391 (fbodaily.com)
 
Source
FedBizOpps Link to This Notice
(may not be valid after Archive Date)
