Digital Information Technology Acquisition Professional (DITAP) Training Program
Content Map
The following content map contains all of the artifacts for the Digital Information Technology Acquisition Professional (DITAP) training program, along with an overview of the program. It does not, however, include the DITAP content hosted in the edX web-based learning management system.
Certification
The Federal Acquisition Certification in Contracting – Digital Services (FAC-C/DS) core-plus competency model identifies the minimum competencies required to specialize as a federally certified digital service acquisition expert. The specialization is based upon the technical competencies identified in this document. This specialization focuses on digital service acquisition knowledge necessary to award and administer IT and digital service acquisitions. This specialization differs from the 1102 series competency models in the following ways:
- As a FAC-C/DS core-plus specialization, this model focuses on those competencies that are specific to the acquisition of digital service supplies and services. It requires participation in an approved development program that includes classroom, self-study, skills assessment, practical application, and competency-based learning elements. There is no combination of individual courses that is equivalent to completing the development program.
- The 1102 series competency model is job-series and grade specific; the FAC-C/DS core-plus certification may be obtained regardless of job series.
Competencies
The competencies and corresponding performance outcomes related to the FAC-C/DS certification are listed below.
Competency | Performance Outcome
1. Digital Services in the 21st Century Government | Describe digital services in the 21st century, including what they are, who provides them, how they are delivered, and why they are important
2. Understand What You Are Buying | Determine the problem to be solved while effectively supporting and communicating with the customer and industry
3. How Do You Buy | Effectively use techniques for acquiring digital service solutions in your solicitation or acquisition strategy
4. Awarding & Administering Digital Service Contracts | Conduct and award digital service contracts, applying metrics and incentives appropriately and making proper course corrections when necessary
5. Leading Change as a Digital IT Acquisition Professional | Apply techniques to create a culture of innovation within your sphere that enables you and others to effectively lead and influence customers to the best solutions
6. Application of Skills | Apply techniques learned in the course through various activities
The FAC-C/DS core-plus competency model will be periodically reviewed to include updates or additional criteria. The program assesses students' knowledge, expertise, and maturity in each competency using a Bloom's Taxonomy-based model with four levels: aware, describe, act, and teach.
Background
The federal government's approach to building information technology has not kept up with how it is done in the commercial marketplace. Even when government IT program offices are ready to start buying digital services, purchasing professionals do not have the training needed to buy such services. Training exists for purchasing in general, and training exists for digital services, but there has not been sufficient training on how to purchase digital services.
To transform how the government builds and buys digital services, the Office of Federal Procurement Policy (OFPP) and the United States Digital Service (USDS) issued a challenge to industry in 2015 to create and pilot a training and development program that transforms how Federal Contracting Professionals procure digital services. The challenge sought to achieve three primary outcomes for Federal Contracting Professionals:
- Become digital service procurement experts;
- Become equipped with the knowledge necessary to be embedded within agency digital service teams to serve as a business advisor to the team, its customers, and its stakeholders; and
- Possess the knowledge to lead agency training, workshops, and consultations in order to expand digital service procurement expertise within their agency and the government.
To achieve these outcomes, the USDS created the Digital Information Technology Acquisition Professional (DITAP) training program. The pilot program was designed using agile learning design, which included (1) building instruction in segments (two-week iterations that comprise one-month releases), measuring it with regular assessments, and learning in an iterative fashion, and (2) fixing some performance objectives/instruction (60%) while allowing the remainder of the instruction (40%) to flex to address individual and cohort learning needs. Twenty-eight Federal Contracting Professionals graduated from the pilot program with their core-plus specialization in digital services.
Data were gathered from participants and the program faculty over the six-month program to document findings and recommendations from the pilot. Following the success of the pilot program, a second, improved version began in August 2016, considered the Minimum Viable Product (MVP) of the DITAP training program. The MVP was condensed from a six-month to a four-month program, with a break near the end of the fiscal year in September, when students' workloads at their respective federal agencies are heaviest.
How to Facilitate the Program
If you take on the challenge of running the DITAP program yourself, the information below will help you launch your own version. Later in this document you will also find specific recommendations related to the DITAP Pilot and, separately, the DITAP MVP. USDS anticipates hosting a workshop that will provide more specific recommendations.
Program Design
Initial Challenge Submission
Participant Recruitment
DITAP Application Instructions
Course-Level Assessments
Pre-Program
Summary of Program Pre-Assessment Results
Raw Data of Program Pre-Assessment Results
Item-Level Data of Program Pre-Assessment Results
Post-Program
In-Class Sessions
Orientation Session (Session 1)
Slides
SBA Case Study
SBA Procurement Case Study – Facilitator Notes
Oral Presentation Instructions
Instructions, Conditions, and Notices to Offerors
US Digital Service Discovery Sprint Part 1
US Digital Service Discovery Sprint Part 2
Session 2 (Release 2)
Slides
Facilitator Summary
Activities
Difficult Conversation Role Play
Session 3 (Release 3)
Slides
Facilitator Summary
Activities
Metrics Development Exercise
Session 4 (Release 4)
Slides
Facilitator Summary
Activities
Assessments
LDA Shark Tank Rating Forms
Additional Facilitation Material
Release 1
Iteration 1.A
Iteration 1.B
Assessments
Pre-Program Assessment and Release 1 Summary of Results
Release 2
Iteration 2.A
Iteration 2.B
Assessments
Release 2 Assessment Summary of Results
Release 3
Iteration 3.A
Iteration 3.B
Release 4
Iteration 4.A
Iteration 4.B
Assessments
Release 4 Assessment Summary of Results
Threaded Scenario
The MAP case study was used throughout the program as a threaded scenario. This fictitious case study represented the procurement lifecycle from requirements definition through contract award determination. As the participants progressed through the training program, they also progressed through the different stages of the procurement lifecycle in the case study. Specifically, they were required to complete the following activities over the course of the program:
- Analyze the digital services need
- Develop the acquisition strategy
- Create a Request for Quote
- Evaluate tradeoffs among vendor quotes
The threaded scenario included the following documents and content:
MAP Case Study Materials
Link to edX Iteration 3.B > MAP Case Study: Developing the RFP - Part 1
Link to edX Iteration 4.A > Activity: Blog Your Acquisition Package
Digital Services Acquisition Expert Facilitator Presentations
Implementing Agile in Government
The Product Vision
Live Digital Assignment
The Live Digital Assignment (LDA) is a group project, running throughout the program, in which participants take what they learn and apply it to a real digital challenge faced by an agency. The assignment builds teamwork and helps participants practice consultative, critical thinking, and problem-solving skills. It also requires participants to identify a relevant digital services challenge and then hypothesize, research, and prototype a digital services product, service, or tool to solve that problem.
Instructions
Link to edX Iteration 1.A > Live Digital Assignment: Task 1
Link to edX Iteration 1.B > Live Digital Assignment
Link to edX Iteration 2.A > Live Digital Assignment: The Product Vision
Link to edX Iteration 2.B > Live Digital Assignment: Demo Day 1
Link to edX Iteration 3.A > Live Digital Assignment: Developing a Plan for How to Test
Link to edX Iteration 3.B > Live Digital Assignment
Link to edX Iteration 4.A > Release 4 Live Digital Assignment Instructions
Coaching
Each Live Digital Assignment team was assigned a Digital Service Expert who served as its coach. The coach provided guidance to the LDA team throughout the project and also served as the team's champion. Coaches normally interacted with their respective LDA teams during in-class sessions and teleconference meetings, and each coach supported at least two LDA teams.
Applied skills badges chart
Bronze/Silver/Gold method/mindset
Participation
Shadowing
The shadowing assignment requires participants to find an opportunity to shadow a digital service implementation/delivery team. The assignment builds familiarity with modern design and development approaches and requires participants to engage directly with digital services teams to learn about their work. It also builds trust among acquisition professionals and customers across government and/or industry. Participants must shadow a product manager, developer, or other member of a digital services delivery team, within or outside of government, for at least 16 hours.
Instructions
Link to edX Course Introduction > Shadowing
Link to edX Iteration 1.A > Shadowing Reminder
Stakeholder Interviewing
The stakeholder interviewing activity requires participants to engage with stakeholders relevant to digital services through highly structured interviews. During the interviews, participants clarify their understanding of the issue, learn what motivates the interviewee, and try to convince the stakeholder to try something new. Participants are provided a template that helps them identify opportunities and challenges in working with stakeholders and establish a framework for the interviews. A separate template is provided to identify questions to ask stakeholders.
Link to edX Course Resources > Stakeholder Interests Template
Link to edX Course Resources > Stakeholder Interview Guide Template
Overall Grades
Live Digital Assignment Summary
Live Digital Assignment Final Scores
Facilitator Skill Sets and Estimated Level of Effort
To effectively facilitate the program, it is recommended that personnel with the following qualifications fill the program facilitator roles:
Each labor category below is listed with the estimated number of FTEs* and the years of experience and desired knowledge, skills, and abilities.

Facilitator: 0.36 FTE* (approximately 250 hours over the four-month MVP version of the program)
- 6-8 years of experience, with at least five years of facilitation experience in a variety of environments (i.e., both traditional classroom and virtual or asynchronous learning environments), on a variety of topics (i.e., both technical and soft skills), and with a variety of audiences (i.e., both junior and mid-level audiences). Preferred: a demonstrated track record of positive evaluations from participants.
- [If DiSC assessment is used] Experience debriefing the DiSC assessment
- Expertly shifts among multiple training delivery methodologies as needed to meet project-specific requirements, such as lecture, case study, leading small group exercises, or simulation

Digital Services Acquisition Expert Facilitator: 0.36 FTE* (approximately 250 hours over the four-month MVP version of the program)
- At least eight years of experience in the acquisition field, with at least four years of Federal government IT/digital services acquisition experience
- Actively participates in government digital services acquisition professional forums/organizations that provide up-to-the-minute awareness of emerging digital services acquisition trends and best practices
- Delivers instructional content in the classroom or an alternative learning environment, such as a synchronous or asynchronous web-based learning environment
- Draws upon subject-matter and domain expertise to provide insight into the digital services acquisition mission and challenges; asks thought-provoking and engaging questions of participants and challenges participants to apply new learning to real-world situations

Agile/Scrum Facilitator: 0.03 FTE* (approximately 20 hours over the four-month MVP version of the program)
- Certified in agile methodologies to facilitate agile instruction during orientation
- Demonstrated experience delivering agile instruction to non-technical audiences in ways that are understandable and relatable; expertly adjusts and responds to participants' incoming knowledge levels

Digital Services Expert: 0.05 FTE* (approximately 40 hours over the four-month MVP version of the program)
- At least five years of experience in the digital services field, with at least three years of experience as a developer involved with agile software development and/or the development of other digital services
- Actively participates in digital services professional forums/organizations that provide up-to-the-minute awareness of emerging digital services trends and best practices
- Demonstrated experience speaking on and/or delivering training on complex digital services topics to non-technical audiences in ways that are understandable and relatable; adjusts and responds to participants' incoming knowledge levels
*Note that the training vendor calculated the FTE estimates based on hours expended during the four-month MVP program. These estimates include the facilitators' involvement in select program design workshops, but they do not include estimates for the government digital services SMEs or for the instructional designers, assessment staff, and technologists needed to administer the program. In addition, these estimates are based on the current program design and will need to be revisited if and when the current (MVP) design is adjusted.
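For context, these figures are roughly consistent with a standard 2,080-hour work year, an assumption since the vendor's conversion basis is not stated: four months is about 2,080 x 4/12, or roughly 693 available hours, so 250 hours is about 0.36 FTE, 40 hours about 0.05 to 0.06 FTE, and 20 hours about 0.03 FTE.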
Pilot (Oct 2015 – Mar 2016): Lessons Learned and Recommendations
Data gathered following the DITAP Pilot suggest three broad categories of findings to be addressed (listed below). See the Digital Acquisition Pilot Report Final for more details. The following executive summary presents some of the key recommendations included in the report, many of which address multiple findings.
Category 1: Program Design and Instructional Strategies. Recommendations in this category include:
- Evolving the program over time. Continuing to use an agile approach to support program revisions. (adopted)
- Refining our assessment approach and process. Specific recommendations in this area include refining the pre-assessment and individual development plan (IDP) to target an ongoing individual and cohort development focus throughout the program and embedding the release assessments within the content of releases. (adopted)
- Adjusting the program outcomes. Revising the program outcomes to focus on knowledge, learning, and application over the six months of structured sessions and activities, since true expertise develops over a longer timeframe. In addition, restructuring the guided learning Office Hours and Iteration Retrospectives to align with program needs. (adopted)
- Ensuring messaging around outcomes is clear and consistent. Setting clear program expectations during Orientation and Iteration Planning Meetings and providing a clear purpose and rationale for how each session and activity fits into the structure of the program. (adopted)
- Building in more application-focused activities, such as case study responses, shadowing, live digital assignments, and other activities. Specific recommendations in this area include providing threaded examples/scenarios that allow participants to analyze and apply their learning in each phase of the acquisition lifecycle, recrafting the live digital assignment to focus on solving digital service challenges, employing “acq-a-thons” to provide participants with consultative experience, and requiring participants to identify and participate in digital services shadowing. (adopted)
- Embedding leading change content and change ambassador themes related to building participants' personal brand and network throughout the program. Specific recommendations in this area include systematic selection of guest speakers and providing more structure and emphasis on building participants' personal brand, network, and knowledge of available resources. (adopted)
Category 2: Learner Support Mechanisms. Recommendations in this category include:
- Continuing to use the two-pronged Capstone approach, but ensuring participants are made aware of it during Orientation. (adopted)
- Keeping participants’ managers aware of the program, but not involving them extensively. (partially adopted)
- Structuring the mentor experience such that mentors provide advice on participants’ learning and development and offer a network of relationships to call for just-in-time development and advice regarding real-world situations. (not adopted)
- Continuing to use badging to incentivize participation by adding features that create a more competitive experience for cohort participants. (adopted)
- Structuring discussion boards to have fewer topics/threads and encouraging participants to respond to each other. (not adopted)
Category 3: Program Administration. Recommendations in this category include having two primary facilitators during the program and developing integrated and automated portal analytics to track results.
Minimum Viable Product (Aug 2016 – Jan 2017): Lessons Learned and Recommendations
Overall Recommendations
Assessment data demonstrate that participants mastered content and achieved program objectives. Comparing the pre- and post-program surveys, for every topic the percentage of participants who were aware of, could describe, could act on, or could teach key program topics either increased or remained stable. Many participants reported they were equipped to take actions associated with program concepts, thus achieving the program goal of creating digital services acquisition professionals who have command of key concepts and can start to apply those concepts to their daily work situations.
The Capstone Skills Test administered at the close of the program showed that the majority of participants increased their comprehension and mastery of Release 2, 3, and 4 content. Release 1 was the only release for which a large number of participants saw a decrease in their scores from the Pre-Assessment to the Capstone Skills Test[1].
Next, we present findings and recommendations from the training vendor associated with each overall component of the program: program design, instructional strategies and content creation, delivery, assessment, and technology experience. The US Digital Service anticipates hosting a workshop where its specific recommendations will be shared.
Overall Program Design
In terms of the overall program design, the findings and recommendations include the following:
- Participants experienced “entry burden” as they struggled to master highly technical digital services concepts (especially in Release 1). To ensure incoming participants are ready for digital services content, and to minimize additional support required by facilitators and program staff, pre-requisites are recommended as requirements for program entry.
- This entry burden is believed to have been due to the sheer breadth of content to be covered. There was also not enough coverage of the digital services marketplace and too much focus on agile.
- Pre-requisites, which could be sourced from curated existing content and updated to keep pace with the changing market (e.g., Massive Open Online Courses [MOOCs]), can efficiently address this issue.
- Implementation ideas include requiring participants to provide online course completion certifications to program staff prior to Day 1, providing participants early access to the portal such that they can complete pre-requisites there, inviting alumni to host topic-specific webinars (i.e., mini-courses with knowledge assessments) participants must complete before Day 1, and requiring each participant to make a five-minute presentation on a topic they learned from pre-requisites to strengthen collective cohort knowledge and build camaraderie.
- Participants remarked that they didn’t have enough time to complete assignments and assessments. The training vendor recommended adjusting overall program length to allow for more reflection and iterative remediation during program delivery, possibly moving back to a six-month duration.
- The training vendor recommended allocating sufficient upfront time before program launch to review prior cohort outcomes and decide on administrative adjustments, as well as refresh program concepts that will not be changing, thus allowing the pre-assessment to measure more stable program content for each performance objective and permitting in-program time to be spent on the responsive components of the curriculum. Program staff found it challenging to have only one month to revise the pilot program before launching into the MVP.
- The training vendor believed that the training program had reached a maturation point with the MVP, with a stable set of comprehensive performance objectives. Their assumption was that the current MVP content may need to be refreshed before the next offering but that the performance objectives would largely remain stable.
- An alumni network was established following the MVP cohort to foster community and facilitate continued education. Participants expressed a strong desire to stay connected to one another and program leadership. Program leadership should organize alumni meetings quarterly or every other month.
Instructional Strategies and Content Creation
In terms of the instructional strategies and content creation, the findings and recommendations include the following:
- Participant feedback indicated that the use of the threaded case study (SBA) and scenario (MAP) provided useful application opportunities for participants. The training vendor also suggested the scenario be refreshed to reduce the focus on a COTS product so as to provide more flexibility.
- There was mixed feedback about the usefulness of blogging, as it was introduced mid-program. The training vendor recommended continuing the use of blogging but introducing it at the start of the program and better explaining how it can be used optimally.
- The live digital assignment (LDA) is a viable instructional method; however, success requires more upfront, realistic expectation-setting.
- Participants are more likely to maximize the value of the LDA if they are required to meet with their coach throughout the program.
- Participant feedback indicated that shadowing provided a useful opportunity for participants to gain valuable exposure to the digital services world. Program alumni could and should help cultivate shadowing opportunities for program participants.
- Participants should be required to hold home agency brown-bags, perhaps as one of the badging requirements. Participants could convene one brown-bag near the beginning of the program (e.g., "here is what I will learn") and a second later in the program (e.g., "here is what I learned"). These provide solid developmental experiences and serve to increase appreciation of digital services acquisition within agencies.
- Expanding content coverage beyond agile software development. Participants desired more focus on topics such as purchasing cloud and XaaS offerings. One option is to include these topics in the pre-requisites.
Delivery
In terms of the delivery, the findings and recommendations include the following:
- To maintain high-quality content and delivery, the training vendor recommended ensuring that digital services experts with current experience acquiring digital services are involved throughout the program. This expertise is critical both for the content/assessment components and for identifying and involving the appropriate guest speakers.
- Enhancing upfront communication to participants and their supervisors to ensure realistic expectations are formed regarding program time commitment and supervisory support.
- Providing regular communication to supervisors to ensure they keep pace with what participants are learning and can support them. Monthly newsletters, iteration emails, invitations to webinars and/or demo days, and bulletins could be sent to supervisors to raise their awareness.
- Guest speakers were well-received and should continue to be involved with small adjustments. Guest speakers need structured expectations for interactive presentations that necessitate high-participant involvement (e.g., 2-3 topics with discussion questions and examples of application to real-world situations), ample questions and answers, and opportunities for participants to practice new skills (rather than being lectured at). In addition, guest speakers should be provided with an overview of the program and specific information about what they should share with the participants in alignment with the performance objectives for the iteration/release in advance of their sessions.
Assessment
In terms of the assessment, the findings and recommendations include the following:
- The training vendor recommended retaining a consistent blend of knowledge- and application-focused assessment questions, across the release assessments and the Capstone Skills Test.
- Ensure assessment instructions emphasize how scenarios and questions are aligned with certain performance objectives. This may be solved by placing labels (related to objectives) next to assessment questions.
- Require the completion of assessments on their due date (or within a shorter "time window" after all release content is presented; see the "grace period" functionality available in Open edX) to allow for regular cohort-level analysis of the data to rapidly inform in-program modifications and remediation.
- Sharing correct answers for all assessment questions, taking into account the tradeoffs involved.
- Tweak selected assessment questions to align with the level of learning covered during various releases.
- To decrease participants’ confusion about the interrelationships between grading and badging, refine and clarify the grading approach and timing of grading.
- Consider using a set of release pre-assessments rather than both a program pre-assessment and release pre-assessments. The combined set of release pre-assessments may be compared with the Capstone Skills Test to produce pre- and post-program comparisons.
Technology Experience
In terms of participants’ technology experience, the findings and recommendations include the following:
- The portal user experience should evolve as the training program evolves. Specifically, there should be adjustments related to the discussion boards. Students mostly used discussion boards to comply with program requirements rather than to extend their learning. The discussion board requirement should be reduced to focus on fewer subjects, and/or the discussions should be limited to replying to others' posts as opposed to creating new discussions on the same subject. In addition, the current edX functionality should be changed to eliminate confusion about how to respond to others' discussions, or a discussion tool other than edX that is more intuitive for users should be adopted.
- Continue to include badges but refine the badging user experience. The badging approach was altered mid-program, and participants were still unsure whether their completed activities were captured. Also, the badging leaderboard should be visible from the beginning of each cohort, perhaps by showing only the top ten badging leaders rather than listing all participants. As a less desirable alternative, this information could be shared each time the cohort comes together (webinar or in person).
- The vendor had to gather badging completion data from the Badgr badging system and then track it in an Excel spreadsheet. This process should be automated and/or integrated into the edX Progress tab; a sketch of one possible automation appears after this list.
- Recorded walkthroughs of technologies, such as the badging system, may help to increase user understanding.
- Participants found it hard to locate materials on the portal given the number of iterations and activity sub-sections. Content should be organized differently within each iteration to reduce visual clutter in the left menu bar.
- The badging functionality is not currently synced with the Open edX default Progress tab/scoring functionality. Moving forward, the Progress tab should either be hidden (as it is now) or better synced with the badging approach.
- If there is continued use of a VPN for content development, the number of simultaneous users in the Open edX Studio (which is the program administrator “side” of the portal) should be limited. However, depending on the final hosting and security environment, this may not be a problem for the government.
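To illustrate the automation recommendation above, the following minimal Python sketch pulls badge awards from a Badgr-style REST API and writes a completion summary file, replacing the manual Excel tracking step. The API base URL, endpoint path, response field names, and the BADGR_ISSUER_ID/BADGR_TOKEN environment variables are assumptions for illustration only; the actual Badgr API interface and any edX Progress tab integration should be confirmed against current documentation.

    import csv
    import os

    import requests

    API_BASE = "https://api.badgr.io/v2"       # assumed API base URL
    ISSUER_ID = os.environ["BADGR_ISSUER_ID"]  # hypothetical: the program's issuer entity ID
    TOKEN = os.environ["BADGR_TOKEN"]          # hypothetical: an access token obtained out of band


    def fetch_assertions():
        """Return badge assertions (awards) for the program's issuer (assumed endpoint)."""
        resp = requests.get(
            f"{API_BASE}/issuers/{ISSUER_ID}/assertions",
            headers={"Authorization": f"Bearer {TOKEN}"},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json().get("result", [])


    def write_summary(assertions, path="badge_completions.csv"):
        """Write one row per badge award so staff no longer re-key completions by hand."""
        with open(path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["recipient", "badge_class", "issued_on"])
            for a in assertions:
                writer.writerow([
                    a.get("recipient", {}).get("identity", ""),
                    a.get("badgeclass", ""),
                    a.get("issuedOn", ""),
                ])


    if __name__ == "__main__":
        write_summary(fetch_assertions())

A summary file produced this way could feed a scheduled job that reports completion status back into the learning portal or the badging leaderboard, rather than being maintained by hand.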