Technology Literacy Assessment Project (TLAP)

Proposal to Colorado Department of Education
Power Results Grant Program

Appendix B — Assessment Developments Elsewhere


Multiple States’ Reports

A TLF NCLB Update — NECC 07

Submitted by Len Scrogan

Boulder Valley School District



At the recent NECC Conference held in Atlanta in July, a significant number of sessions were dedicated to NCLB assessment themes. I attended every session except one by the Chicago Public Schools (because I was already familiar with their work), as well as a large, unpublished, ad hoc special interest session on this topic. In the absence of CDE leadership (due to recent budget cuts), I am bringing this information back to our state; please consider it in your district planning.


Critical Finding

NCLB Technological Literacy is likely to go away in the reauthorization because it is not a core content area and has no large, organized constituency behind it.


Three Common Misconceptions

Many technology leaders (state, district, local) continue to exhibit three common misconceptions about NCLB TL as they design an “8th grade test for computer proficiency.”


1st misconception: 8th grade. The law states by the end of the 8th grade, not in 8th grade. Schools with viable programs at earlier grades remain confused by the insistence on an 8th-grade exam, as opposed to a portfolio review or coursework spanning multiple grade levels.


2nd misconception: test. It is surprising how many school leaders think this legislation requires a multiple-choice, CSAP-like test. The law states “assess,” not “test,” which can certainly be accomplished through more authentic assessment avenues.


3rd misconception: computer proficiency is technological literacy. It is clear that most districts are narrowly defining technology literacy as computer proficiency. Information literacy, STEM and pre-engineering efforts, and 21st Century learning themes are not commonly considered (although they should be).


Interesting Findings

o        Michigan offers a 50-item multiple-choice test

o        One rural Georgia presenter uses a 36-question multiple-choice test

o        Cobb County, GA, uses a performance-based approach; many other districts in Georgia have been looking at performance-based systems

o        The Georgia Dept. of Ed. promotes various options to schools statewide, not a single option

o        A University of Pennsylvania presenter prefers a portfolio approach, which they are developing

o        One Georgia district has been working on a rubric-based assessment

o        North Carolina has had a test of technology skills in 8th grade since 1998, and it is now a graduation requirement. The past version had 70 multiple-choice questions plus a performance component. They are switching to a new online test built around a generic set of non-commercial applications.


Best New Models



TechYES has developed an integrated, project-based approach for meeting this NCLB requirement. They don’t like it when “testing and technology become the END”; they prefer that “technology should be a beginning.” Suffice it to say, they dislike multiple-choice or online assessment of technology skills and prefer more authentic assessment. They have developed a working model that is built on ISTE standards, involves peer mentoring, and can be adopted in almost any school setting.



The state of Florida has had an in-house, online FCAT test for years, so one might assume they would take the same approach with this NCLB requirement. Instead of beginning with student assessment, however, they started with teacher technology literacy: they created standards first, then piloted online, performance-based assessments. All teacher assessments were also aligned with statewide individual professional development plans. Only after this initial work was completed did they begin work on a student tool for technology literacy, again starting with standards. The student tool is based on five indicators:

o        Essential Operational Skills

o        Constructing and demonstrating knowledge

o        Independent Learning

o        Communication and Collaboration

o        Ethical, Legal and Safety Issues  


Field-tested in the fall, this tool is intended to be a FORMATIVE tool that all districts can use to measure student capabilities. While the state or districts can pull summative data from the tool for NCLB reporting purposes, its use in schools was designed to be fun for kids, visually pleasing, and useful for schools in determining where students stand. The hope is to create a tool that helps identify priorities for equity, funding, and formative learning, but not an accountability tool. Since the project was funded by an NSF grant, any state can obtain the test free of charge. It is Flash-based. One notion bears repeating: in the words of the state director, this test is “used not to produce test results or accountability, but to move the state forward.”


Notes from Stevan Kalmon and Dixie Good, taken from Internet reports, June 2008


Florida’s Student Tool for Technology Literacy


“The interactive and performance-based Student Tool for Technology Literacy (ST2L) is currently in development. The traditional, research-based procedures for instrument development are being followed. The team of developers consists of measurement experts who will build and evaluate items. The advisory group consists of education and technology experts throughout the state who will be continually evaluating items during the development process. The expert review panel has been working closely with the development team to make final revisions and decisions on indicators.


“The tool will be able to gauge students’ existing level of technology skills. Teachers will be able to use the tool to gather data on students’ current level of technology proficiency. Other applications include using the tool as a pre- and post-test in combination with classroom experiences to guide students’ technology skill acquisition. The tool will be field-tested near the end of the 2005/2006 school year, and it is anticipated to be available for use by all districts some time during the 2006/2007 school year.” (quoted material from the “Tool” webpage)



TechYES - Student Technology Literacy Certification Program

Produced by GenYes (accessed 6-11-08)

Grades 6-9

TechYES is an innovative way for schools and community organizations to offer a technology certification program to students in grades 6-9. As with all Generation YES products, students are at the center of the solution, backed by solid research and extensive resources.

In TechYES, students show technology literacy by creating projects that meet state and local technology proficiency requirements. As part of TechYES, a structured peer-mentoring program assists the teacher or advisor, and provides student leadership opportunities that serve to further strengthen the program and enrich the learning community.


Meets the NEW ISTE NETS Standards for Students

TechYES is a revolutionary program that provides middle school and after-school educators everything needed to offer students an authentic path to technology literacy certification. TechYES helps schools meet the ISTE NETS technology standards for students and satisfy the NCLB technology literacy requirement for eighth graders.


Project-Based Learning Designed for the Middle Years

TechYES is highly flexible, allowing schools and community organizations to choose materials and practices that suit their specific needs. Program materials have been designed specifically for middle school students, not watered down from adult vocational technology certifications.


Flexible Implementation Models

o        Technology classes

o        Integration into core classes

o        After-school programs

o        Community organizations

o        Clubs and homeschools


TechYES encourages all students to complete technology projects that are creative and personally involving. The projects can also meet requirements for core curriculum classes or community service. These projects are the basis for the TechYES evaluation and certification. The program includes all necessary resources: individual student guidebooks, customized teacher/advisor materials, handouts and resources, access to a fully interactive support website, and certificates of completion.


Based on extensive research, these materials are the basis for creating a self-sustaining program focusing on significant student leadership through peer mentoring. This student involvement, combined with a cost-effective, 3-stage certification process, moves all students towards technology competency even if a school cannot schedule a required technology class for all students.


Schools can offer the TechYES Student Technology Literacy Certification program as part of an existing technology class, integrated into any subject class, or after school.



KSDE Guidelines for 8th Grade Technology Literacy Assessment (9-26-06), published by Kansas State Department of Education; available at (accessed 6-8-08).

Recommended Curriculum

KSDE developed state technology standards (located within the Kansas Model Curricular Standards for Library Media & Technology) in 2006, based primarily on the NETS for Students. It is recommended that districts use this document as the primary resource and adapt it for developing benchmarks, indicators, and instructional activities at each grade at the local level.


Assessment Methods

The method of assessment used is determined at the local level. The assessment method can be:

o        knowledge based (test)

o        grades in a required 8th grade course

o        performance based (checklist/rubric)

o        e-portfolio based (collected over a period of years)

o        project based

o        combination of any of the above


These assessment methods can be used with each standard individually or clustered where it is appropriate. They can be done in content areas or they can be done as a stand-alone effort.



Blog comments from NECC ‘07: Assessing Student Technology Literacy, June 27th, 2007 by Jill [last name unknown]


Sylvia Martinez - Generation YES

Looking for authentic assessment with kids at the center of technology, called “TechYES.” Assessment is always the tail that wags the dog. Students guide the process (peer assessment), using criteria that match the ISTE NETS standards. Talks about sharing, writing, creativity, and project-based collaboration skills. Students should be using real technology for a real purpose, one that is personally meaningful. Authentic assessment is hard; it takes time and teacher focus. Working in a number of states. There is no single way to perform assessment; each school and grade may be different.


“The test means it’s over.” Technology literacy should open the doors, not indicate you are done.


Mia Murphy - NC Dept. of Juvenile Justice & Delinquency Prevention

Mia Murphy presentation


Kate Kemker - Florida Dept. of Education

Built their own Florida assessment, outsourced to a separate company. Years ago they created an inventory for teachers with performance-based assessment; skills performed are scored as the test progresses. Worked with researchers to develop standards. Broke it into six sections similar to NETS and came up with performance indicators to track proficiency in those areas. A survey made sure others agreed on the important issues. A pilot allowed for feedback from various teachers with different researchers (design and focus groups). Also involved the teachers’ union. Implemented, aligned with their professional development plans. Teachers could do the assessment at their leisure, taking different sections at different times if they wanted, then take their results into building their professional development.


The Student Tool for Technology Literacy was then developed, mirroring the same process using NETS. The framework has five sections: essential operational skills, constructing and demonstrating knowledge, independent learning, communication and collaboration, and ethical, legal, and safety issues.



“Tests of Tech Literacy Not Widespread Despite NCLB Goals,” by Scott Cech, in EdWeek, 1/30/08, pp. 1, 12; also available online.


“While that term has no universal definition, the core idea could be boiled down to this: Technologically literate students not only know how to operate hardware and software—they can also analyze the information flowing through it, evaluate that digital content’s relative merit and relevance, and use it creatively and ethically in communicating with others.”



Arizona Pioneers Statewide Measurement of Students’ Technology Literacy Skills, from Learning.com.


“Breaking new ground, Arizona is the first state in the U.S. to formally measure its students’ proficiencies with technology, using TechLiteracy Assessment (TLA) by Learning.com. A spring 2006 pilot program administered Learning.com’s online TLA instrument to more than 24,000 fifth and eighth graders statewide. And based on the results, similar numbers of Arizona grade 5 and 8 students will use TechLiteracy Assessment in the 2006-07 school year.” …


Arizona schools administered TechLiteracy Assessment over a span of about seven weeks. Students used their school’s computer labs to take the online test during a single class period. By all accounts, it was an easy process for all.


“Teachers who proctored the assessment noted that students seemed ‘very engaged’ by the online TLA test. One responded that she would’ve loved to have had a video camera with her to document the phenomenon. Another mentioned how amazingly quiet the computer lab was during the assessment.


“When taking the online test, students interact with assessment content in ways that allow them to demonstrate their proficiencies. Often, they must perform actions via simulations, rather than pick answers from among multiple choices. Thus, students must be able to format a paragraph, apply a spreadsheet formula, or conduct a database search. And they must demonstrate durable skills via generic menus and commands, not through brand-specific memorized shortcuts.”



Issue: The Online Assessment of Technology Literacy

Soapbox Executive Summary

The Online Assessment of K-12 Technology Literacy

IAETE (The Institute for the Advancement of Emerging Technologies in Education at AEL), (2001-2004)


“Though there was not wide agreement on the scope of the word technology, all panelists did share an understanding of literacy as something significantly beyond basic skills. Ripley has a particular interest in the higher order thinking that high-tech tools make possible. Observing that schools now focus more on developing student computer skills to support learning across curricula rather than studying about computers, Ripley asks,


“‘So, can we usefully ponder what subject will replace IT, and why? My view is that the subject must change its outlook from training students predominantly in the skills and capabilities that arise from the existence of personal computers. Instead we should look for the subject IT to concern itself increasingly with the growing range of technologies (mobile devices, blogs, video). It should concern itself with the uses to which technology is put (work, leisure, recreation, purchase). And it should concern itself with the facilities (or capabilities) that those technologies provide to students and adults (voice, visual communication, decision making, choice, responsibility).’


“Ripley's assessments take place in the virtual world of Pepford, where work assignments arrive via e-mail and a "walled garden" provides a virtual world of Web sites and applications. Writes Ripley,


“’The use of simulations is a key development. A simulator provides the context within which authentic assessment tasks can be designed and delivered to students. It also facilitates the development of assessment tasks that invite students to combine a range of capabilities and skills. The combination of these two aspects enables us to assess higher order IT capabilities, such as choice or communication.


“’The test records and scores the actions that the student takes while completing the test. For example, a higher order capability for a 14-year-old student in England includes designing a system for someone else to use. That capability in turn includes an assessment of the end-user's requirements. This we assess dynamically in the virtual world of Pepford by collecting evidence of the student researching into those requirements, by sending and receiving emails to ask about requirements, by the student refining the system to meet requirements and so on. To achieve this we have worked with an extensive range of teachers to document the processes that students go through when producing eloquent or satisfactory responses to the task set. We use this [data from teachers] to create a matrix of plausible routes that a student will take when en route to complete a satisfactory (or better) response.’


“Simulation, of course, is not the only possible method to assess technology literacy. As Pearson points out, "I first will disagree slightly with Mary's contention that assessment of higher-order thinking requires 'new' assessments. Assessments that get at the more complex aspects of student thinking already exist in instruments that creatively use extended and open-ended response items, and in some portfolio techniques." He points to an assessment of design capability by the International Baccalaureate as an example. And Pearson observes that while portfolio assessments are often viewed as limited in terms of providing valid and reliable data in a high-stakes arena, they could face fewer problems "if the rubrics for evaluating them are carefully thought through and teachers/evaluators are trained on the rubrics' use." Honey's report also identifies an array of assessments for the 21st Century Skills other than ICT literacy.”



From iSkills Overview, Educational Testing Service.

ETS organized the 2001 International ICT Literacy Panel — an international group of leaders in education, business and government — to analyze issues and approaches to measuring ICT literacy. From this research, ETS partnered with a consortium of institutions of higher education to develop the iSkills™ assessment.

The iSkills assessment helps you ensure your students are ready for success in academia and the workforce. The iSkills assessment:

o        measures your students’ ability to navigate, critically evaluate and make sense of the wealth of information available through digital technology — so you can make the necessary changes to narrow skill gaps

o        is the only ICT literacy test that assesses critical thinking in the digital environment

o        tests the range of ICT literacy skills aligned with nationally recognized Association of College and Research Libraries (ACRL) standards

o        helps you identify where further curriculum development is needed so students have the ICT literacy skills they need to succeed


Who should use the iSkills assessment?

Any curriculum, department, or resource library class can use the iSkills assessment to gain valuable information about student ICT literacy. The assessment is offered at two levels of difficulty to measure ICT literacy at different stages of a student’s academic career.


Core Assessment

o        Appropriate for students transitioning into four-year college programs or completing their freshman or sophomore undergraduate studies

o        Identifies the technical skills needed to complete entry-level coursework


Advanced Assessment

o        Appropriate for students transitioning to upper-level coursework or the workplace

o        Designed with more challenging tasks to help rising juniors and institutions determine student readiness for advanced-level study

o        Evaluates mastery of skills necessary for workplace success



Collegiate Learning Assessment.


“The Collegiate Learning Assessment (CLA) is an innovative approach to assessing your [postsecondary] institution’s contribution to student learning developed by CAE with the RAND Corporation. Our measures are designed to simulate complex, ambiguous situations that every successful college graduate may one day face. Life is not like a multiple choice test, with four or five simple choices for every problem. So we ask students to analyze complex material and provide written responses. The CLA measures are uniquely designed to test for reasoning and communications skills that most agree should be one outcome of a college education.


“Most CLA participants assess their institution cross-sectionally, testing a sample of first year students in the fall and a sample of seniors in the spring. You receive two reports, the first after fall testing that looks at how your entering class compares to other CLA participants (adjusted for SAT or ACT scores). Then after testing of seniors in the spring, you receive a full Institutional Report that evaluates your school's value-added on a comparative basis. Testing every year allows you to measure for effects of changes in curriculum or pedagogy.


“For additional information about the assessment, please review the CLA Brochure. We also encourage you to review an Annotated Sample Institutional Report, which presents excerpts of a sample institutional report and accompanying explanations. A list of institutions currently participating in the assessment is also available.


“Performance-based assessments are anchored in a number of psychometric assumptions different from those employed by common multiple-choice exams. As such, initiatives like the Collegiate Learning Assessment (CLA) represent a paradigm shift, the technical underpinnings of which remain unfamiliar to many faculty and institutional researchers. Please refer to the CLA Technical FAQs for more information about the development, scoring, and reliability of CLA tasks, as well as other frequently asked questions.”

Vendor Pitch — InfoSource Learning

From: Mala Chakravorty [mailto:[email protected]]

Sent: Tuesday, May 20, 2008 9:21 AM

To: Masson Connie

Subject: RE: InfoSource Learning


Student Assessment from InfoSource Learning

Objective: To meet NCLB Part D, Section 2402 requirement that schools must assess the technology proficiency of all students before the student reaches the 8th grade.


This Module comprises a 60-question test (a pre-assessment and a post-assessment) mapped to ISTE’s NETS-S standards. The tests are designed to help technology coordinators and instructors assess and report on the technology proficiency of 8th graders in a simple way, saving them time and effort.


The ISTE NETS-S are divided into the following six sections:

Standard 1: Basic Operations and Concepts

Standard 2: Social, Ethical, and Human Issues

Standard 3: Technology Productivity Tools

Standard 4: Technology Communications Tools

Standard 5: Technology Research Tools

Standard 6: Technology Problem-solving and Decision-making tools


The Test has different versions for Mac and Windows users and contains 10 questions from each of the above sections. Question types include multiple choice, matching, true/false, and performance-based (hot spots). We will create a testing site for you so you can conduct a pre-assessment of students at the start of the school year, plan instruction based on the results, then conduct a post-assessment at the end of the school year. Your administrators can pull results whenever needed.


The cost of the Test is as follows:

$5.00 per student if purchased in May 2008 (minimum 100 students)

$7.50 if purchased in June 2008 (minimum 100 students)

$10.00 after July 1st, 2008 (minimum 100 students)


Mala Chakravorty, Ph.D.

Senior Account Executive

InfoSource Learning


toll-free: 800.393.4636 x 206

direct: 407.796.5206




Assessment of Information and Technology Literacy (report from 2001)

From Washington Dept. of Education.

Links and References



Washington State House Bill 2375


ACRL Information Literacy Standards


ACRL Instruction Section, Objectives for Information Literacy Instruction by Academic Librarians


Washington State K-12 Essential Skills for Information Literacy



Florida International University Libraries. Information Literacy at FIU. []. July 2000.


Florida International University Libraries. Information Literacy on the WWW. []. February 2000.


Holmes, J. W. Tutorials Online. []. May 1999.


University of South Florida. Directory of Online Resources for Information Literacy. []. December 1999.


University of South Florida. Programs, Projects, and Initiatives concerning Information Literacy in Higher Education. []. November 1999.


Washington Library Media Association Online. Information Literacy. []. April 2000.



Angeley, R., and Purdue, J. "Information Literacy: An Overview." Dialogue. []. May 2000.


Bruce, C. Seven Faces of Information Literacy in Higher Education. []. 1997.


Bundy, A. Drowning in Information, Starved for Knowledge: Information Literacy, not Technology, is the Issue. []. February 2000.


National Research Council. Being Fluent with Information Technology. []. 1999.


Oberman, C., Lindauer, B. G., and Wilson, B. Integrating Information Literacy into the Curriculum: How is Your Library Measuring Up? []. August 2000.


Shapiro, J., and Hughes, S. K. Information Literacy as a Liberal Art. []. March/April 1996.


Stanford, L. "An Academician's Journey into Information Literacy." New Directions for Higher Education, 1992, 78.


Unitech. Towards Competency. [].