Submitted by Len Scrogan
At the recent NECC Conference held in
NCLB Technological Literacy is not likely to go away in the reauthorization, because it is not a core content area and has no large group action behind it.
Three Common Misconceptions
Many technology leaders (state, district, local) continue to exhibit three common misconceptions about NCLB TL as they design an “8th grade test for computer proficiency.”
1st misconception: 8th grade. The law states by the end of the 8th grade, not in 8th grade. Schools with viable programs at earlier grades remain confused by the insistence on an 8th-grade exam, as opposed to a portfolio review or coursework spanning multiple grade levels.
2nd misconception: test. It is surprising how many school leaders think this legislation requires a multiple-choice, CSAP-like test. The law states “assess,” not “test,” which can certainly be accomplished through more authentic assessment avenues.
3rd misconception: computer proficiency is technological literacy. It is clear that most districts are narrowly defining technology literacy as computer proficiency. Information literacy, STEM and pre-engineering efforts, and 21st Century learning themes are not commonly considered (although they should be).
o The Georgia Dept of Ed pushes various options to schools statewide, not a single option
o One GA district has been working on a rubric assessment
Best New Models
TECHYES MODEL http://genyes.com/programs/techyes/research
TECHYES has developed an integrated, project-based approach for meeting this NCLB requirement. They don’t like it when “testing and technology become the END.” They prefer that “technology should be a beginning.” Suffice it to say, they don’t like multiple choice or online assessment of technology skills, but prefer more ‘authentic’ assessment. They have developed a working model that is built on ISTE standards, involves peer mentoring, and can be tackled in almost any school setting.
FLORIDA MODEL http://www.flinnovates.org/sttl/default.htm
The state of Florida has built its own assessment framework, organized around five sections:
o Essential Operational Skills
o Constructing and demonstrating knowledge
o Independent Learning
o Communication and Collaboration
o Ethical, Legal and Safety Issues
Field-tested in the fall, this tool aims to be a FORMATIVE tool that all districts can use to measure student capabilities. While the state or districts can pull summative data from the tool for NCLB reporting purposes, its use in schools was designed to be fun for kids, visually pleasant, and useful for schools in determining where kids stand. The hope is to create a tool that helps identify priorities for equity, funding, and formative learning — but not an accountability tool. Since this project was funded by an NSF grant, any state can obtain the test free of charge. It is Flash-based. One notion must be repeated: in the words of the state director, this test is “used not to produce test results or accountability, but to move the state forward.”
“The interactive and performance based Student Tool for Technology Literacy (ST2L) is currently in the stages of creation. The traditional, research-based procedures for instrument development are being followed. The team of developers consists of measurement experts who will build and evaluate items. The advisory group consists of education and technology experts throughout the state that will be continually evaluating items during the development process. The expert review panel has been working closely with the development team to make final revisions and decisions on indicators.
“The tool will be able to gauge students’ existing level of technology skills. Teachers will be able to use the tool to gather data on students’ current level of technology proficiency. Other applications include using the tool as a pre- and post-test in combination with classroom experiences to guide students’ technology skill acquisition. The tool will be field tested near the end of the 2005/2006 school year, and it is anticipated to be available for use by all districts some time during the 2006/2007 school year.” (quoted material from “Tool” webpage, http://www.flinnovates.org/sttl/tool.htm)
www.genyes.com/programs/techyes/ (accessed 6-11-08)
TechYES is an innovative way for schools and community organizations to offer a technology certification program to students in grades 6-9. As with all Generation YES products, students are at the center of the solution - backed up with solid research and extensive resources.
In TechYES, students show technology literacy by creating projects that meet state and local technology proficiency requirements. As part of TechYES, a structured peer-mentoring program assists the teacher or advisor, and provides student leadership opportunities that serve to further strengthen the program and enrich the learning community.
TechYES is a revolutionary program that provides middle school and after-school educators everything needed to offer students an authentic path to technology literacy certification. TechYES helps schools meet the ISTE NETS technology standards for students and satisfy the NCLB technology literacy requirement for eighth graders.
TechYES is highly flexible, allowing schools and community organizations to choose materials and practices that suit their specific needs. Program materials have been designed specifically for middle school students -- not watered down from adult vocational technology certifications.
o Technology classes
o Integration into core classes
o After-school programs
o Community organizations
o Clubs and homeschools
TechYES encourages all students to complete technology projects that are creative and personally involving. The projects can also meet requirements for core curriculum classes or community service. These projects are the basis for the TechYES evaluation and certification. The program includes all necessary resources: individual student guidebooks, customized teacher/advisor materials, handouts and resources, access to a fully interactive support website, and certificates of completion.
Based on extensive research, these materials are the basis for creating a self-sustaining program focusing on significant student leadership through peer mentoring. This student involvement, combined with a cost-effective, 3-stage certification process, moves all students towards technology competency even if a school cannot schedule a required technology class for all students.
Schools can offer the TechYES Student Technology Literacy Certification program as part of an existing technology class, integrated into any subject class, or after school.
KSDE Guidelines for 8th Grade Technology Literacy Assessment (9-26-06), published by Kansas State Department of Education; available at www.google.com/search?safe=vss&q=%22technology%20literacy%20assessment%22&domains=www.ksde.org&sitesearch=www.ksde.org (accessed 6-8-08).
KSDE developed state technology standards (located within the Kansas Model Curricular Standards for Library Media & Technology) in 2006, based primarily on the NETS for Students. It is recommended that districts use this document as the primary resource and adapt it for developing benchmarks, indicators, and instructional activities at each grade at the local level.
The method of assessment used is determined at the local level. The assessment method can be:
o knowledge based (test)
o grades in a required 8th grade course
o performance based (checklist/rubric)
o e-portfolio based (collected over a period of years)
o project based
o combination of any of the above
These assessment methods can be used with each standard individually or clustered where it is appropriate. They can be done in content areas or they can be done as a stand-alone effort.
Sylvia Martinez - Generation YES
Looking for authentic assessment that puts kids at the center of technology, via a program called TechYES. Assessment is always the tail that wags the dog. Students guide the process (peer assessment), using criteria matched to the ISTE NETS standards. Talks about sharing, writing, creativity, and project-based collaboration skills. Students should be using real technology for a real purpose — personally meaningful. Authentic assessment is hard; it takes time and teacher focus. Working in a number of states. There is no single way to perform assessment; each school and grade may be different.
“The test means it’s over.” Technology literacy should open the doors, not indicate you are done.
Mia Murphy - NC Dept. of Juvenile Justice & Delinquency Prevention
Kate Kemker -
Built their own Student Tool for Technology Literacy, developed to mirror the same process using NETS. The framework has five sections: essential operational skills; constructing and demonstrating knowledge; independent learning; communication and collaboration; and ethical, legal, and safety issues.
“While that term has no universal definition, the core idea could be boiled down to this: Technologically literate students not only know how to operate hardware and software—they can also analyze the information flowing through it, evaluate that digital content’s relative merit and relevance, and use it creatively and ethically in communicating with others.”
“Breaking new ground,
“Teachers who proctored the assessment noted that students seemed “very engaged” by the online TLA test. One responded that she would’ve loved to have had a video camera with her to document the phenomenon. Another mentioned how amazingly quiet the computer lab was during the assessment.
“When taking the online test, students interact with assessment content in ways that allow them to demonstrate their proficiencies. Often, they must perform actions via simulations, rather than pick answers from among multiple choices. Thus, students must be able to format a paragraph, apply a spreadsheet formula, or conduct a database search. And they must demonstrate durable skills via generic menus and commands, not through brand-specific memorized shortcuts.”
“Though there was not wide agreement on the scope of the word technology, all panelists did share an understanding of literacy as something significantly beyond basic skills. Ripley has a particular interest in the higher order thinking that high-tech tools make possible. Observing that schools now focus more on developing student computer skills to support learning across curricula rather than studying about computers, Ripley asks,
“‘So, can we usefully ponder what subject will replace IT, and why? My view is that the subject must change its outlook from training students predominantly in the skills and capabilities that arise from the existence of personal computers. Instead we should look for the subject IT to concern itself increasingly with the growing range of technologies (mobile devices, blogs, video). It should concern itself with the uses to which technology is put (work, leisure, recreation, purchase). And it should concern itself with the facilities (or capabilities) that those technologies provide to students and adults (voice, visual communication, decision making, choice, responsibility).’
“Ripley's assessments take place in the virtual world of Pepford, where work assignments arrive via e-mail and a "walled garden" provides a virtual world of Web sites and applications. Writes Ripley,
“’The use of simulations is a key development. A simulator provides the context within which authentic assessment tasks can be designed and delivered to students. It also facilitates the development of assessment tasks that invite students to combine a range of capabilities and skills. The combination of these two aspects enables us to assess higher order IT capabilities, such as choice or communication.
“’The test records and scores the actions that the student takes while completing the test. For example, a higher order capability for a 14-year-old student in
“Simulation, of course, is not the only possible method to assess technology literacy. As Pearson points out, "I first will disagree slightly with Mary's contention that assessment of higher-order thinking requires 'new' assessments. Assessments that get at the more complex aspects of student thinking already exist in instruments that creatively use extended and open-ended response items, and in some portfolio techniques." He points to an assessment of design capability by the International Baccalaureate as an example. And Pearson observes that while portfolio assessments are often viewed as limited in terms of providing valid and reliable data in a high-stakes arena, they could face fewer problems "if the rubrics for evaluating them are carefully thought through and teachers/evaluators are trained on the rubrics' use." Honey's report also identifies an array of assessments for the 21st Century Skills other than ICT literacy.”
From iSkills Overview, Educational Testing Service, www.ets.org/portal/site/ets/menuitem.1488512ecfd5b8849a77b13bc3921509/?vgnextoid=159f0e3c27a85110VgnVCM10000022f95190RCRD&vgnextchannel=e5b2a79898a85110VgnVCM10000022f95190RCRD
ETS organized the 2001 International ICT Literacy Panel — an international group of leaders in education, business and government — to analyze issues and approaches to measuring ICT literacy. From this research, ETS partnered with a consortium of institutions of higher education to develop the iSkills™ assessment.
The iSkills assessment helps you ensure your students are ready for success in academia and the workforce. The iSkills assessment:
o measures your students’ ability to navigate, critically evaluate and make sense of the wealth of information available through digital technology — so you can make the necessary changes to narrow skill gaps
o is the only ICT literacy test that assesses critical thinking in the digital environment
o tests the range of ICT literacy skills aligned with nationally recognized Association of College and Research Libraries (ACRL) standards
o helps you identify where further curriculum development is needed so students have the ICT literacy skills they need to succeed
Who should use the iSkills assessment?
Any curriculum, department or resource library class can use the iSkills assessment to gain valuable information about student ICT literacy. The assessment is offered at two levels of difficulty to measure ICT literacy at different stages of a student’s academic career.
o Appropriate for students transitioning into 4 year college programs or completing their freshman or sophomore undergraduate studies
o Identifies the technical skills needed to complete entry-level coursework
o Appropriate for students transitioning to upper-level coursework or the workplace
o Designed with more challenging tasks to help rising juniors and institutions determine student readiness for advanced-level study
o Evaluates mastery of skills necessary for workplace success
“The Collegiate Learning Assessment (CLA) is an innovative approach to assessing your [postsecondary] institution’s contribution to student learning developed by CAE with the RAND Corporation. Our measures are designed to simulate complex, ambiguous situations that every successful college graduate may one day face. Life is not like a multiple choice test, with four or five simple choices for every problem. So we ask students to analyze complex material and provide written responses. The CLA measures are uniquely designed to test for reasoning and communications skills that most agree should be one outcome of a college education.
“Most CLA participants assess their institution cross-sectionally, testing a sample of first year students in the fall and a sample of seniors in the spring. You receive two reports, the first after fall testing that looks at how your entering class compares to other CLA participants (adjusted for SAT or ACT scores). Then after testing of seniors in the spring, you receive a full Institutional Report that evaluates your school's value-added on a comparative basis. Testing every year allows you to measure for effects of changes in curriculum or pedagogy.
“For additional information about the assessment, please review the CLA Brochure. We also encourage you to review an Annotated Sample Institutional Report, which presents excerpts of a sample institutional report and accompanying explanations. If you’d like to review a list of institutions that are currently participating in the assessment, please click here.
“Performance-based assessments are anchored in a number of psychometric assumptions different from those employed by common multiple-choice exams. As such, initiatives like the Collegiate Learning Assessment (CLA) represent a paradigm shift, the technical underpinnings of which remain unfamiliar to many faculty and institutional researchers. Please refer to the CLA Technical FAQs for more information about the development, scoring, and reliability of CLA tasks, as well as other frequently asked questions.”
Objective: To meet NCLB Part D, Section 2402 requirement that schools must assess the technology proficiency of all students before the student reaches the 8th grade.
This Module comprises a 60-question test (a pre-assessment and a post-assessment) mapped to ISTE's NETS-S standards. The tests are designed to help technology coordinators and instructors assess and report on the technology proficiency of 8th graders in a simple way, saving them time and effort.
The ISTE NETS-S are divided into the following six sections:
Standard 1: Basic Operations and Concepts
Standard 2: Social, Ethical, and Human Issues
Standard 3: Technology Productivity Tools
Standard 4: Technology Communications Tools
Standard 5: Technology Research Tools
Standard 6: Technology Problem-solving and Decision-making tools
The Test has different versions for Mac and Windows users and contains 10 questions from each of the six sections above. Question types include multiple choice, matching, true/false, and performance-based (hot spots). We will create a testing site for you, enabling you to conduct a pre-assessment for students at the start of the school year, plan instruction based on the results, and then conduct a post-assessment at the end of the school year. Your administrators can pull results whenever needed.
The cost of the Test is as follows:
$5.00 per student if purchased in May 2008 (minimum 100 students)
$7.50 if purchased in June 2008 (minimum 100 students)
$10.00 after July 1st, 2008 (minimum 100 students)
Mala Chakravorty, Ph.D.
Senior Account Executive
toll-free 800.393.4636 x 206
direct 407.796.5206
ACRL Information Literacy Standards
ACRL Instruction Section, Objectives for Information Literacy Instruction by Academic Librarians
PORTAL SITES AND LISTS
Holmes, J. W. Tutorials Online. [http://faculty.washington.edu/jwholmes/tutorial.html]. May 1999.
Angeley, R., and Purdue, J. "Information Literacy: An Overview." Dialogue. [http://www.ac.wwu.edu/~dialogue/issue6.html]. May 2000.
Bruce, C. Seven Faces of Information Literacy in Higher Education. [http://www.fit.qut.edu.au/InfoSys/bruce/inflit/faces/faces1.htm]. 1997.
Bundy, A. Drowning in Information, Starved for Knowledge: Information Literacy, not Technology, is the Issue. [http://www.library.unisa.edu.au/PAPERS/drowning.htm]. February 2000.
National Research Council. Being Fluent with Information Technology. [http://books.nap.edu/html/beingfluent/]. 1999.
Oberman, C., Lindauer, B. G., and Wilson, B. Integrating Information Literacy into the Curriculum: How is Your Library Measuring Up? [http://www.ala.org/acrl/nili/integrtg.html]. August 2000.
Shapiro, J., and Hughes, S. K. Information Literacy as a Liberal Art [http://www.educause.edu/pub/er/review/reviewarticles/31231.html]. March/April 1996.
Stanford, L. "An Academician's Journey into Information Literacy." New Directions for Higher Education, 1992, 78.
Unitech. Towards Competency. [http://hobbes.unitecnology.ac.nz/competency/].