In one corner are the Web 2.0 tools: the relatively new blogs, wikis, discussion forums, and social-networking sites that are gaining popularity among teachers looking to connect with their students and one another. By their very nature, such tools can be edited by a wide range of contributors, and they can host a wide range of content, some of it educational, and some not so much.
In the opposite corner are the Web filters: software designed to block students from distracting or potentially harmful material, with roots in the more static online environment of the 1990s. In most cases, filters block whole websites rather than individual pages, based on a filtering company’s database of sites that contain questionable material.
While only the most laissez-faire technology advocates would favor scrapping filtering altogether, along with the federal Children’s Internet Protection Act, or CIPA, which mandates it, most realize something has to change if schools are to continue exploring the use of Web 2.0 tools.
“Filtering cannot any longer be a block-or-allow hard decision as it used to be,” says Rob Chambers, the chief technology officer for Bakersfield, Calif.-based Lightspeed Systems Inc., an education-only filtering company. “You have these sites that have good content and bad content, and it’s all together. [Filtering] has to take these things to mind because that’s what the world of education is heading toward.”
Some education technology officers are pushing for what is called dynamic filtering, which blocks content based on words, phrases, and ratings of images that appear on each Web page, meaning some pages on a site may be blocked while others are allowed. Others say the flexibility within standard filtering programs, which typically allow chief technology officers to select from a list of categories of sites to block or allow, should suffice when combined with a functional school or district technology office.
And still others suggest that CIPA’s standards are narrow enough that, if schools adhered to them strictly, incidents of Web 2.0 tools being blocked would be rare to nonexistent. CIPA mandates that schools block material that is obscene, depicts child pornography, or is potentially harmful to minors, with the last of those three eliciting a range of interpretations.
“There are times [educators] believe [the law is] much more prescriptive than it really is,” says James Bosco, a project director for the Washington-based Consortium for School Networking’s Web 2.0 initiative. “But you do not want a situation when someone comes to a board meeting and says my kid came home and said the kid next to him was looking at pictures of naked girls.”
“We live with the specter of that occurring,” he says. “That’s why some people say, ‘Are we really protecting kids [from harmful content], or ourselves from potential problems at school board meetings?’ ”
At the 11,000-student Saugus Union School District, which has students in grades K-6 in the northern region of Los Angeles County, Calif., teachers and students have been using Web 2.0 tools since their inception, says Jim Klein, the district’s information services and technology director. The district has its own social-networking site, where students, teachers, and other staff members have their own spaces. And both students and staff regularly use other such sites for learning.
Guardians of the Web
Part of the district’s ability to do that, Klein says, is its use of an open-source filtering program called Dan’s Guardian. Instead of checking websites against a list of forbidden sites, as most programs do, Dan’s Guardian, Klein says, searches individual pages for “hot words” that could signal improper content. If a page passes a certain threshold for hot words (a threshold that can be adjusted depending on the level of filtering desired) it will be blocked.
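The per-page scoring Klein describes can be sketched in a few lines of Python. This is a minimal illustration of the technique, not the program’s actual implementation; the words, weights, and threshold below are invented for the example.

```python
# Hypothetical hot-word weights; real filters ship far larger,
# professionally maintained lists.
HOT_WORDS = {"gambling": 40, "casino": 30, "escort": 60}

# Raising or lowering this value adjusts the level of filtering,
# as the article describes.
BLOCK_THRESHOLD = 50

def should_block(page_text: str) -> bool:
    """Sum the weights of hot words found on the page;
    block the page if the total passes the threshold."""
    text = page_text.lower()
    score = sum(weight for word, weight in HOT_WORDS.items() if word in text)
    return score >= BLOCK_THRESHOLD
```

Because the decision is made page by page, a news article on the same site as a gambling forum can load while the forum itself is blocked, which is the granularity advantage dynamic filtering claims over site-level databases.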
Critics of Dan’s Guardian and other dynamic-filtering programs say the technology is imperfect in determining what is allowed and what is blocked.
But Klein says such claims are meant to obscure the fact that dynamic filtering theoretically eliminates the need for updates found in traditional programs like IGear and CyberPatrol, and therefore eliminates a recurring cost for districts.
Doug Anderson, the marketing director at Salt Lake City-based Content Watch LLC, says his company is one of the few commercial vendors to use primarily dynamic-filtering techniques built upon algorithms, in both personal and business products, including those serving school districts. Anderson says Content Watch’s ability to provide more-nuanced filtering around, for example, the difference between breast augmentation and breast cancer, can make the product more useful.
Klein acknowledges there is concern about how dynamic-filtering programs block images, but adds that most Web images are tagged with content ratings that the filter can read. Images that aren’t, he says, usually appear alongside words and phrases that would cause a filter to block the Web page.
Barb Rose, the vice president of marketing for the Web-filtering company CyberPatrol LLC, based in Carlisle, Pa., says the risk is far greater than Klein indicates, in part because most search engines allow users to search for images independent of the Web page they are found on.
CyberPatrol’s software is capable of some dynamic filtering, she says, but in part because of concerns about speed, it is a rarely used feature. Instead, like most filtering software, CyberPatrol checks the site a user is trying to visit against a database that sorts websites into several dozen categories. If the site falls into a category the chief technology officer has chosen as one that users in the district should not be viewing, the site is blocked.
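The database-driven approach works at the level of whole sites rather than pages. The sketch below illustrates the general technique only; it is not any vendor’s actual code, and the hostnames and categories are made up.

```python
# Hypothetical stand-in for a vendor's site-category database.
SITE_CATEGORIES = {
    "social.example.com": "social-networking",
    "news.example.com": "news",
}

# Categories the chief technology officer has chosen to block.
BLOCKED_CATEGORIES = {"social-networking"}

def is_blocked(hostname: str) -> bool:
    """Look up the whole site's category; if that category is on the
    district's block list, every page on the site is blocked."""
    return SITE_CATEGORIES.get(hostname) in BLOCKED_CATEGORIES
```

The coarse granularity is the crux of the article’s conflict: a Web 2.0 site that mixes classroom-ready and questionable content is either entirely reachable or entirely blocked under this scheme.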
Rose says CyberPatrol and its competitors are working to find more-refined approaches that take Web 2.0 tools into account. “I think probably all the vendors are back under the hood looking at what they’ve got to meet those needs,” she says.
But she adds that blocking social networks is often good practice for protecting network security.
“We see more and more threats of viruses and malicious links” on social-networking sites, Rose says, pointing to examples of Facebook viruses that masked themselves as legitimate applications.
Districts Set Priorities
In many cases, say educational technology experts, what exactly gets filtered has more to do with the district than with the product.
For example, David Jakes, the director of instructional technology for Glenbrook South High School in Glenview, Ill., a northwestern suburb of Chicago, says sometimes teachers are unable to access Web resources because they aren’t making the decisions about which categories to block and which to allow.
“In most districts, the technology people do the blocking,” says Jakes, whose 5,000-student Glenbrook school district, made up of two high schools, blocks Facebook but allows access to most other sites. “I think that’s a little misguided. There’s no discussion between the technology people and the curriculum people.”
The biggest teacher complaints are not always about whether they can access content, but how quickly they can access it. Chief technology officers generally have the power to override a blocked site that an instructor says has educational value. But depending on staffing, district procedure, and a teacher鈥檚 relationship with the technology department, the response time could range from minutes to close to a month.
Yet, while many filtering issues can be addressed internally, some experts feel the original point of filtering as mandated by federal law is consistently misrepresented.
Even Karen Cator, the chief of the U.S. Education Department’s office of educational technology, says that “people have very different ways of interpreting” CIPA, and Federal Communications Commission officials say the number of schools actually found in violation of the act is minute. Most violations that occur, they say, relate to the proper procedures for establishing filters, not the exposure of students to improper content.
Knowing that, some ed-tech experts say schools should not be afraid to take a more hands-off approach.
Klein of the Saugus Union district, for example, says that filters are ineffective at stopping a student from purposefully accessing improper content. Instead, he argues, technology directors should accept that a few students are generally going to succeed in circumventing the system, and thus should establish filter settings that help other students learn how to sift between dangerous and useful content.
“I’m waiting for the day when a school district gets sued by a parent for not teaching kids to be responsible online,” Klein says. “It’s going to happen sooner or later.”