
Regulation of the Internet in Canada:
Prospects and Problems

July 1995 ~ A July 17th editorial in the Montreal Gazette seemed to sum up the popular mood of the past two months concerning the Internet: “Cyberspace needs law and order.” What had been something of a two-year media honeymoon with newly discovered networking technologies had dissolved into a flurry of interest in what Time and Maclean’s called “cyberporn” and “Crime in Cybercity,” buzzwords indicating the level of illegality which the Internet seemed to be encouraging.

Legislators in both Canada and the United States were quick to latch onto this new, more negative media interest in the Internet. Private members’ bills were introduced in the Canadian House of Commons calling for regulation of the Internet to restrict access to certain sexually explicit materials (Caragata 50). Possibly the most far-reaching legislation was the brainchild of United States Senator James Exon, an amendment to the massive Telecommunications Act which would make it illegal for any telephone system or computer network to carry “obscene, lewd, lascivious, filthy or indecent” material (Chidley 58).

The reaction from the Internet community to these and other proposals was predictable and strongly oppositional. Regulation is, in many ways, antithetical to the philosophy of the Internet. Arising from the male-dominated “hacker culture” which Steven Levy so effectively describes in Hackers: Heroes of the Computer Revolution, a basic tenet of the Net has been the notion that “all information should be free” (Levy 40). The development of the Internet depended on the free exchange of technical information; a kind of energetic chaos which results in organization is seen as essential to the development of the network of networks.

Most popular and much of the academic writing on the Internet revels to some extent in these ideas; the Internet is celebrated as a communications space which was not planned and which can regulate itself. In fact, many would argue that self-regulation is the only means by which the Internet can be managed, given the technical nature of this system of decentralized and interconnected networks. Michael Martineau, the vice-president of an Internet service provider in Halifax, recently stated that the “Internet regards censorship as a hardware problem and just works around it” (qtd. in Caragata 50). It is no coincidence that 25% of the members of the Libertarian Party of America come from the computer industry (Futrelle 20-22).

Given the Internet’s ability to provide quick and easy access to a variety of quasi-legal material, including content which could reasonably be considered to violate sections of the Criminal Code of Canada, some form of regulation seems inevitable. But can the network be effectively regulated from a technical perspective? And given the technology involved, what will the legislative basis for regulation be?

On June 12th, 1995, the Simon Wiesenthal Centre in Toronto submitted a brief to the Canadian Radio-television and Telecommunications Commission (CRTC) asking that the Commission begin to regulate the Internet in the same manner that broadcasting is controlled (Rinehart). The Centre’s concern was hate propaganda; if Internet providers were licensed as broadcasters currently are, they could be fined or shut down if they allowed access to such material. Reaction to this proposal was swift; Marita Moll of the Public Information Highway Advisory Council suggested that the model to follow was not that of broadcasting, where every provider of material must be licensed, but that of the telephone system, where the network is a common carrier and only producers are held responsible for what they distribute (Rinehart).

The Broadcasting and Telecommunications Acts present us with intriguing though contrasting visions of how the Internet, and new broadband networks, might be regulated. As exemplified by the objectives below, the Broadcasting Act tends to emphasize the collective rights of Canadians and the responsibilities of those groups and individuals who receive licences to broadcast:

[The] Canadian broadcasting system should encourage the development of Canadian expression by providing a wide range of programming that reflects Canadian attitudes, opinions, ideas, values and artistic creativity, by displaying Canadian talent in entertainment programming and by offering information and analysis concerning Canada and other countries from a Canadian point of view ... The programming originated by broadcasting undertakings should be of high standard ... (Broadcasting Act, Section 3.[1][d][ii] and [g])

By contrast, the objectives of the Telecommunications Act emphasize individual rights:

Canadian telecommunications policy has as its objectives ... to facilitate the orderly development throughout Canada of a telecommunications system that serves to safeguard, enrich and strengthen the social and economic fabric of Canada and its regions [and] to foster increased reliance on market forces for the provision of telecommunications services and to ensure that regulation, where required, is efficient and effective ... to respond to the economic and social requirements of users of telecommunications services; and to contribute to the protection of the privacy of persons. (Telecommunications Act, Section 7.[a], [f], [h] and [i])

The Criminal Code of Canada places limits on various kinds of speech, including death threats, engaging in a conspiracy to commit a crime, hate literature, the production and distribution of certain kinds of sexual material, and so on. These acts are illegal on the Internet as well. The question becomes, then: who is responsible when such an act is committed? Under the Broadcasting Act, the holder of a broadcasting licence is responsible for all material presented on its broadcasting operation. If there is a complaint concerning abusive, racist statements on a radio station, the Commission addresses the licensee, not the announcer who made the remarks, although some action against such a person is not out of the question under the Criminal Code or human rights legislation. The Telecommunications Act presents a somewhat different set of responsibilities; if the law is violated, say in the instance of abusive or threatening telephone calls, the telephone service provider is not held responsible in any way for the calls. Instead, law enforcement authorities charge the maker of the calls.

What has the Internet community, with its strong libertarian or anarchist tendencies, concerned is that most legislative proposals, immature as they are, tend toward a modified broadcasting model. The Exon amendment, for instance, which passed the United States Senate by a healthy margin but has yet to clear the House, would place the onus on the service provider to restrict access to material that is “obscene and indecent.” This raises the prospect of Internet service providers and university computer administrators doing global searches through our email or information requests for certain words, perhaps “bomb,” “murder,” “homosexual,” various sorts of profanity, and heaven knows what else; a chilling prospect.

Canadian lawmakers have yet to make a choice, though many service providers are already wary of presenting the entire range of material available on the Internet. For example, Odyssée Internet, a Montréal Internet access provider, does not carry certain Usenet conferences on its system; one conference is banned, while alt.binaries.erotica.children for some reason remains.

Of course, whatever legislative model or hybrid ultimately becomes the basis of Internet regulation in Canada is immaterial if it is technologically impossible for governments to meaningfully control access and content. For instance, it is quite simple for anyone with Usenet access to post, and therefore distribute around the world, any material they wish concerning any topic, containing any sort of language or claim; this is exactly what happened during the publication ban on information concerning the Karla Homolka manslaughter trial. While service providers and many universities were quick to restrict access to a conference established to discuss the trial, trial information was often posted to the huge and well-read canada.general, and even to forums which usually focused on sports.

The ease with which information can be distributed, and the difficulty of effectively restricting access to it, extends beyond Usenet to other aspects of the Internet as well. Without presenting too much of what has become the clichéd narrative of the Internet’s history, it is important to recognize those factors which have suggested to many individuals that the Internet cannot be regulated. The Internet, then known as ARPANET, was created in the late 1960s as a research project of the United States Department of Defence (Liu et al. 1-4). There is some disagreement over whether the project began as an attempt to create a robust communications network, structured in such a way that it could survive a nuclear war, or simply as an experiment in new networking technologies. Regardless, the Internet uses a transfer technology called packet switching which demands that each computer on the network have a part in how information is transferred (10). Instead of utilizing a central hub which directs network traffic (a characteristic of local area networks), the Internet is essentially decentralized, and potentially each server on the network might be called upon to handle data. Packets pass from one server to the next by the quickest and most efficient route possible given the circumstances. If a server which might normally be used to transfer information is unavailable, packets of data are rerouted around the inaccessible site.
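The rerouting behaviour described above can be illustrated with a toy model. The sketch below (in Python, with an invented six-node topology; real Internet routing protocols are far more sophisticated) finds a path between two hosts with a breadth-first search, then shows that taking an intermediate server offline simply produces a different route:

```python
from collections import deque

def find_route(links, start, goal, down=frozenset()):
    """Breadth-first search for a path from start to goal,
    skipping any servers in the 'down' set."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == goal:
            return path
        for nxt in links.get(node, []):
            if nxt not in seen and nxt not in down:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no route available at all

# A small, invented topology: each server lists its neighbours.
links = {
    "ottawa":   ["toronto", "montreal"],
    "toronto":  ["ottawa", "chicago"],
    "montreal": ["ottawa", "boston"],
    "chicago":  ["toronto", "newyork"],
    "boston":   ["montreal", "newyork"],
    "newyork":  ["chicago", "boston"],
}

print(find_route(links, "ottawa", "newyork"))
# With "toronto" unavailable, packets are simply routed around it:
print(find_route(links, "ottawa", "newyork", down={"toronto"}))
```

Only when every intermediate server is unavailable does the route fail entirely, which is why blocking a single path accomplishes little.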

At first glance, then, controlling these packets seems difficult. Let us suppose that a government wished to restrict its citizens’ access to certain Internet sites in other countries. Since there are potentially thousands, if not millions, of possible routes between each domestic Internet-connected computer and the “restricted” foreign sites, plugging up access routes by technical means seems difficult if not impossible.

In a similar vein, the administration of the Internet is not centralized in a particular location under a group of individuals. The Internet works because of formal and tacit agreements among its users to adhere to a suite of networking protocols called TCP/IP, named after two protocols within the suite, the Transmission Control Protocol and the Internet Protocol (Liu et al. 8). Standards are arrived at, and changes to the protocols made, through usually open processes which involve a high degree of consensus. No one person or group seems to be in charge.

That is, at least, the theory, and it is a comfortable myth which Internet activists, system administrators, and users would like to cling to; the reality is more difficult, and becoming more complex as time goes by and more and more users come to the Net. There is, in fact, real authority over the Internet, and this authority takes the form of individual system administrators (sysadmins) at each Internet server or access provider (Frisch 2-5). These sysadmins typically have a kind of absolute authority which seems rare in contemporary society. While law enforcement officials and legislators work under close supervision and within clear guidelines and laws, sysadmins have no such legal framework. To open your mail or eavesdrop on your telephone conversations, the police must get a court order. A sysadmin, however, can do the Internet equivalent of both these acts invisibly and with impunity. It is difficult to argue that the vast majority of these sysadmins are accountable to their users in any way; perhaps they answer to their employer or supervisor within an institution, but this is rarely made obvious. There is control of the Internet, control which takes the form not of the traditional pyramid of increasing authority, but of a diffused hierarchy of absolute sysadmin control at each site.

Let us return to our example above of controlling access to certain sites, and flesh it out. Imagine that accessing certain World Wide Web (WWW) sites was no longer permitted from Canada; a good example might be the five or six white supremacist servers in the United States. Blocking access could take place at several points. A sysadmin can block access from a server she controls simply by adding the page’s uniform resource locator (URL) to a special file on her computer. Users trying to access the page will instead see an error message telling them they are not allowed to see it. This same type of control can also be applied at the user level; a commercial product called NetWatch allows parents to restrict their children's access to certain sites which they feel the children should not see (Chidley 58). At the broader networking level, where Internet connections cross physical boundaries, it is true that it is difficult to block access, but not impossible. Computers routing requests for certain restricted sites could be instructed to turn back such requests; again, this is a decision for each individual sysadmin.
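The blocklist described above amounts to little more than a lookup before a request is served. The sketch below (in Python; the file name, URLs, and function are invented for illustration, not taken from any actual product) shows the logic a proxy or web server might apply:

```python
# A hypothetical access filter of the kind a sysadmin might run on a
# proxy server. The blocked URL prefixes would normally live in a local
# file (e.g. a hypothetical /etc/blocked_sites); here they are inline.
BLOCKED_PREFIXES = [
    "http://www.example-banned-site.com/",
    "http://badhost.example.org/hate/",
]

def handle_request(url):
    """Return the error message shown to the user,
    or None to allow the request through to the network."""
    for prefix in BLOCKED_PREFIXES:
        if url.startswith(prefix):
            return "403 Forbidden: access to this site is not permitted"
    return None  # request is passed along as usual

print(handle_request("http://badhost.example.org/hate/index.html"))
print(handle_request("http://www.example-allowed.ca/page.html"))
```

The simplicity is the point: each site’s filter is a few lines of local configuration, applied or not at each individual sysadmin’s discretion.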

Most frightening of all, perhaps, is the potential for special automated programs, similar to the Internet Worm of 1988, to seek out and erase email and Usenet postings, and even disable access to certain sites. Every so often, a particularly annoying Usenet post will generate enough anger that a programmer will “mailbomb” the poster, effectively disabling the account. There is no reason to believe that governments, with considerably more resources than individual computer hackers, could not create similar but more effective programs which could, for instance, disable “offensive” sites domestically and in other countries.

In June of 1995, I spoke as part of a panel at the Canadian Networking Conference (NET 95) in Ottawa on the prospects for Internet regulation. It is fair to say that it was a hurried and somewhat frustrated panel, with representatives from the Coalition for Public Information, Electronic Frontier Canada, and other organizations of a similar size and political bent expressing concern with what they saw as a group of regulatory and advisory bodies unresponsive to their concerns. They spoke, for the most part, with frustration over the process by which the Canadian Radio-television and Telecommunications Commission (CRTC) and the Information Highway Advisory Council (IHAC) had arrived at their recommendations concerning the Internet and proposed new broadband networks.

The frustration is understandable, though ultimately misplaced and not very useful; the recent CRTC report on the possible regulatory environment for new broadband networks appears already out-of-date when attempting to address these immediate challenges of content regulation.

Instead, I spoke of the early history of radio, the period after World War I which saw an explosion of interest in the new medium and a large amateur radio hobby which allowed people around the world to communicate with each other. In many ways it is difficult to imagine a more decentralized medium than radio: broadcasting and receiving equipment was relatively cheap after the war, and broadcasting does not require that one be attached to a network of any kind, as the telephone and telegraph demanded. However, in 1920 the British Imperial Communications Committee placed a two-year moratorium on all broadcasting in Britain: public, private, community and amateur. When broadcasting was allowed again in 1922, it was under the firm control of a government-controlled consortium which became the British Broadcasting Corporation (Lewis & Pearlman 64-66). This seemingly robust, decentralized medium had somehow come under strict state control. As I said to the assembled business people and sysadmins, we cannot assume that either the technology or the culture of the Internet can “protect” it from government regulation and control, whatever form it may take.

At the beginning of this paper, I asked if the Internet could be effectively regulated given technological restrictions. The answer would seem to be yes, though the measures which would have to be taken to do so would be expensive and, in many cases, a draconian invasion of individual rights to privacy. Regardless, government regulation can only do half the job. Healthy social interchange is essential to the operation and vitality of the Internet, and public policy must adapt to the dynamic nature of the network in order to keep it so. ~ John Stevenson


Canada. Broadcasting Act. Ottawa, 1991.

Canada. Telecommunications Act. Ottawa, 1993.

Canadian Radio-television and Telecommunications Commission. “Competition and Culture on Canada’s Information Highway: Managing the Realities of Transition.” Ottawa: Public Works and Government Services Canada, 1995.

Caragata, Warren. “Crime in Cybercity.” Maclean’s 22 May 1995: 50-52.

Chidley, Joe. “Red-Light District: From S&M to bestiality; porn flourishes on the Internet.” Maclean’s 22 May 1995: 58.

“Cyberspace needs law and order.” Editorial. The Gazette 17 July 1995: B2.

Electronic Frontier Foundation. “Canadian Supreme Court Justice Offers Cautions on University Censorship of Electronic Speech.” Media release, 13 Dec 1994. Electronically distributed.

Frisch, Æleen. Essential System Administration. Sebastopol: O’Reilly & Associates, 1991.

Futrelle, David. “Libertarians on the March.” Utne Reader July-August 1995: 20-22.

Lewis, Peter M. and Corinne Pearlman. Media & Power: From Marconi to Murdoch, A Graphic Guide. London: Camden Press, 1986.

Levy, Steven. Hackers: Heroes of the Computer Revolution. New York: Doubleday, 1984.

Liu, Cricket, et al. Managing Internet Information Services. Sebastopol: O’Reilly & Associates, 1994.

Rinehart, Dianne. “Regulation needed to stop Internet hate campaigns.” Canadian Press Newswire, 14 June 1995.

Shade, Leslie Regan. “Ethical issues in Computer Networking: Academic Freedom, Usenet, Censorship, and Freedom of Speech.” 23 Nov 1993. Electronically distributed.


Copyright © 1995, 1996, 2001 John Harris Stevenson