Darren Waters

Did the internet get away with it?

  • Darren Waters
  • 28 Mar 08, 11:35 GMT

Video games have been the focus of pretty much all of the coverage of the Byron Review, including our own.

The reason is simple - when it comes to legislation, only the games industry is being affected.

For the internet, specifically virtual worlds, social networks and video-sharing sites, Dr Tanya Byron's emphasis is on education and awareness.

As I've said in a previous report, this is because regulating the net and the morass of user-generated content on it is a task no government in its right mind would want to tackle: it is a geo-political nightmare.

And if you read the official statements and reactions from different quarters of the internet industry, all of them applaud the review.

Why? Because they know they have been let off the hook, so to speak.

There's no legislation, no hint of regulation, no potential fines, no requirements to implement technological change.

In fact the online industries can go on exactly as they have done before, as long as they sign up to a few awareness campaigns and keep on promoting the safety features all of them say are already in place.

Here are a few of their responses:


Bebo welcomes the recommendations laid out in the Byron review. It represents a significant step forward and reflects and formalises the collaborative approach and shared responsibility taken by Bebo and industry already with government and other relevant stakeholders. The review sets out realistic timings and goals to ensure that internet safety standards continually improve.

The Internet Services Providers’ Association (ISPA UK) – the UK’s leading Internet trade association - is delighted to see that the key recommendations given in its submission to the Byron Review in November have been acknowledged and that Dr Byron recognises the complexity of the issues.
Facebook recognises the need to support parents and teachers in negotiating and understanding the online world that our children are growing up in and to provide practical advice on how users can replicate their offline controls online. User privacy has always been important to Facebook and the technology has been designed to replicate real-world connections online, with the ability to select personal privacy settings and provide complete user control.
Google is deeply committed to protecting children on the Internet and providing all of our users with a safe experience online, through empowerment, education, and protective measures. We have developed technological solutions, from the Google SafeSearch feature to letting users "flag" inappropriate content on YouTube; we have collaborated with child safety organizations, like BeatBullying, Childnet, the Child Exploitation & Online Protection Centre (CEOP), and the Internet Watch Foundation (IWF), to educate users about safe internet usage; and we work closely with law enforcement authorities around the world to prevent child exploitation.

Do you notice a theme? It's basically the industry saying: "We're doing all we can, but we're happy to co-operate in any way possible."

This is not an entirely unreasonable position. After all, these websites and associations do have many, many measures in place to try to ensure safety.

The problem is that children are ignoring them and parents are not enforcing them.

The government doesn't want to be the online nanny, and Internet Service Providers and social networks definitely don't want to be given responsibility for what their customers get up to online.

So it's as you were for the online world.

Comments

  • 1.
  • At 02:00 PM on 28 Mar 2008,
  • Gerard Dahan wrote:

"The problem is that children are ignoring them and parents are not enforcing them." Problem ? If this is consistent enough to be noticed, it is more a choice than a problem. Which tends to prove that the question Dr Byron asked at the onset of her call for evidence ("How can we make the internet a better, safer place for children ?") was not the right question to ask. The right question can only be on how we adapt to freedom, not on how or why we should curtail it.

  • 2.
  • At 05:53 PM on 28 Mar 2008,
  • Gavin Hartshorn wrote:

I wonder from the content of this blog whether the writer believes that government should censor the internet.
My concern is: do we want the internet to be censored, when surely it is up to all of us as users to decide what we look at and what we believe or say?
The argument that children may see it is pointless, as children whose access is properly monitored by parents (I am a parent) and responsible adults will not have access to unsuitable material any more than they did when I was a teenager, before we had internet access.
If we censor the net, the last place you can say and see anything that is available, we end up in a sterile and authoritarian world where the majority may feel safe but the minority know they are not.
Far better in my opinion to have the internet censored by the productive use of the off button or back button on your browser.

"Regulating the net" is a minefield, because it crosses borders with ease. The government can regulate the content on store shelves because those are physical.

As soon as it starts trying to determine what's "acceptable" online, it walks right into hypocrisy.

Content might be perfectly legal in one country, but the government decides it is "inappropriate" in the UK. So, under the "regulation" model, what can it do?

And what would make it any different from China, or Iran, or KSA, for doing it?

China blacked out BBC news reports on Tibet - but wasn't that, too, one country deciding content was "inappropriate" and censoring it?

Hopefully, it's this kind of parallel that (partially) drove the Byron report to "ignore" the 'Net.

Trying to enforce one country's "standards" on content that originates (and is hosted) elsewhere ends up no different than the totalitarian censorship some countries already practice (and are regularly and perhaps rightly castigated for doing so).

It might be one country's government, it might even be an elected government - but as soon as it tries to enforce its will and "standards" across national boundaries, it's China, Iran, KSA, Pakistan blocking YouTube, et al, just with a prettier face.

  • 4.
  • At 09:13 PM on 28 Mar 2008,
  • Mike wrote:

Any attempt at trying to educate kids about the internet in school will end the same way drug awareness has: Ecstasy being referred to as 'disco biscuits' to a generation with more understanding of the subject than the teachers who are trying to tell them about it. This will result in even less notice being taken of the warnings, as the facts will be mixed up with lies attempting to protect people from doing 'harmful' things.

  • 5.
  • At 08:41 AM on 29 Mar 2008,
  • Richard wrote:

Internet sites should actively police and moderate user-generated content far more than they currently do; they've got off very lightly in this report.

Videogames have unfairly taken the brunt.

  • 6.
  • At 09:17 AM on 29 Mar 2008,
  • SirDagard wrote:

The thing that frustrates me most about the reporting regarding the rating of video games is that it completely ignores the fact that the games publishers already use a rating system on their games.

The fact is that, as with the online situation, children ignore these and parents so misunderstand what games are that they don't enforce them. Most parents are under the assumption that games are for children, when industry profiling shows that most gamers are adults (mostly male). Films still suffer the same problem, but it is now largely accepted that the ratings system is unenforceable. How many children do you know who have seen a range of 15 and 18 rated movies?

The government, regulatory bodies and industry cannot protect our children for us. We must be responsible for this ourselves and education is the key to this.

  • 7.
  • At 10:29 AM on 29 Mar 2008,
  • Mark Mekhaiel wrote:

Replying to the first point made regarding international codes and the trouble of 'legislating the net': the fact that it might be troublesome does not mean it should not be done.

When we see gangs in Derby being arrested for organised gang fights in the UK, this is clearly a domestic issue. However, what happens when, for example, a Christian posts cartoons of a contentious nature online in Denmark, for them to be viewed in Iran?

What happens is that global protests are sparked, along with genuine damage being inflicted on the country's relationship with others, not to mention the physical backlash that is sometimes observed.

I do not see the problem with countries censoring the net as opposed to censoring books and TV.

It is solely because the internet is seen as the most liberal and open source of information in the world that censoring it is seen as such an abhorrent act.

When the BBC decides not to show, for example, a cartoon involving our Queen dying, this is not met with national outrage at our information being limited; it is seen (by most at least) as a reasonable step to protect public decency.

Surely in a country where an image could cause huge offence (apologies for constantly referring to cartoons, however it gets across the point extremely well) it would be better to prevent millions of people being offended, rather than 'promoting freedom of information'?

I for one see censoring the internet as no larger a crime than deciding which books your people can read, deciding on their religion, or preventing an offensive TV show from being broadcast.

Please do not confuse ALL censorship with dictatorship, as it does have its place in an open society.

  • 8.
  • At 06:20 PM on 29 Mar 2008,
  • Neil wrote:

I believe Mark Mekhaiel's logic is flawed. Television and the web are fundamentally different media - television is broadcast at us, and we can't govern what we see beyond choosing the channel.

The web on the other hand requires a user to actively search for then look at content (in the vast majority of cases). If you don't want to see something distasteful, you don't look at it, plain and simple.

  • 9.
  • At 06:27 PM on 29 Mar 2008,
  • gamer wrote:

A popular point that seems to be coming up is parenting, and I could not agree with this more. You wouldn't buy your 10-year-old child a copy of Pulp Fiction to watch before bed, so why buy Grand Theft Auto or the like?

  • 10.
  • At 07:29 PM on 29 Mar 2008,
  • James Hill wrote:

To Mark:

"I for one see sensoring the internet as no larger a crime than deciding which books your people can read, deciding on their religion, or preventing an offensive TV show from being broadcast."

The first two examples in this are not comparable to the last. I do not want an 'authority' deciding which book I read, or what religion I worship, but I don't care if they don't show an offensive show on TV. However, I would not be happy with the government telling me that I could not watch that TV program if I manage to get hold of it via import or, perhaps, via the internet.

You might accidentally turn over to an offensive TV program, but it is far more difficult to run into the 'offensive' online if you are not actively looking for it.

Censorship is nearly always an attempt by one community to impose its own vision of reality over another; censorship is an authoritarian response to the 'offensive'. It's far better for people to learn to cope with being 'offended' and to learn to live knowing that there are people who think in a completely different and abhorrent way to them.

The possibility of being offended is the price we pay for the freedom to speak our own opinions. If everything potentially offensive was to be censored would we all be reduced to discussing the weather?

  • 11.
  • At 08:40 PM on 29 Mar 2008,
  • Clive wrote:

While the industry has made some effort, many parents and others say it could do more. Well, so could a lot of parents. It is so easy to blame someone else when you will not take some responsibility for your own actions or your child's actions. Parents, it is many of you that are at fault for playing dumb too long. Go on a course, just find out how the PC and the Internet work, and stop blaming someone else for once.

  • 12.
  • At 02:52 AM on 30 Mar 2008,
  • wrote:

Mark @ 5

I can understand your point - some content *does* exploit the Internet, whether it be propaganda by terrorists, pro-anorexia sites, pro-suicide sites, right wing neo-nazi sites, or any of a whole horde of sites whose content I'd consider abhorrent.

The problem is - who decides what's right and what's wrong? Censorship is securing borders against information and content that for some reason violates the law inside those borders. Should, then, YouTube be shut down because it contains content offensive to people in Pakistan?

Once you start applying domestic standards to the international environment, you hit the eternal axiom: You can't please all of the people all of the time.

So, who decides? If you say content should be censored across international boundaries, then everyone has the same right to consider content actionable as everyone else - and that will end up with no content being "acceptable".

As Gavin in @1 says, censorship should begin "at home". No-one is forcing people to view material online - it's not as if there is a picture of Lenin or Mao or Castro posted on every wall and you are required to view sites with "objectionable content" - in that respect, it's not very dissimilar to TV censorship - people have to positively watch, read, listen.

Personal control, accountability, and responsibility may have been abdicated to the idiot box in the corner of the room, or a CRT (Goddess, I feel old saying that :P), but that isn't a valid excuse to exercise censorship outside someone's borders.

I've never actually seen the Danish cartoons; I find them abhorrent and provocative, and they add about as much to global awareness as my blog does (i.e. none at all). I also haven't seen that blond Dutch guy's video, for the same reasons.

Do I think they should be banned because I find them objectionable? Yes. Do I agree with them *being* banned? No.

Because if they were, how long before someone decides the content on my site is objectionable to them, and bans that too?

Censorship is a *very* subjective standard.

  • 14.
  • At 04:07 AM on 30 Mar 2008,
  • Alexander Thomas wrote:

Re: "Surely in a country where an image could cause huge offence (apologies for constantly referring to cartoons however it gets accross the point extremely well) it would be better to prevent millions of people being offended, rather than 'promoting freedom of information'?" - Mark Mekhaiel

This is not such a country, and over my dead body will it ever become one. Censorship laws should never be applied to content that simply 'offends', and you should be ashamed of yourself for suggesting it.

Perhaps, as well as censoring all imagery depicting the Prophet Mohamed, we should also protect flat-Earthers from the implication that the Earth is round. Let's also not point out that condoms help prevent the spread of AIDS. Oh, and let's not forget that Adam and Eve shared the Garden of Eden with dinosaurs. Appeasing ridiculous beliefs is fun, isn't it?

Maybe, however, it would be far more logical to remind ourselves that people choose their beliefs. Then we can come to the realization that if someone is offended by something that runs contrary to their beliefs then that's their own fault, their own choice, and hence their own problem.

  • 15.
  • At 10:41 AM on 30 Mar 2008,
  • wrote:

Another well-crafted article by Darren!

The primary issue with the Internet is that it has grown far faster than our ability to 'cope' with its threats.

It always seems like the ISPs, governments and interest groups are playing catch-up.

Thanks again for a superb article Darren.

  • 16.
  • At 08:40 PM on 30 Mar 2008,
  • John wrote:

The ignorance of the parents is at fault here, not the children who are playing these games and going online. Just because parents don't understand how or what their children are on doesn't mean that we should "blanket" the situation like we do with so many of our problems these days. It's the parents who need to be educated and informed, not the gaming industry.

  • 17.
  • At 08:42 PM on 30 Mar 2008,
  • Thomas Goodey wrote:

"Surely in a country where an image could cause huge offence (apologies for constantly referring to cartoons however it gets accross the point extremely well) it would be better to prevent millions of people being offended, rather than 'promoting freedom of information'?"

No Mark, it wouldn't, no. I am afraid you don't understand the basic point at all.

  • 18.
  • At 11:33 PM on 30 Mar 2008,
  • Alex Blackmore wrote:

Why are there so many people slating this man for wanting to censor the internet when he never said that? Did you even read the article? He is merely pointing out that the games industry appears to have taken the brunt of the Byron report, whereas the internet has merely got a 'slap on the wrist'. Essentially this is true. Have any of you even read the Byron report?

I suggest you do so before insulting this man for what he has to say. I see no suggestion that he wishes to have the internet censored, merely that he believes the games industry got the rough deal from the Byron report.

  • 19.
  • At 03:00 AM on 31 Mar 2008,
  • Chris wrote:

How do you impose laws which cannot be policed? Restricting internet sites or other forms of censorship will make no real difference now, due to the sheer size of the internet and the different ways to access it (short of obviously shutting off the power to all computers!). Thus for the government to try to ban such things only makes for further mockery and derision of a legal system already too full of such unpoliceable laws.
The onus therefore must rest in the lap of the user to dictate their tastes, or of parents to restrict access for their children until such a time as is deemed suitable (or they reach 18!).
Of course one could hope for a radical shift in the moral standards of the general populace, but that might be hoping for a bit too much.

  • 20.
  • At 06:48 AM on 31 Mar 2008,
  • Stephen wrote:

The majority of children on the internet are exposed to porn, by malicious porn companies targeting children using subversive tactics, before they ever search for it themselves.

  • 21.
  • At 11:22 AM on 31 Mar 2008,
  • AC wrote:

This subject is a double-edged sword. It's extremely easy to have a database of acceptable and unacceptable sites. I'm not 100% sure, but I think that's how parental control software works: it basically blocks access to certain sites. Filters are a bit more complex these days, but the general idea is that there is a filter.

The question is, who would control such a filter? It looks like we are all against censorship, but we all want the Internet to be more secure for our kids, and to a certain extent, secure for us as well.

I don't have kids yet, but even if I did, I wouldn't like to control everything they do. Why should I be a human filter, when we can easily do it in software? There might be a moment when I step outside the room and they go to a website or say something that they shouldn't. Parental control is not as effective as we are made to think it is. We basically put the trust in our kids. If they are so accountable, why do we need to monitor their actions in the first place?

I would rather have a parental control filter and not think about it than sit in front of the computer with the kids. On my own computer, I object to having any filters. I'm a responsible adult; I should be responsible and accountable for my actions. If someone is offended by my actions, then tough luck. As long as I do not break the law, I don't care whether they are offended or not.
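
A filter of the kind AC describes can be sketched in a few lines: keep a list of blocked domains and refuse anything whose host matches. The snippet below is a hypothetical Python illustration, not the code of any real parental-control product; the domain list and the is_allowed function are invented for the example.

    from urllib.parse import urlparse

    # Hypothetical blocklist; a real product would ship and update a much larger list.
    BLOCKED_DOMAINS = {"example-adult-site.test", "example-gambling-site.test"}

    def is_allowed(url):
        """Return False when the URL's host is a blocked domain or one of its subdomains."""
        host = (urlparse(url).hostname or "").lower()
        return not any(host == d or host.endswith("." + d) for d in BLOCKED_DOMAINS)

    # The filter simply refuses matching sites and lets everything else through.
    print(is_allowed("http://example-adult-site.test/page"))  # False (blocked)
    print(is_allowed("https://news.example.org/story"))       # True (allowed)

In practice the hard part is not the lookup but who maintains the list, which is exactly the "who decides?" question raised earlier in the thread.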

  • 22.
  • At 11:48 AM on 31 Mar 2008,
  • wrote:

Do videogame companies have to pay for PEGI classification? As I recall it costs £2,000 to submit a film to the BBFC. If they take over games classification, that's lots of small, independent games companies that will not be able to afford to sell their wares in this country.

I also like how the Byron report puts the onus on the industry to educate parents, not on parents to make the effort to learn. Because blaming parents or making them be responsible is such a no-no in this country...

  • 23.
  • At 01:15 PM on 31 Mar 2008,
  • dan wrote:

Children need to be protected, but parents don't have the time to conduct 24-hour checking. We need to invent systems to prevent the underage from seeing the wrong content. Maybe National Insurance numbers could be used as a protection code, and if this is not secure enough, fingerprint or retina scanning in the future.

This post is closed to new comments.
