
Rory Cellan-Jones

Facebook v Ceop

  • Rory Cellan-Jones
  • 18 Nov 09, 11:25 GMT

Facebook has been accused of neglecting its responsibility to help to keep young internet users safe.

The charge comes from Jim Gamble of Ceop, the Child Exploitation and Online Protection Centre, who wants Facebook (and MySpace) to follow the lead of Bebo in including the Ceop report button in its social network.

Mr Gamble says he doesn't understand why Facebook won't take a fairly simple step which would give young users instant access to advice on issues from bullying to viruses and hacking, and would put them in touch with the police if they so wished.



Now, Facebook knew this attack was coming and gave the BBC a fairly comprehensive statement, explaining why the service is not keen on the Ceop button.

It says that it's tried out such systems on a number of occasions, and that they've proved ineffective, actually decreasing the number of abuse reports. It points out that it's an international site and would prefer to have a single worldwide reporting system rather than a separate one in each territory. And, in what appears to be a jibe aimed at Bebo, it says:

"We are confident that the Ceop button is an excellent solution for sites that have not invested in as robust a reporting infrastructure as Facebook has in place and one we continue to improve."

The social network might appear to have quite a coherent case. But what Facebook has not done at the time of writing is to put anyone up to debate the issue on air - allowing Jim Gamble to more convincingly argue that a company that won't debate the issue can't have much of a case.



We should also remember that, as various callers to a recent Nicky Campbell phone-in pointed out, there is an age-limit of 13 for Facebook, and there is a duty on parents to monitor the way their children use social networks.

But Facebook appears to have decided that it has nothing to gain by tangling live on air with a respected figure like Jim Gamble.

Facebook is of course a global business, based in California - but it is also now a major British media company, earning plenty of advertising revenue here and having a big effect on millions of lives. With that kind of power, some are asking: does it have a responsibility to answer its critics - especially when it believes that their criticisms are wrong-headed?

Comments

  • Comment number 1.

    Having a standard button on reputable websites is a good idea; it's the disreputable ones that are the problem. Promoting a standard Ceop button would only work if there were a guaranteed way of reaching the genuine Ceop website irrespective of where the button appeared.
    Unfortunately it is all too easy to set up fake sites, as we all see every day in our spam folders. A standard button is certainly going to give online predators a bigger target to aim at, and any successful "capture" is going to be a vulnerable child. There are a number of ways of faking the button and delivering vulnerable children directly into the hands of online predators.
    Rather than relying on a recognisable button to click on, it is probably better to educate children about Ceop and the problems of online predators.

  • Comment number 2.

    Jim Gamble came over as a bully himself on Radio 4 this morning. Why should he be in a position to demand what is basically a free advertising link to his organisation, when the site is multinational and operates in a number of different social and legal environments?

    The "button" is just a simple link to the Ceop website;
    Jim could pay for an advert for that if he wanted to.

    There are many ways to tackle this and other issues, quangos shouldn't expect or be given the right to impose their chosen solution. Similar things happened with the IWF and with food labelling "traffic lights" - other solutions, opinions and agencies are available.

    I would assume that Facebook et al are better placed to know who is connecting to their site from where and therefore be able to give a more productive first response to a user via their existing "report" features.

  • Comment number 3.

    Bebo has always been aimed at children, but Facebook was originally for adults and, as you point out, has a minimum age of 13. The issue of parental responsibility is also mentioned - the internet/world-wide-web was created by adults for use by adults and, as such, any responsible parent should be supervising what their child is doing on it very carefully.

    It seems that if you shout "think of the children" loud enough and long enough, nobody dares to suggest that you might be wrong, for fear of being branded a paedophile. Why would Facebook want to engage in a hiding to nothing, when they are already happy that they are being as responsible as they can?

    Facebook is NOT responsible for your children. You are.

  • Comment number 4.

    Education is the only way to aid people in becoming less vulnerable.
    All grooming is designed not to raise alarm bells, or to make victims unwilling to raise them themselves.
    With bullying, the best people for children to talk to are their parents.
    Some 'magic button' will not help. Truly bullied people (speaking as one myself from my school days; I'm still left with a nervous disposition to this day) will not reach out to the police. It's hard enough to talk to parents.
    Parenting is a hard job, but this is part of it: recognising that something's awry with the child, checking up (in a non-intrusive way wherever possible), and sharing anecdotes on a subject to let a child open up and feel free to talk with no comeback. But teaching them to do what's necessary for their own safety is the only truly safe way. Safety cannot be imposed from outside.

  • Comment number 5.

    What nonsense. No children ever come to any physical harm online. Harm only ever results if children carry over their online experiences into the real world.

    And that, surely, is what parents are for: to make sure that their kids know not to take sweets from strangers, and not to arrange to meet strangers in the park at midnight. All this fuss about Ceop is yet another example of a world in which no-one wants to take any personal responsibility.

  • Comment number 6.

    I agree with cynicaleng's comment about how Jim Gamble came across in the Radio reports this morning. I was horrified to hear how it was presented as it came across as pure bullying on an "anti-bullying" BBC report... It reminded me of the way relatively small groups can use the threat of public opinion to get their way, which is not exactly something we need to be teaching our kids (just shout loudly over others, tell everyone else they are wrong and you are right, and you'll get your way). It also totally undermines the good work that Ceop are doing.

    Right or wrong, facebook have their reasons behind it, whether it's commercial or ethical or technical. If you (as a parent) don't like that they don't support this method of reporting, exert some parental control and explain to your kids why you only want them on the other site. Educate them. Let facebook, etc, decide that their userbase wants this so they will make a commercial decision. It's how we do things in a free economy.

    It is also worrying as I've recently heard more and more teens complaining about being bullied by "report happies": kids bullying them by reporting them to various sites (facebook, neopets, etc), making allegations and getting their accounts suspended temporarily as a way to victimise them. We need some sanity here and to teach our kids responsibility for their own actions, whether it's the impact of bullying others or how to find help if they are feeling pressured. Most of all, we have to teach our kids how to speak to us.



  • Comment number 7.

    I am bemused by some of the comments here. Surely everyone agrees that this is a very real step forward in promoting child safety online.
    Whilst of course parents have a duty of care, it is impossible to monitor your children's online activity constantly.
    Jim Gamble did NOT come across as a bully - he came across as an eloquent individual who understands the dangers and is trying to promote a solution which ALL social networking sites should readily sign up to.

  • Comment number 8.

    Sounds like Jim Gamble is trying to make a name for himself; he clearly has an agenda, and the vocal minority of Daily Mail readers will of course support it blindly.

    I don't really see the point of it though. If he's trying to protect all children, all the time, then why not encourage all software developers who develop browsers to have some kind of built-in function to report these kinds of things, not just a couple of token (admittedly large) websites...

  • Comment number 9.

    Surely he should promote the Virtual Global Taskforce rather than specifically CEOP? This is a pan-national effort.

  • Comment number 10.

    Of course, CEOP owns the VGT, and on the VGT website there is a link to help find Madeleine McCann. Unfortunately, CEOP has been careless and let this domain lapse, and it now links to sex webcams!
    There is no contact information on the VGT site to alert them to their error.

    On reflection is this the sort of organisation Facebook should be promoting? And is this quango a good use of taxpayer money?

  • Comment number 11.

    Facebook already has its own reporting system and they are in a much better position to judge as they have access to everything posted via their service, something that no third party should ever be allowed access to.

    This CEOP organisation has no right to assume they are the best guardians, and even less right to accuse those who choose other methods of negligence. They have neither the resources of Facebook to investigate these reports on a worldwide basis (and it would have to be worldwide for consistency's sake) nor the authority to force any other company to act on its findings.

    The other thing no one has mentioned yet is that the button could itself be used to bully: a large number of abuse claims every year throughout the world are withdrawn or proven to be false once investigated (I'm not trying to say that MOST are false, but the percentage is large enough to indicate an issue). I would sooner trust Facebook, who have access to all the information, to come to the correct finding and to assume innocence until proven guilty than CEOP, if only because as a business Facebook has a need to come to the right conclusions in order to maximise its income and reduce bad PR, whereas CEOP do not.

  • Comment number 12.

    Gamble just sounded like a bully himself in that interview.
    Personally, I'd just ad-block his adverts anyway.

    Online responsibility for children lies with the parents, end of discussion.

  • Comment number 13.

    Having seen the 'bullying' that occurs on child-specific websites - where false accusations made by one child against another get them suspended or barred, unable to play the games they enjoy - and it IS bullying, just one step removed from the playground ('You can't play with us, we want that toy, if you don't do what I say I'm going to tell Sir', etc.) - I really cannot see the point in yet another organisation being allowed to 'force' its way onto another organisation's fairly well self-policed site, thereby increasing its public profile and income, under the pretence of 'child protection'.

  • Comment number 14.

    "Surely everyone agrees that this is a very real step forward in promoting child safety online"

    I don't, and it doesn't sound like many other people here do either. If you do, then perhaps you could explain why you're so sure that Jim Gamble's self-aggrandizing efforts are better than Facebook's existing systems.

  • Comment number 15.

    I'm on the side of FB on this.
    In cases of reported bullying, Facebook claim that "Facebook staff analyse all reports of unsuitable content on the social networking site. Much of it is removed immediately while some users receive a warning."

    That's more direct and effective than the CEOP response; if you click CEOP's button (there's a demo on their website) and then on Bullying, all you get is the ChildLine phone number.

    I would assume that if a user reported 'inappropriate' mails, comments or material to FB, then the FB centre could involve the police or IWF if they thought it necessary?

    I do think that CEOP has a useful role to play - the emailings of 'most wanted' sex offenders being sought by the police, for example -

    but it is not the only way of safeguarding young people.

    It would be better if CEOP organised awareness sessions etc. for the staff of the FB centre that deal with complaints, and built bridges with other sites, rather than demand that they be seen as the only 'solution'.

  • Comment number 16.

    A subject as important as child protection.

    And what does Facebook come up with? A cheap jibe at a competitor.

    So glad to see they take the subject seriously.

    Although FB is meant to have a minimum age of 13, I notice that a lot of the apps seemed to be aimed one hell of a lot younger - certainly visually. Consequently, parents will often let younger children use it, despite the 13 restriction, because in reality it looks like it caters for younger children.

    Maybe I am too cynical from being in the media for 30 years, but this is like the video games that are aimed for people over 18 - yeah, right! Sure they are. That is why they do action figure merchandising to go along with them.

    Sites like Facebook WILL get used by younger people. FB are very well aware of this and should take responsibility rather than hide behind their terms and conditions.

  • Comment number 17.

    Gurubear - assuming you have also read my previous post (15).
    I agree with you about the importance of child protection; I would have concerns if FB was doing nothing, but they have set up a centre dedicated to dealing with complaints (more than Bebo seem to have done - hence their just adding a CEOP button as an easy way out).

    FB's response to bullying can be more direct (removal and warnings) than CEOP's giving of an [already busy] telephone helpline number.

    There's more than one way of providing child protection; many local authorities have in-house services rather than using a distant national service, for example. What is important is bridge building and establishing working relationships between the different organisations.

    As for all the brightly coloured apps, sadly (sigh) many of my friends on FB (all in their mid 30s to mid 50s) seem to enjoy playing these.

    It is possible youngsters lie about their age to gain access, but short of FB/Bebo/MySpace etc. monitoring all mails/comments/posts (think of the outcry over invasion of privacy from users if that happened - plus the logistical nightmare given hundreds of millions of users) I'm not sure how that can be stopped.

    The argument that parents should be aware of what sites their child accesses does have some force; many adults use the web every day at work and are quite computer literate. It's not true that 'the kids know more than their parents'; some will, but by no means all.

  • Comment number 18.

    I'm with facebook on this one. It's their site, their rules. We know what'll happen if it gets too kid-friendly - the adults on the site will move elsewhere. Then the kids will follow, as the new site that all the adults are using seems cool, and so we're back where we started, only facebook's shareholders lost out.

    Ohh and gurubear - those games ARE aimed at the 18-30's. Most online games are (even if they don't state it), "pro" players will often restrict access to clans/guilds to 18+. This isn't because they don't want little kids to play online games, but because under 18's tend not to have control over their own lives, let alone their own computer and internet connection. So kids might try to play, but they'll be restricted by age by other players anyway. An annoying 10-year-old will quickly get booted from a privately-run counterstrike server.

    The action figures sell to this 18-30 age-group as birthday and christmas presents, as the sort of things you buy for the person who's already got everything. Places like Gadget Shop also do pretty well at this sort of thing. And yes, as long as this stuff will sell to their target audience, they'll sell it.

    Again, the risk these games take by being too child-friendly is that they'll alienate their target audience - the 18-30 year olds with no children. It's the audience with the most disposable income; little kids need their parents to buy games for them, parents don't have as much money or time, and the older audience tends not to play as many computer games.

  • Comment number 19.

    I have to disagree with the comment made above that Facebook is designed to appeal to young children. I enjoy playing games on the site, as do many, many others. (I am 45+, am qualified to degree level, and have children in their teens and early twenties.) It is a great distraction and method of relaxing. I have also rekindled old friendships through vetting and choosing to accept friendship requests from people I had otherwise lost touch with, and vice versa. I would never use sites such as Bebo, MySpace or Neopets, as I believe they are aimed at the younger generation. It is also a great place to share photos with friends and family that you may not see very often.

    So no, I do not believe it is designed to appeal to youngsters; I simply like colourful, cheerful applications. I also know from experience that when I became aware of what I felt to be an abusive 'page', I clicked the report abuse button and within approximately two hours the page was gone.

    Parents DO have a responsibility to be aware of their children's activities online, to educate them, and to initiate parental controls as to what is/is not displayed. We cannot perform miracles, but it is no different to any other activity our children partake of... when they say they are going to the cinema with their friends... or visiting a friend's house... or they leave the house in the morning to attend school. We do our best to instill a sense of right and wrong, an awareness of what is safe/not safe... and to know who to turn to if they are concerned about something, even if it isn't us...

  • Comment number 20.

    Ohh one last thing. If anyone doubts my last comment, try googling

    "This is why 11yr olds shouldn't raid"

    2 million hits over 3 years... that's around 60000 hits a month. 6 times the number of hits on Ceop.

    The kids do it all themselves....

  • Comment number 21.

    Unfortunately, CEOP has been careless and let this domain lapse and it now links to sex webcams!

    Good grief. That really is spectacularly inept.

    Anyone still think CEOP know what they're doing better than Facebook?

  • Comment number 22.

    Urghh, I just checked; you're right, CEOP is actually linking to dodgy sites.
    Facebook certainly aren't going to be entertaining any requests on this one, then; it's a security risk!

  • Comment number 23.

    Facebook has no obligation to protect under-13s as they are already forbidden by the ToS. Just because it's a popular website doesn't mean that adding a link will make any difference. The real responsibility for protecting children from online dangers belongs to the parents.

    The web is never, ever going to be 100% safe and secure for children who don't know what they're doing. The most important thing is that children themselves are educated and given an understanding of what can go wrong. This responsibility falls to the parents, not a single link on a site they may be using. When I was young I was taught about not going with strangers, bad places, playing in roads. This is no different.

    the "instant access to advice" should be from the child's parents, who should research the dangers of the web before allowing them on. Generally I'm not bothered if a parent fails to look out for their children, until they pressure the government to do it for them, resulting in a law which restricts everyone.

  • Comment number 24.

    It seems CEOP has now updated the VGT web site and removed the link.

  • Comment number 25.

    You can still see the previous version of the VGT site in Google's cache, and see the site it linked to.

    It seems there are the beginnings of a story here; so far we know that Jim Gamble's organisation:
    - doesn't understand how to renew a domain registration,
    - didn't notice a porn link on its own front page,
    - responds to a complaint of bullying by fobbing the victim off with a phone number for someone else.

    It seems pretty clear that Jim Gamble's campaign is entirely for the promotion of Jim Gamble, since his sites show no interest in actually helping children, nor any degree of basic understanding of the internet. Does anyone know where to find a journalist that can follow this up? :-)

  • Comment number 26.

    Agree with most of the comments here. Just another case of a poorly run government agency interfering with business. Facebook has plenty of ways of reporting abuse and I'm sure they know what they're doing.

    Whilst we're on the topic, the "CEOP REPORT" button (yes that's what it actually says) with the funny stickman logo doesn't make it clear what it does! Have they even tested it properly to see if it works? ... I'm sure that the likes of Facebook could enlighten them on how to test if a certain functionality is understood by users or not.

  • Comment number 27.

    I don't see how CEOP could do what they claim. If someone posts something on my wall and I report it to CEOP, they can't see my wall, so what can they do? Even if they could see my wall, they're still going to have to ask Facebook to do anything about it. I can't really see how introducing a middle-man helps, except possibly adding a third party who can audit Facebook's responsiveness to reports.

  • Comment number 28.

    In my view it should always be:

    User -> Site -> CEOP

    This is how many sites do it, it's how online games do it, and it's how Facebook probably would actually deal with a serious case. CEOP is inept and out of its depth, and this is nothing more than PR. Quite frankly it's ridiculous to think that you'd report it to CEOP, who'd then go back to the site to ask them for the content.

    Bebo has simply managed to make its site more confusing for users. Now users (instead of Bebo, which can now monitor less) have to decide whether it's:

    Bebo: Bullying / Harassment / Sexually explicit content / Vulgar language / Network abuse (e.g. spam, hacking, viruses)

    or

    Ceop: Cyberbullying / Hacking / Viruses


    So for each of CEOP's specialities, it could already have been reported and passed on. In short CEOP visibly adds nothing to the system.

    PR and nothing more.

  • Comment number 29.

    When will people like Ceop and their supporters realise they cannot and should not create a nanny state?!

    If a parent cannot or will not supervise their child in the use of a computer they have no business letting them use one at all, but because it's a convenient baby sitter and the child is stuck in the same place for hours parents seem to think it's marvellous just to let them get on with it.

    Responsibility, much like charity, begins at home.

    Oh and gurubear you are right of course with regards to the way 18+ games and the like are marketed but who's more at fault, the ill-informed parent for buying the game and then allowing their child(ren) to play it or the equally ill-informed shop staff for selling the game to said parents?

    I know which I'd pick.

  • Comment number 30.

    When will people like Ceop and their supporters realise they cannot and should not create a nanny state?!

    If a parent cannot or will not supervise their child in the use of a computer they have no business letting them use one at all, but because it's a convenient baby sitter and the child is stuck in the same place for hours parents seem to think it's marvellous just to let them get on with it.

    Responsibility, much like charity, begins at home.

    Oh and gurubear you are right of course with regards to the way 18+ games and the like are marketed but who's more at fault, the ill-informed parent for buying the game and then allowing their child(ren) to play it or the equally ill-informed shop staff for selling the game to said parents?

    I know which I'd pick.

  • Comment number 31.

    I've watched this with interest, especially the comments about it confusing the issue (ok so who does the kid talk to? facebook? ceop? their parents? no one?) and delays in Ceop having to tell facebook about a problem, Ceop getting all media nasty at facebook for not dealing with things their way, etc, etc.

    I think a much better approach would be for the social networking sites to voluntarily come up with a common interface to their abuse systems. They may have internal systems based on their needs, sides, target audience beyond that but... If you had a "report abuse" method that had a common look and feel, common icon for it, made it straightforward to find no matter what site you were on - surely that would go a long way to solving what a lot of people are indicating they'd like - a way for their kids to feel they can speak up if someone bullies them online?

    This would then let Ceop get on and do what it should be doing: acting as a central resource for the sites to go to if their internal procedures find something which requires police intervention.

  • Comment number 32.

    Calaba wrote:

    Ohh and gurubear - those games ARE aimed at the 18-30's. Most online games are (even if they don't state it), "pro" players will often restrict access to clans/guilds to 18+. This isn't because they don't want little kids to play online games, but because under 18's tend not to have control over their own lives...

    ###

    Sorry, I have been marketing these sorts of products for years and years. And worked on games from Takeru (many years ago) to Wurm Online now.

    There is a massive difference between what they say publicly and in their terms and conditions, and what they bank on actually happening.

    It used to be that with people under 18 you sold to parents because the young person did not have disposable incomes. That has changed now, and many of the online games have very intentionally moved their payment system to include those that allow young people without paypal or credit card access to pay online.

    At Wurm Online, where we have young teens as well as adults play openly, we are constantly nagged to have young person friendly payment systems like all the other online games.

    And we know these young lot play loads of your adult only games because they chat about them constantly; bores me silly, in fact! :)

    As for having no control over their own lives: young teens often have laptops and use them from their bedrooms, and we find the average 14-year-old spends twice as much time playing as most adults (and Wurm is a VERY time-consuming, difficult game), and they are often much more highly skilled than the adults. If there is an area where they can fall down (though adults do this too) it is in getting on with people. Wurm is a game where you can set up villages with others, terraform the land, build houses, farm and so on, and it works best when people get on with each other and are patient with each other. Even on the warring Wild server.

    To be honest, the other worry I have had is with some of the excessive time of play. I am not the only person who has nagged younger village members to go and do their homework because we are aware they are spending too much time in the game.

  • Comment number 33.

    This is just more rubbish from people who obviously don't actually use the sites themselves. Everything is lockdownable or reportable in the normal senses.
    If you don't want to be seen or don't want to be sent emails from people you don't know, you can set things up that way. If you do get dodgy emails or messages, then you can report people. If someone is hassling you on a group, then you can report them. If someone started up a group saying things about you, you can report them and it will be taken down.

    Where is the need for a stupid button that only takes you to another website anyway?

  • Comment number 34.

    ravenmorpheus2k wrote:

    Oh and gurubear you are right of course with regards to the way 18+ games and the like are marketed but who's more at fault, the ill-informed parent for buying the game and then allowing their child(ren) to play it or the equally ill-informed shop staff for selling the game to said parents?

    I know which I'd pick.

    ###

    I agree, the first responsibility is with the parent. However, many parents are lousy at this, or just give in a lot. The choice then is either someone else has to take responsibility, or we let the kids suffer the consequences - that latter just does not seem right.

    The companies have a role - if they are earning income (either directly or from advertising) because at least part of their user base is young, then they MUST take responsibility. They are earning from it after all.

    And the government has a role here too. This government (and previous governments) have been accused of the Nanny State because they do something that parents should be doing. The problem is, however, that many parents often are NOT doing their job, if you will, so what other option is there?

    Take the case of the parents who objected to Jamie Oliver's school dinners campaign and were handing over pie and chips to their kids across the school wall (with some tabloid encouragement, I suspect). How do you deal with that? You can't ignore it, but if you try and solve it you are accused of interfering. No win.

  • Comment number 35.

    Just out of interest, is there any evidence that CEOP have actually helped to prevent a single child from being abused?

  • Comment number 36.

    Gurubear - good points, but first, why does a 14-year-old have a laptop in their bedroom, with internet access? Even if they could afford the games from a part time job, that seems like poor parenting to me. The buck pretty much stops there, as parents are far more able to solve this than ANY government agency. And they can do it for free, without wasting taxpayers money.

    Meanwhile, at what point do we expect kids to accept responsibility for themselves? 14? 15? By 16, they've finished compulsory education anyway.

    To me, 14 seems fine. By then they've started their GCSEs. It's their own fault if they fail.

    Anyway, I think the point you're trying to get at with regards to age is a little like teenage magazines. If you "aim" the magazine at the 16-18-year-old age group, it's going to be picked up by 11-14 year olds. It's just how it goes. I reckon it's the same reason we have such issues with alcohol in this country; by setting the age limit at 18, it practically guarantees that 15-16 year olds will want to drink.

    But it's the same with games - aim for the 18-30's, and you'll have kids playing too. It's inevitable. But you can't turn back the clock, you can't un-invent 18+ games. They're still coming, like it or not. The blame on this one is with the parents - as with the Jamie Oliver example you gave, without parental support you can't do anything about it.

    Try to make facebook too kiddy-friendly, and the 18-30 group will just go elsewhere which is LESS kiddy friendly. Quickly followed by everyone else.

  • Comment number 37.

    OK, it's the responsibility of parents to supervise their children on facebook et al until the child reaches 18, when they are legally an adult. I can't imagine any 16-18 year olds liking their parents hovering over them, but the child should have been brought up to understand that they can talk to their parents about ANYTHING.

    Anyway, off-topic: Rory, it is now well over a week since you mentioned Ubuntu. I have waited and waited to hear your comments, read the article and see what you thought about it. Clearly when you made the first "24 hours with Ubuntu" blog you were just paying lip-service to it, and clearly didn't mean it when you said you were going to do a full write-up about it. To say I am surprised would be a lie. It was clear from your original blog that you had no real interest in it, and it seems now you're not mentioning it at all anywhere, in the vain hope that your readers simply forget you ever mentioned Linux.

  • Comment number 38.

    Calaba wrote:

    good points, but first, why does a 14-year-old have a laptop in their bedroom, with internet access?

    ###

    Oh, I quite agree. For instance we have PCs downstairs in our house (except in my office). That way we can have some vague chance of monitoring possible problems and it has the advantage that the family still share the occasional conversation.

    My point is, whether it is Facebook, Bebo, a game or whoever, if the company KNOWS that their service is used by people younger than intended, especially if it is thousands and thousands, then they have to take responsibility somewhere down the line. You should simply say, well, they should not have been on anyway. They are on, and will continue to be on, with or without parental knowledge.

    It is not a simple issue, and the opinions of the users can make it an even bigger headache. I quote one user on Wurm Online who was furious that he had been muted for swearing on the main chat channel:

    "I thought this was meant to be a game for adults!!!! Why are you censoring me?" Or, ummm, words to that effect.

    The answer was simple - An adult is someone who knows when not to swear.

  • Comment number 39.

    "38. At 5:06pm on 19 Nov 2009, Gurubear wrote:

    Calaba wrote:

    good points, but first, why does a 14-year-old have a laptop in their bedroom, with internet access?

    ###

    Oh, I quite agree. For instance we have PCs downstairs in our house (except in my office). That way we can have some vague chance of monitoring possible problems and it has the advantage that the family still share the occasional conversation.

    My point is, whether it is Facebook, Bebo, a game or whoever, if the company KNOWS that their service is used by people younger than intended, especially if it is thousands and thousands, then they have to take responsibility somewhere down the line. You should simply say, well, they should not have been on anyway. They are on, and will continue to be on, with or without parental knowledge.

    It is not a simple issue, and the opinions of the users can make it an even bigger headache. I quote one user on Wurm Online who was furious that he had been muted for swearing on the main chat channel:

    "I thought this was meant to be a game for adults!!!! Why are you censoring me?" Or, ummm, words to that effect.

    The answer was simple - An adult is someone who knows when not to swear."

    Whilst you do have a fair point, the buck has to stop with the parents.

    And organisations like Ceop have done nothing for parental responsibility here in the UK.

    Facebook have sufficient measures in place already. Ceop is irrelevant in terms of the "service" they can provide and a button isn't going to change that.

    The real responsibility should be taken by parents, and yes it is that simple.

    If parents are concerned about their children's surfing habits then they should do something about it, not look to government organisations who in the end only impede those of us who don't have a problem with surfing the web by way of legislation and whatnot, to do something for them.

    And quite frankly I'd be more concerned if my child was "hanging out" on street corners each night, being anti-social, as many are, than if they are on facebook where ceop don't have a button.

    Parents in this country need to take responsibility instead of looking for others to blame. In fact that applies to people in general here in the UK: we have an increasing culture of blame (and of greed associated with that blame - take the recent case of the 3-year-old who was awarded compo), imported from the US no doubt, and it must stop. Everyone is responsible for their own actions and for their own children, not someone else's children, and this idea should be passed on to the younger generations by their parents.

    BTW I apologise to all for the double post this morning, I was posting via BlackBerry and the signal was interrupted (on the train going through a "dead area") and I edited the post as I didn't think it would have made it through the system. Odd that the moderators let it through twice though...

  • Comment number 40.

    This comment was removed because the moderators found it broke the house rules.

  • Comment number 41.

    I'm an ICT teacher and I have to deal with this sort of thing every year. I promote internet safety to all year groups, but still children give out personal information to strangers.

    SheffTim's comments are disgraceful; CEOP are doing an unbelievable job. I have been trained by them to teach children about internet safety and their resources are amazing. What they taught me would make anybody's skin crawl. They save so many children in the UK and abroad from dangers because of their training, their dedicated staff and these safety buttons on websites. They do work!!!

    Yes, parents should be made aware, but it's only a button!!!! What is all the fuss about? If this helps one child then it's done its job.
    Facebook should be ashamed!

    Button = safe kids = nasty disgusting people locked up.

  • Comment number 42.

    One or two comments here appear to be more about personal attacks than constructive debate. Some (not all) contributors to this debate seem to have a greater understanding of the internet than the psychology of offending or child protection. It seems to me that the button does not replace the methods used by the sites to inform and protect users but simply adds to them, i.e. gives the user a choice. What is wrong with that?

  • Comment number 43.

    Hacker Jack is right, I completely agree: Facebook already has a system that works. And what about Bebo? They're closing down soon anyway - heck, they've already closed down their Australian operations, where Facebook rules.

    Facebook should not bend to every little whim of our stupid country - I mean, have you even looked at our laws?

    Facebook has its own perfectly working report system; it works, I've used it.

  • Comment number 44.

    Okay, here's something interesting: Ceop is a government agency, correct? So why are they registered as trading on Dun & Bradstreet, and why is the Ministry of Justice also registered as trading? I smell a cover-up.

  • Comment number 45.

    The ceop logo looks pretty illuminati. :/

  • Comment number 46.

    As this is back in the news, I wanted to add to the already interesting thread started here.

    Facebook has become the leading social networking site and appeals to all ages. They have checking and reporting facilities; however, age verification is very easy to get around, and therefore younger users will always slip through.

    On other sites appealing to younger users there is the recognisable CEOP button, which is being promoted as the uniform panic button to press should anyone feel threatened. Facebook seem happy to take the traffic, sign up the users and cash in on the page impressions they generate, so is it so unreasonable to expect them to join forces and put a button on?



 
