Written and recorded by Robert Edwards, Law Hound
Welcome to this cybercrime training module from Data Law. My name is Robert Edwards, and I'm a consultant with Law Hound. Previously I was a counterintelligence and IT security specialist with Her Majesty's Government, before I moved on to lecturing and providing legal training on developments in cybercrime, online harassment and cyberbullying using social media. This session looks at developments in cybercrime in relation to online harassment and cyberbullying using social media, and in particular section 103 of the Digital Economy Act 2017. The Office for National Statistics report on internet use in 2016 showed that use of the internet for social networking had grown from 45% of the population in 2011 to 63%.

What is cyberbullying? Online harassment and cyberbullying can take a wide variety of forms, including trolling, sending menacing or upsetting messages, identity theft, something called doxing, which is making a third party's personal information available online, cyberstalking, and revenge porn. Many of these offences are committed using social media. The CPS describes social media as the use of electronic devices to create, share or exchange information, ideas, pictures and videos with others via virtual communities and networks; so, for example, messages, emails, texts and other forms of electronic communication. It's difficult to get exact figures for the number of people who have been victims of cyberbullying. While it seems more prevalent among the under-eighteens, affecting about 50% of girls and 40% of boys, it's also accepted that adults will fail to report even more serious cyberbullying.

There are a number of criminal offences which can be involved, including stalking, harassment, sending malicious communications, and improper use of a public electronic communications network. These potentially include the Public Order Act 1986, section 5: threatening, abusive or insulting words or behaviour, writing or any visible representation, likely to cause harassment, alarm or distress within the hearing or sight of a person; and the Computer Misuse Act 1990: hacking into someone else's account.

The CPS refers to four categories of potential offences. I don't intend to go into those in any great detail, but let's have a brief look at some of these potential offences.

Where threats are made, offences could include the Offences Against the Person Act 1861, section 16: threat to kill. The Protection from Harassment Act 1997: conduct which amounts to harassment or stalking, and either section 4, putting another in fear of violence, or section 4A, stalking involving fear of violence or serious alarm or distress. The Malicious Communications Act 1988, section 1: threats of violence to the person or damage to property. The Communications Act 2003, section 127, prohibits the sending of messages of a menacing character by means of a public electronic communications network if they create fear or apprehension in those to whom they are communicated, or who may reasonably be expected to see them, as discussed in the Chambers case; the citation is on screen and in your notes.

Communications which target specific individuals: when a specific person is targeted, the offence, or offences rather, may include the Protection from Harassment Act 1997, sections 2, 2A, 4 or 4A, relating to harassment or stalking. The Serious Crime Act 2015, section 76: repeated or continuous controlling or coercive behaviour in an intimate or family relationship, including, for example, monitoring or tracking using electronic communication, including social media, and/or by the use of spyware or software. The Criminal Justice and Courts Act 2015, section 33: disclosing private sexual images without consent, or revenge porn, to cause the victim humiliation or embarrassment.
This will usually be an ex-partner, who may, for example, post on social media or share by text or email, and the offence will include those who forward on, for example by a retweet, without the victim's consent, if at least one of the purposes of forwarding was to cause distress to someone shown in the image. It doesn't include someone who forwards an image on if there was no such intent, for example because they thought it was amusing.

The Protection of Children Act 1978: taking, distributing, showing or publishing indecent photographs of a child, section 1; or section 160 of the Criminal Justice Act 1988: possession of an indecent photograph of a child. Section 160 could potentially be used where the person concerned is aged under 18, even though they're over the age of consent. The Sexual Offences Act 2003, where, for example, the images are used to coerce a person into sexual activity; these may include a section 4 offence if, as a result of the coercion, adult sexual activity has actually taken place, or section 8 or section 10 offences if, as a result of the coercion, sexual activity has taken place with a child under 13 or under 16 respectively. That's the simple overview; obviously there's more detail of the offences within those sections. And generally, blackmail or attempted blackmail as applicable: if, for example, the victim is persuaded to take their clothes off or commit a sexual act using a webcam on a social networking or online dating site, enabling the defendant to record this, and the defendant then threatens the victim with publication if the defendant is not paid money.

Breaches of court orders and statutory prohibitions. These are offences which particularly relate to the Juries Act 1974, sections 20A to 20G as amended, including disclosing information obtained by research to another jury member, and disclosing information relating to jury deliberations; the Contempt of Court Act 1981: juror misconduct which is not covered by the statutory offences; the Sexual Offences (Amendment) Act 1992, section 5: publishing material which may lead to the identification of a complainant in a sexual offence; and the Children and Young Persons Act 1933 and the Youth Justice and Criminal Evidence Act 1999: automatic and discretionary reporting restrictions relating to those who are under 18.

Grossly offensive, indecent, obscene or false communications. Most communications which fall into these categories will be considered under section 1 of the Malicious Communications Act 1988 or section 127 of the Communications Act 2003. Section 1 of the Malicious Communications Act, as amended, prohibits the sending of an electronic communication which is indecent or grossly offensive, or which is false or which the sender believes to be false, if the purpose, or one of the purposes, of the sender is to cause distress or anxiety to the recipient. Sending the communication is sufficient, since it's not necessary to prove that the communication actually reached the recipient; so it entirely hinges on intent and action. However, depending on the indecency or obscenity level of the communication, a defendant may be charged with alternative offences, such as under section 2(1) of the Obscene Publications Act 1959: for example, in the GS case, the defendant instigated a conversation which related to his sexual fantasy involving very young children.

Section 127 of the Communications Act 2003 makes it a summary offence to send, or cause to be sent, through a public electronic communications network, a message or other matter that is grossly offensive or of an indecent or obscene character, or a false message for the purpose of causing annoyance, inconvenience or needless anxiety to another.
The defendant must either intend the message to be grossly offensive, indecent or obscene, or at least be aware that it was so, which, following the Collins case (the citation's on your screen; that's DPP v Collins), can be inferred from the terms of the message or from the defendant's knowledge of the likely recipient. Social media is regarded as a public electronic communications network because, as in the Chambers case, a message on a platform such as Twitter is accessible to anyone who has access to the internet. Again, sending the communication is sufficient; it's not necessary to prove that anyone saw the communication or was offended by it.

The Serious Crime Act 2007: a defendant who encourages others to commit a communications offence could be charged under the Serious Crime Act 2007. By way of example, a case of someone who encourages posting, tweeting or retweeting a grossly offensive message, or provides personal information making it much easier for others to target the victim. This is because the result of this behaviour can be a campaign of harassment against the victim.

However, whilst there are a number of potential offences, the current position has been criticised on the basis that some of these offences predate the huge uptake in use of social media, and that social media providers themselves lacked self-regulation. The Digital Economy Act 2017, section 103, is intended to shift at least some of the responsibility back onto social media platform providers, with more self-regulation. Under section 103(7), a code of practice will be issued to those social media platform providers providing services to people within the UK, and I'll refer to these simply as the providers.
Before issuing the code of practice, there will be a consultation with social media providers and any other persons or individuals it is felt are appropriate, in accordance with section 103(6). That code of practice will, under section 103(2), detail what action will be appropriate for the providers to take in relation to conduct defined under section 103(3) as engaged in by a person online, directed at an individual, and which involves bullying or insulting the individual, or other behaviour likely to intimidate or humiliate the individual. Under section 103(5), the code of practice guidance will also cover arrangements as to how users will be able to notify the providers that the relevant conduct, as defined under section 103(3), has taken place or is taking place, that's section 103(5)(a); there's a notification process, which is section 103(5)(b); it also covers how this will be incorporated into the terms and conditions for using the providers' services, under section 103(5)(c); and information about how the providers will deal with the relevant conduct. And under section 103(8), there are provisions made for revisions. However, the guidance is not intended to substitute for the actions which could be taken under law, for example prosecuting offences under section 33 of the Criminal Justice and Courts Act 2015 in a case of revenge porn.

Providers' current terms. Most of the more widely used providers currently have something in place to deal with unacceptable behaviour, so let's have a quick look at two of the main ones, the most well known, as at the beginning of October 2017. Twitter has the Twitter Rules, which are to "protect the experience and safety of people who use Twitter" and to set some limitations on the type of content and behaviour that it allows.
They certainly cover abusive behaviour, which would encompass much of the relevant conduct referred to in section 103 of the Digital Economy Act 2017, including direct or indirect violent threats, harassment and hateful conduct. However, what is interesting is that while the rules say that all users must adhere to them, failure to do so may result in the temporary locking and/or permanent suspension of accounts. Otherwise, as you might expect, the provider takes the view, as expressed in its terms, that the user understands that they may be exposed to content that might be offensive, harmful, inaccurate or otherwise inappropriate, or, in some cases, postings that have been mislabelled or are otherwise deceptive. By declaring that all content is the sole responsibility of the person who originated the content, Twitter cannot take responsibility for such content.

Facebook has Community Standards, which generally encourage respectful behaviour and which would also cover much of the relevant conduct referred to in section 103 of the Digital Economy Act 2017, including direct threats, bullying and harassment, and criminal activity. Facebook does include information about how to report abusive content or something which the user believes violates the terms. Action which Facebook says it may take depends on the severity of the violation and the person's history on Facebook, which would usually mean a warning for a first violation. Interestingly, Facebook also adds that reporting something doesn't guarantee that it will be removed, because it may not violate their policies, and not all disagreeable or disturbing content violates their Community Standards; on this basis, they continue to explain how the user can customise and control what they see.

Of course, without seeing the code of practice, which is not yet available, it's difficult to say whether providers are currently doing enough, but their existing policies and terms would appear to be insufficient.
So does section 103 go far enough? Well, only time will tell how effective the regulations will be and what impact they will actually have on social media and on the crimes which occur, many of which, of course, still go unreported. And that concludes this session for Data Law. Thank you for joining me, Robert Edwards, on this session.