
UPDATE: Tasteless GIF removed

A Vancouver Island father is concerned after discovering the GIFs available in his phone’s texting function
Devin Perfect discovered this GIF option when texting on his iPhone. (Screenshot)

UPDATE: June 15

Apple has removed a tasteless GIF that prompted a Vancouver Island dad to speak out.

While Apple has not released any official response to this article or our request, the GIF mentioned below has since been removed from its iMessage GIF search function.

Since that GIF search feature was first released by Apple last year, the company has been plagued by problems with inappropriate GIFs.

In September 2016, porn was discovered in the available GIFs and Apple announced it was cracking down on the images available, an ongoing battle for the tech company.

As for the concerned dad, Devin Perfect said he is pleased the suicide GIF has been removed, but he thinks far more needs to be done.

“It is good that it is pulled, but it looks like it has happened before,” said Perfect. “As much as I appreciate that they pulled it, there needs to be more. Some form of filter or parental filter we can put on it.”

Perfect said he has done more of his own research and he was shocked at some of the other inappropriate images he discovered.

“I found out that there was porn at one point in the GIFs, and it appears there is no way to opt out of that feature or control that feature,” said Perfect. “So, I think about my kids using it and I want to have some kind of control over their access.

“There needs to be something in the technology that allows people to either opt out of that option or put a filter on there or something.”

Apple did not reply to our request for an interview by the time of posting.

—-

ORIGINAL: June 14

One father’s iPhone discovery has prompted a plea to Apple.

Devin Perfect was chatting with his siblings about nuisance deer when he went into the animated GIF portion of his iPhone text app.

He typed ‘kill’ into his phone and was dismayed by the options that popped up.

“I used the term kill to describe how I was feeling about the deer, to see if there was something in there, and three rows down I saw it and it hit my gut, I thought ‘whoa, this is really wrong’,” said Perfect.

The GIF that caught his attention felt all too familiar: a looped black-and-white clip of a notepad with the words ‘maybe suicide is the answer’ on it.

“I thought of Amanda Todd right away and then I thought, ‘How did this get into the package? When they introduced these GIFs, why is this an OK image to have?’” said Perfect. “It was quite disgusting to me and disturbing.”

As a father, Perfect fears this form of content could be used to bully someone who may already be struggling with depression and suicidal thoughts.

“I just wonder who is going to send that, who is going to use that text image,” said Perfect. “A friend of mine who is a lawyer said it is a lawsuit waiting to happen. What if someone texted that to someone who is having a rough time, not knowing where they are in their state of mind, and that image pushes them over the edge? It could be the trigger.”

It’s a fear that hits home for him as someone who has lost loved ones to suicide.

“It is just a really stupid image, at a time when we are openly talking about mental health and supporting people and having national campaigns. It is just so horrible, the connection to Amanda Todd. It’s one of those things where I feel someone needs to get Apple to get this removed.

“The technology today is wonderful, but there are so many ways kids are abusing it. Either it was a total miss by Apple or they don’t care. I wonder what other images are in their database; perhaps they need to do a review.”

Social media educator Sean Smith said he is not at all surprised to see that particular GIF come up in the options.

“The one thing about the internet is that people post and use stuff that in everyday life would be unacceptable,” said Smith.

“In the online space it is still pretty much a free for all.”

Smith said that the fact Apple is ‘allowing’ this content within its program is part of a long-running discussion on the responsibility, or lack thereof, of platforms to control the content available.

“The larger companies, Google, Amazon, Apple, what is their responsibility when it comes to stuff like this? Facebook is starting to crack down and Twitter has done some things, but the responsibility of the platforms is secondary, in my opinion, to educating our youth.”

For Smith, this is not about controlling the content.

“We need to educate our youth to become better digital citizens. We are not educating anyone, we are saying ‘here’s a grenade and go play with it’,” said Smith.

“We’ve spent a lifetime teaching our kids to be safe. If they are going to drive, we give them driving lessons and make them take a test and get a license so they can now navigate things safely, but with this vehicle we call the internet we simply hand it over and tell them to find their own way.

“That super highway can be just as dangerous as the real thing and we’ve seen that.”

Smith recently wrote a social media post about a young man who he says was ‘bullied to death’ on Snapchat.

“There is a desensitization and a lack of guidance in the online space. There is a physical separation between them and reality and they just do not understand. Eventually reality hits.”

Both Smith and Perfect agree parents need to educate themselves and step up to help their kids navigate the web and stay safe online.

“As a parent, I think it is important that parents talk about technology, how kids are using it, what is out there and how to use it, so we can all make better decisions to support our kids,” said Perfect.

“We, as adults and parents and mentors, need to be there to know where and when our children are using this content,” added Smith.

Apple did not reply to our request for an interview by the time of posting.


 

@carmenweld
carmen.weld@bpdigital.ca
