Kaspersky Lab has published a phishing report analysing the dramatic increase in cybercriminal campaigns designed to steal users’ Apple IDs and account information through fraudulent phishing sites that imitate the official apple.com site.
Cybercriminals use the fake Apple sites to trick users into submitting their Apple ID credentials, enabling the criminals to take over the account and access the personal data and credit card numbers stored in the victim’s iCloud and iTunes accounts.
From January 2012 through May 2013, Kaspersky Lab’s cloud-based Kaspersky Security Network (KSN) recorded an average of 200,000 attempts per day by users to access the phishing sites; a detection was triggered each time a user running Kaspersky Lab’s products was directed to one of the fraudulent sites.
This average is a marked increase over 2011, which saw only about 1,000 detections per day. Kaspersky Lab’s web antivirus module successfully detected the sites and prevented its users from accessing them; however, the rise in detections shows how commonly these scams are now used by cybercriminals in phishing campaigns.
Kaspersky Lab’s experts analysed the cybercriminals’ behaviour and patterns on a daily and monthly basis, noticing that fluctuations and increases in phishing attempts often coincided with large events from Apple. For example, on December 6, 2012, immediately following the opening of iTunes stores in India, Turkey, Russia, South Africa and an additional 52 countries, Kaspersky Lab detected an all-time record of more than 900,000 phishing attempts directing users to fake Apple sites in a single day.
Phishing Emails Posing as Apple:
The main distribution method used by cybercriminals to direct users to the fraudulent Apple sites is phishing email posing as Apple Support, with fake alias names in the “Sender” field, such as email@example.com. The messages typically request that users verify their account by clicking on a link and entering their Apple ID information. These emails are deceptively clever and professionally designed to appear authentic, using Apple’s logo and presenting the message in the formatting, colouring and style that Apple uses.
Another variation of these phishing emails is designed to steal Apple customers’ credit card information. It asks users to verify or update the credit card details attached to their Apple IDs by clicking on a link in the message. The link directs the user to a phishing site that imitates the way Apple requests credit card information from its customers, fooling users into entering their card details and other personal information.
Guidance to Users – Identifying Phishing Websites and Emails:
One way to distinguish between real websites and counterfeit ones created for phishing purposes is to look at the address bar of the website. While most counterfeit sites include the word “apple.com” as part of their address (URL), the address is not verified by Apple and contains additional text in the URL.
However, identifying phishing sites becomes harder when users can’t see the full URL address, which is typically the case when iOS users are running Safari on their iPhone or iPad devices. When users click on links from email messages on iOS devices the complete URL address is hidden from them when the page is downloaded and opened through Safari.
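The URL check described above can be sketched in code. This is a rough heuristic of my own, not part of the Kaspersky report: a genuine Apple URL’s hostname is apple.com itself or a subdomain of it, whereas phishing URLs often merely *contain* the string “apple.com” somewhere else in the address.

```python
from urllib.parse import urlparse

def looks_like_apple(url: str) -> bool:
    """Heuristic check: the hostname must be apple.com or end in
    '.apple.com'. Merely containing 'apple.com' in the path, query,
    or a longer hostname does not count."""
    host = (urlparse(url).hostname or "").lower()
    return host == "apple.com" or host.endswith(".apple.com")

print(looks_like_apple("https://www.apple.com/itunes"))             # True
print(looks_like_apple("http://apple.com.verify-account.example"))  # False
print(looks_like_apple("http://evil.example/apple.com/login"))      # False
```

Note how the second and third examples embed “apple.com” in the address yet fail the check, because the registered domain the browser actually connects to is not Apple’s.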
How Apple Users can Protect Themselves against Phishing Scams:
Users should verify email address aliases from Apple by checking the original sender address first. On a computer this can be done by mousing over the sender address field, which reveals the sender alias’ true email address. When using a mobile device, users should touch the email alias from the sender, which expands the alias to show the full address of the sender.
To guard against fraud attempts, Apple also provides a two-step authentication process for Apple IDs. This process involves sending a four-digit code to one or more previously selected devices belonging to the user. This serves as an additional verification and prevents undesired changes being made on the “My Apple ID” site or, for example, third parties making unauthorised purchases using your Apple ID.
Unfortunately, this does not yet prevent cybercriminals from using stolen credit card data. Users should not follow links in questionable emails to access websites. Instead, they should manually enter website addresses into browser windows. Users who still want to use such links should carefully check their content and the address of the website they link to. In addition, Mac users should use a security software package like Kaspersky Security for Mac as standard. This will protect Mac users in real-time against viruses, Trojans, spyware, phishing attempts and harmful websites, as well as preventing Macs from distributing Windows malware to friends and colleagues.
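Apple does not publish its implementation, but the two-step verification flow described earlier — a short code sent to a trusted device, then typed back by the user — can be sketched roughly as follows. The function names here (`issue_code`, `verify_code`) are hypothetical illustrations, not Apple APIs:

```python
import hmac
import secrets

def issue_code() -> str:
    """Generate a random four-digit verification code (zero-padded),
    as would be delivered out-of-band to a trusted device."""
    return f"{secrets.randbelow(10_000):04d}"

def verify_code(expected: str, submitted: str) -> bool:
    """Compare in constant time, so the check leaks nothing about
    how many digits of the submitted code were correct."""
    return hmac.compare_digest(expected, submitted)

code = issue_code()            # sent to the user's trusted device
print(verify_code(code, code)) # True: the user typed the code back correctly
```

The key design points are that the code is generated with a cryptographically secure random source and checked with a timing-safe comparison; a four-digit code is only safe because the server also limits the number of guesses.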
Prepare for deepfake impact
Is the world as we know it ready for the real impact of deepfake? CAREY VAN VLAANDEREN, CEO at ESET SA, digs deeper
Deepfake technology is rapidly becoming easier and quicker to use, and it’s opening a door into a new form of cybercrime. Although it’s still mostly seen as relatively harmless or even humorous, this craze could take a more sinister turn in the future and be at the heart of political scandals, cybercrime, or even unimaginable scams involving fake videos. And it won’t be just public figures that bear the brunt.
A deepfake uses artificial-intelligence-based human-image synthesis to create fake content, either from scratch or from existing video, designed to replicate the look and sound of a real person. Such videos can look incredibly real, and currently many of them involve celebrities or public figures saying something outrageous or untrue.
New research shows a huge increase in the creation of deepfake videos, with the number online almost doubling in the last nine months alone. Deepfakes are increasing in quality at a swift rate, too. A video showing Bill Hader morphing effortlessly between Tom Cruise and Seth Rogen is just one example of how authentic these videos are looking and sounding. Searching YouTube for the term ‘deepfake’ makes it clear we are seeing only the tip of the iceberg of what is to come.
In fact, we have already seen deepfake technology used for fraud, where a deepfaked voice was reportedly used to scam a CEO out of a large sum of cash. It is believed the CEO of an unnamed UK firm thought he was on the phone to his boss and followed the orders to immediately transfer €220,000 (roughly US$244,000) to a Hungarian supplier’s bank account. If it was this easy to influence someone by just asking them to do it over the phone, then surely we will need better security in place to mitigate this threat.
Fooling the naked eye
We have also seen apps such as DeepNude, which could turn a photo of any clothed person into a topless image in seconds. Although, luckily, that particular app has since been taken offline, what if it returns in another form with a vengeance, able to create convincingly authentic-looking video?
There is also evidence that the production of these videos is becoming a lucrative business especially in the pornography industry. The BBC says “96% of these videos are of female celebrities having their likenesses swapped into sexually explicit videos – without their knowledge or consent”.
A recent Californian bill has taken a leap of faith and made it illegal to create a pornographic deepfake of someone without their consent with a penalty of up to $150,000. But chances are that no legislation will be enough to deter some people from fabricating the videos.
To be sure, an article from The Economist notes that making a convincing deepfake requires a serious amount of video footage and/or voice recordings, even for a short clip.
Having said that, in the not-too-distant future it may be entirely possible to take just a few short Instagram stories and create a deepfake that is believed by the majority of the subject’s followers online, or by anyone else who knows them. We may see some unimaginable videos appearing of people closer to home: the boss, our colleagues, our peers, our family. Deepfakes may also be used for bullying in schools, the office or even further afield.
Furthermore, cybercriminals will certainly use such technology to spearphish victims. Deepfakes keep getting cheaper to create and are becoming near-impossible to detect with the human eye alone. All that fakery could easily muddy the water between fact and fiction, which in turn could lead us to trust nothing, even when our own senses tell us otherwise.
Heading off the very real threat
So, what can be done to prepare us for this threat? First, we need to better educate people that deepfakes exist, how they work and the potential damage they can cause. We will all need to learn to treat even the most realistic videos we see with the awareness that they could be a total fabrication.
Secondly, technology desperately needs better detection of deepfakes. Research is already going into it, but it’s nowhere near where it should be yet. Although machine learning is at the heart of creating deepfakes in the first place, something needs to act as the antidote, able to detect them without relying on human eyes alone.
Finally, social media platforms need to realise the huge potential threat posed by deepfakes: when a shocking video meets social media, it tends to spread very rapidly and could have a detrimental impact on society.
A career in data science – or your money back
The Explore Data Science Academy is offering courses in high-demand skills – and guarantees employment for trainees
The Explore Data Science Academy (EDSA) has announced several new courses in 2020 that it says will radically change the shape of data science education in South Africa.
Comprising Data Science, Data Engineering, Data Analytics and Machine Learning, each six-month course provides vital digital skills that are in high demand in the marketplace. The full-time, fully immersive courses each cost R60 000 including VAT.
The courses are differentiated from any other available by EDSA’s money-back promise: if it cannot place the candidate in a job within six months of graduation, at a minimum annual starting salary of R240 000, the fee is refunded.
“For South Africans with drive and aptitude, this is the perfect opportunity to launch a career in what has been called the sexiest career of the 21st century,” says Explore founder Shaun Dippnall.
Dippnall and his team are betting on the explosive demand for data science skills locally and globally.
“There is a massive supply-demand gap in the area of data science and our universities and colleges are struggling to keep up with the rapid growth and changing nature of specific digital skills being demanded by companies.
“We are specifically offering a work-ready opportunity in a highly skills-deficient sector, and one which guarantees employment thereafter.”
The latter is particularly pertinent to young South Africans – a segment which currently faces a 30 percent unemployment rate.
“If you have skills in either Data Science, Data Engineering, Data Analytics or Machine Learning, you will find work locally, even globally. We’re confident of that,” says Dippnall.
EDSA is part of the larger Explore organisation and has for the past two years offered young people an opportunity to be trained as data scientists and embark on careers in a fast-growing sector of the economy.
In its first year of operation, EDSA trained 100 learners as data scientists in a fully sponsored, full-time 12-month course. In year two, this number increased to 400.
“Because we are connected with hundreds of employers and have an excellent understanding of the skills they need, our current placement rate is over 90 percent of the students we’ve taught,” Dippnall says. “These learners can earn an average of R360 000 annually, hence our offer of your money back if there is no employment at a minimum annual salary of R240k within six months.
“With one of the highest youth unemployment rates in the world – recently announced as a national emergency by the President – it is important that institutions teach skills that are in demand and where learners can earn a healthy living afterwards.”
There are qualifying criteria, however. Candidates need to live within one hour’s commuting distance of either Johannesburg or Cape Town, or be prepared to relocate there, and need to be between the ages of 18 and 55.
“Our application process is very tough. We’ll test for aptitude and attitude using the qualifying framework we’ve built over the years. If you’re smart enough, you’ll be accepted,” says Dippnall.
To find out more, visit http://www.explore-datascience.net.