The convincing 'deepfake' video, however, is another reminder that seeing is no longer believing.
It is getting harder to spot so-called deepfake videos as the technology improves: software can map someone's face on to another person's body and, in some cases, synthesise their voice from previous recordings.
The internet is flooded with deepfake videos of political figures such as outgoing US President Donald Trump, Russian president Vladimir Putin and North Korean leader Kim Jong-un, and there are major concerns the technology will be used to spread disinformation in an increasingly polarised world.
However, the technology isn't just limited to politicians or heads of state. Celebrities are also regularly targeted in deepfake videos, some of which are just intended to be funny.
As always, there is also a much more sinister side: almost as soon as the technology was invented, people were using it to create deepfake porn, mapping well-known Hollywood faces on to the bodies of porn stars.
While it may be unsurprising that celebrities have found themselves the victims of deepfake porn, countless unsuspecting members of the public have also been targeted, and in many cases aren't even aware of it.
A free deepfake bot found on the Telegram messaging app has this year produced hundreds of thousands of images that make women appear naked. The source photos were uploaded without the targets' consent by people who took them from social media, private collections and the wider internet.
The pictures are shared and rated among more than 100,000 members of a page on the app.
A browser extension called DeepNude was launched in 2019 offering to create deepfake nudes of women for $50, but it was later shut down after a backlash.
There are also fears that deepfakes will increasingly be used by fraudsters. In 2020 numerous financial organisations hired technology services to help them detect and block identity-based cyber attacks, such as deepfake videos used to impersonate clients.
Legislators around the world are also starting to bring in laws to tackle deepfakes.
In Ireland, people making deepfake porn can now be prosecuted under revenge porn legislation, which was signed into law this week.
US states have also passed laws to tackle deepfakes. In November New York governor Andrew Cuomo signed a law protecting actors from sexually explicit deepfake material and from the exploitation of their name, image and voice both before and after their death.
UK broadcaster Channel 4, which produced the deepfake of the Queen, said it did so to give a stark warning about fake news in the digital age.
More than 200 people complained to the UK media watchdog Ofcom over the programme but Channel 4 defended it.
"It is very clear in the four-minute film that it is a parody of the Christmas Day address and viewers were left in no doubt that it was not real," a Channel 4 spokeswoman said.
"However, while the film is light-hearted, affectionate and comedic in tone, it carries a very important and timely message about trust and the ease with which convincing misinformation can be created and spread."