Originality.ai Review (2023): The Best AI Content Detector?

Originality.ai is a content analysis platform that offers an AI content detector and a plagiarism checker.

These days, AI writing is more common than ever. Thanks to the latest advancements in AI, you can produce professional, well-thought-out, and unique content in seconds, content that is nearly indistinguishable from human-written text.

But there's always a concern about the originality of that content. What's more, search engines don't favor AI-written content that brings no actual value to readers.

You might find it useful to try an AI content detector, like the one Originality.ai offers. It detects AI-written content with high accuracy, even when the content appears to be human-written.

This is a comprehensive review of Originality.ai’s AI content detector.

I ran extensive tests with both human-written and AI-written inputs to see whether the tool actually works. I also tried some cheap tricks to fool the AI detector.

💡 Make sure to read my comprehensive guide to the Best AI Content Detectors.

This article is supported by readers. If you make a purchase through one of the links, I may earn a small commission at no cost to you.

What Is the Originality.ai Content Detector?

Originality.ai is a content analysis tool that helps you detect whether your content:

  • Is written by AI.
  • Contains plagiarism.

These are both important aspects when it comes to using AI in your writing.

Even though AI-written content looks human-written these days, there are programs that can still detect whether content is written by AI or not. Originality.ai demonstrates this well.

Also, even though AI mostly produces unique and original content, there’s always a chance of plagiarism. This is why it makes sense to run AI-written content through a plagiarism checker to be on the safe side.

Let’s test Originality.ai to see how it detects AI-written and human-written content.

Performance

To get an idea of the performance of Originality.ai’s AI content detector, I fed the tool:

  • 10 human-written text samples.
  • 10 AI-written text samples.

Then I calculated the accuracy of the tool based on these inputs.

I also used some simple tricks to try to make the AI content detector think the AI-written content was human-written.

1. Human-Generated Content

As the first test, let’s try to analyze human-written content with Originality.ai.

To pull this off, I took 10 test samples from my blog posts. These pieces of content are 100% human-written. In an ideal world, the AI detector would score every sample 100% original and 0% AI.

Example 1

Here’s the input:

Output:

76% original—mission successful.

Example 2

Here’s the input:

And here's the output from the AI detector:

4% original—mission failed.

Example 3

Input:

Output:

2% original—mission failed.

Example 4

Input:

Output:

83% original—mission successful.

Example 5

Input:

Output:

8% original—mission failed.

Example 6

Input:

Output:

82% original—mission successful.

Example 7

Input:

Output:

23% original—mission failed.

Example 8

Input:

Output:

99% original—mission successful.

Example 9

Input:

Output:

0% original—mission failed.

Example 10

Input:

Output:

2% original—mission failed.

Based on these results, it's clear that Originality.ai isn't the best tool for recognizing human-written content.

In the above tests, Originality.ai correctly recognized only 4 of the 10 human-written text samples.

But this is not a big problem. The tool is designed to detect AI-written content. Most people will use it already knowing the content is AI-written, with the goal of making it sound less like AI.

2. AI-Written Content

As another test, let's feed Originality.ai some AI-written text samples. I generated these samples using ChatGPT.

In an ideal situation, Originality.ai would give a 0% original and 100% AI score. But detecting AI with 100% accuracy is hard.

For our purposes, let's count a mission as successful whenever the AI score is greater than the original score.

Example 1

Input:

Output:

100% AI written—mission successful.

Example 2

Input:

Output:

99% AI written—mission successful.

Example 3

Input:

Output:

61% AI written—mission successful.

Example 4

Input:

Output:

66% AI written—mission successful.

Example 5

Input:

Output:

1% AI written—mission failed.

Example 6

Input:

Output:

98% AI written—mission successful.

Example 7

Input:

Output:

100% AI written—mission successful.

Example 8

Input:

Output:

100% AI written—mission successful.

Example 9

Input:

Output:

70% AI written—mission successful.

Example 10

Input:

Output:

59% AI written—mission successful.

Pretty impressive! The AI detector was able to successfully identify 90% of the inputs as AI-written.
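In fact, you can reproduce both accuracy figures with a few lines of Python, using the per-sample scores reported in the two test rounds above (since the original and AI scores always sum to 100%, a sample is classified correctly whenever the score for its true class exceeds 50%):

```python
# "Original" scores for the 10 human-written samples (from the tests above).
human_original = [76, 4, 2, 83, 8, 82, 23, 99, 0, 2]

# "AI-written" scores for the 10 AI-generated samples.
ai_scores = [100, 99, 61, 66, 1, 98, 100, 100, 70, 59]

# A sample counts as correctly classified when the score for its
# true class is greater than 50% (i.e. it beats the other class's score).
human_accuracy = sum(s > 50 for s in human_original) / len(human_original)
ai_accuracy = sum(s > 50 for s in ai_scores) / len(ai_scores)

print(f"Human-written recognized: {human_accuracy:.0%}")  # 40%
print(f"AI-written recognized:    {ai_accuracy:.0%}")     # 90%
```

This makes the asymmetry plain: the detector is strong at flagging AI text but weak at clearing human text.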

Last but not least, let's see if we can fool the AI detector by making a small change and getting a big shift in the score.

Can You Fool Originality AI Content Detector?

I’ve tried a bunch of AI content detector tools. I’ve noticed that sometimes even the tiniest change in the input completely changes the score.

So to make 100% AI-generated content look 100% human-generated, it might be enough to remove a single letter or add one extra word.

Let's run some cheap tricks to see if this is also the case with Originality.ai.
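The perturbations used in the tests below are trivial to script. Here's a minimal sketch of the two text transforms involved; the helper names are my own, not part of Originality.ai or any other tool:

```python
def remove_comma(text: str, after_word: str) -> str:
    """Drop the first comma that follows a given word (e.g. 'Additionally,')."""
    return text.replace(after_word + ",", after_word, 1)


def make_typo(text: str, word: str, typo: str) -> str:
    """Replace the first occurrence of a word with a misspelling."""
    return text.replace(word, typo, 1)


sample = "Additionally, people often forget their keys."
print(remove_comma(sample, "Additionally"))   # drops the comma after "Additionally"
print(make_typo(sample, "their", "ther"))     # misspells "their" as "ther"
```

The point of scripting this is repeatability: you can apply the same tiny edit to every sample and check whether the detector's score moves.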

Test 1: Remove a Comma

➡️ TL;DR: Adding a small grammatical error did not significantly change the output of the Originality.ai AI detector.

As a first test, let's introduce a small grammatical error into the Originality.ai input. More specifically, I'll remove the comma that follows the word “Additionally”:

This didn’t change the originality score significantly. It seems the detector doesn’t care about small changes and can see the big picture.

Test 2: Make a Typo

➡️ TL;DR: Adding a small typo to the content did not significantly change the AI detector output.

Next, let’s try to misspell one of the words to see if such a small change would alter the AI detection score.

I'm going to intentionally misspell “their” as “ther” (dropping the “i”) and scan the content:

Once again, this only slightly moved the score. As expected, such a small change shouldn’t alter the score by much.

Test 3: Use an AI Paraphraser

➡️ TL;DR: Rephrasing the AI-written content did not fool the Originality.ai detector.

Based on a couple of tests, it seems small input changes can't meaningfully move the score.

Now, let’s try something more significant.

These days, you can also use AI to paraphrase your content. One example of such a rewording tool is called QuillBot.

Make sure to read my complete QuillBot review.

Here I’ve reworded one of my AI-generated inputs. With QuillBot, making this change took only a second.

Now, let’s try to input this AI-generated rephrased sample text into the Originality.ai detector to see what happens:

Amazing! It still recognizes the content to be AI-generated. Some other tools failed this test miserably.

Keep in mind that even though I ran quite a few tests, the sample size is still small. I highly recommend experimenting with tests like this yourself: change words, change punctuation, add a sentence, or remove one.

Plagiarism Checker

Notice that Originality.ai is not only an AI content detector. There’s a plagiarism checker too!

For example, here I've copy-pasted part of a blog post from my other blog.

The Originality.ai plagiarism score is correctly 100%: the sample I checked is indeed copied verbatim from an existing blog post.

Based on the few duplicate samples I ran through it, it seems to be a really powerful plagiarism checker.

If you've already tried Originality.ai, you've probably noticed that the plagiarism checker is automatically enabled when you detect AI content.

You can uncheck the plagiarism checker if you’re only interested in checking the content for AI.

Pros

  • Accurate. Originality.ai is a really accurate tool when it comes to detecting AI-written content. In my tests, it correctly spotted 90% of the AI-generated samples!
  • Detects plagiarism. There’s also a powerful plagiarism checker tool that you can use to ensure your writing is truly unique.
  • Hard to fool. You can't just change a word or a punctuation mark to trick the AI detector into giving a better score. This seems obvious, but it's not the case for many AI detector tools.

Cons

  • No free trial. Originality.ai is a paid tool without a free trial. Luckily, the pricing is affordable at $1 per 10,000 scanned words.
  • False positives. Originality.ai isn’t good at telling human-written content apart from AI-written text. Out of my 10 human-written samples, it claimed 6 to be AI-generated.

Final Verdict

I recommend experimenting with a tool like Originality.ai. It can help you make your content sound less AI-like and ensure you're not copying someone else's work.

Originality.ai works best when you already know that the input text is written by AI and you want to make it sound less like AI.

But if you’re given a random piece of text, you can’t rely on Originality.ai to tell whether it’s AI-written or not.

Unfortunately, an AI detector like Originality.ai (or any other publicly available tool) can't tell you whether Google considers your content AI-written. Google most likely uses a different approach.

So if you're a blogger and get a 100% original score from Originality.ai, that doesn't mean Google can't still tell your content is AI-written.

Read Also