






Currently, DeepSeek is fully accessible through OpenRouter in both CyberSEO Pro and RSS Retriever. We always recommend using OpenRouter to access any AI model. It's incredibly convenient because it allows you to use all the popular models (OpenAI, Anthropic, Google, DeepSeek, Grok, LLaMA, Mistral, etc.) with a single API key and unified billing. In addition, OpenRouter acts as a proxy, bypassing IP-based restrictions (many models are not available in certain countries) and removing limitations on the number of API calls (https://www.cyberseo.net/blog/openrouter-one-key-to-rule-them-all/).
In CyberSEO Pro and RSS Retriever, you can use DeepSeek simply by specifying the model ID in your shortcode. No additional plugin modifications are required as OpenRouter handles everything on its end.
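Just to illustrate what this looks like under the hood: the rough Python sketch below calls DeepSeek through OpenRouter's OpenAI-compatible endpoint with a single OpenRouter key. The model ID and prompt are only examples (check https://openrouter.ai/models for the current DeepSeek IDs), and the plugins do all of this for you - you only need to put the model ID in the shortcode.

```python
# Minimal sketch: calling DeepSeek through OpenRouter's OpenAI-compatible API.
# The model ID and prompt are illustrative; see https://openrouter.ai/models
# for the exact IDs currently offered.
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",  # OpenRouter endpoint
    api_key="YOUR_OPENROUTER_API_KEY",        # one key for every provider
)

response = client.chat.completions.create(
    model="deepseek/deepseek-r1",  # or any other OpenRouter model ID
    messages=[
        {"role": "user", "content": "Write a short HTML section about solar panels."},
    ],
)

print(response.choices[0].message.content)
```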
As for AI Autoblogger, the situation is a bit different. Models in AI Autoblogger are selected from a predefined list, and DeepSeek is not included at this time. This decision was made after testing both versions of the model:
- While DeepSeek R1 (full model) provides high-quality output, it is extremely slow, taking about 3 minutes to generate a single section of an article. This performance makes it unsuitable for autoblogging, where efficiency and speed are critical.
- The DeepSeek R1 Distill LLaMA 70B (distilled version) is faster, but falls short of OpenAI GPT-4o mini and Anthropic Claude Sonnet 3.5 in terms of output quality. It produces shorter texts and its output sometimes contains formatting errors. For example, even when explicitly instructed to output content in HTML, it may inject Markdown into the generated markup (see the sketch below), whereas we need consistently high-quality articles that do not require manual corrections.
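To give a concrete idea of the cleanup such formatting errors would force on us, here is a rough, purely illustrative Python check (not part of any plugin) that flags common Markdown markers in text that was requested as pure HTML:

```python
import re

# Rough illustration only: flag common Markdown markers in text that was
# requested as pure HTML. The patterns are simplistic and not exhaustive.
MARKDOWN_PATTERNS = [
    r"^#{1,6}\s",           # Markdown headings like "## Title"
    r"\*\*[^*]+\*\*",       # bold via **text**
    r"^\s*[-*]\s+\S",       # bulleted lists using - or *
    r"\[[^\]]+\]\([^)]+\)", # links like [text](url)
]

def has_markdown_leakage(html_text: str) -> bool:
    """Return True if the supposedly-HTML text contains Markdown syntax."""
    return any(re.search(p, html_text, flags=re.MULTILINE) for p in MARKDOWN_PATTERNS)

sample = "<p>Intro paragraph</p>\n## Benefits\n- **Cheap** to run"
print(has_markdown_leakage(sample))  # True: a heading, bold text and a list leaked through
```

Anything flagged like this would need manual editing, which defeats the purpose of autoblogging.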
At this point, neither the full DeepSeek R1 model nor the distilled version looks like a good fit for AI Autoblogger. The full version is too slow and more expensive than OpenAI's GPT-4o mini, while the distilled version doesn't yet meet the quality standards necessary for content generation. It's probably best to wait and see how DeepSeek develops before considering native integration with AI Autoblogger.
For now, I'd recommend using DeepSeek through OpenRouter in CyberSEO Pro and RSS Retriever if you want to experiment with this model.

You are welcome! Unfortunately, the DeepSeek model raises some questions for us. Please have a look at my post: https://www.cyberseo.net/forum/cyberseo-plugin/deepseek-integration/#p9911
In short, there is something shady about this model, and perhaps we are all in for some unpleasant discoveries. For example, I would not be surprised if the claim that the model was trained for only $5 million is fake. The cost seems implausibly low for a model of this size, unless they had access to pre-trained weights or used an external dataset generated by other AI models.
We don't have definitive proof yet, but if the model explicitly identifies itself as OpenAI's ChatGPT, there's a strong possibility that the developers behind DeepSeek used OpenAI-generated output as training data. However, there is also a more troubling possibility that the model may have been trained using illegally obtained OpenAI weights or code. Given that we have already seen several leaks of AI model weights over the past two years, including the suspected unauthorized use of Western AI technologies by some Chinese companies, this scenario cannot be ruled out.
The fact that this behavior no longer appears in DeepSeek R1 suggests that they became aware of the problem and corrected it. Whether this correction was made to fix an unintentional bug or to cover up something more questionable remains an open question. Either way, the initial behavior of DeepSeek V3 casts serious doubt on the true origins of this model.