Add DeepSeek v3
engels74
Since my Windsurf credits have run out, I tried Roo-Cline + DeepSeek V3 today to do some real work. I found it much better than DeepSeek V2.5 when I tested it, and it was very fast, even faster than Claude 3.5, which was beyond my expectations. I also saw that the weights are on Hugging Face, so it can even be self-hosted (rough download sketch below). I think it would be a very good choice to replace the current Cascade Base with DeepSeek-V3; even using its API costs only about 1/30 of Claude's price. The development team should really consider this model seriously, without prejudice.
https://huggingface.co/deepseek-ai/DeepSeek-V3
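For anyone who wants to self-host, a minimal sketch of just pulling the weights with `huggingface_hub` (download only; the full checkpoint is several hundred GB, and actually serving it needs a multi-GPU inference stack such as SGLang or vLLM):

```python
# Sketch: download the DeepSeek-V3 weights from Hugging Face for self-hosting.
# Assumes `huggingface_hub` is installed (pip install huggingface_hub) and that
# you have enough disk space -- the checkpoint is several hundred GB.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="deepseek-ai/DeepSeek-V3",
    local_dir="./DeepSeek-V3",  # where the weight files land
)
print(f"Weights downloaded to {local_dir}")
```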
k stallion
What is with people's weird obsession with DeepSeek V3? It's such a crappy model.
Bhavesh
I see these "please add xxxx" posts come up too often, but I think everyone is missing the point of what Codeium does: it actually provides additional context on the server side. I noticed this because it was using the latest API of a Python module that was released only 2-3 weeks ago. I am mostly using the base Cascade and it's doing well. The problem with DeepSeek is that you're expecting one-shot resolves.
It's OK to do that, but I prefer multi-shot to get things resolved, and your prompting needs to be good and detailed (rough sketch of the loop below).
I use Cline mostly for debugging tests and leave it running overnight on several branches to fix several features, then merge the next morning. It cost me $0.14 to fix around 10 complex unit tests that mock several functions, using DeepSeek or Gemma.
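To be clear about what I mean by multi-shot, here is a hypothetical minimal loop against an OpenAI-compatible endpoint. This is not the actual Cline/Roo-Cline internals, just the idea: run the tests, feed the failures back as another turn, and repeat.

```python
# Rough sketch of a multi-shot repair loop (hypothetical, not the Cline workflow):
# run the test suite, feed the failures back to the model, and repeat until green.
import subprocess
from openai import OpenAI

client = OpenAI(base_url="https://api.deepseek.com", api_key="YOUR_DEEPSEEK_KEY")
messages = [{"role": "system", "content": "You fix failing Python unit tests."}]

for attempt in range(5):  # cap the number of rounds
    result = subprocess.run(["pytest", "-x", "-q"], capture_output=True, text=True)
    if result.returncode == 0:
        print("All tests pass.")
        break
    # Multi-shot: append the failure output as a new turn instead of starting over.
    messages.append({
        "role": "user",
        "content": f"Tests failed:\n{result.stdout[-4000:]}\nSuggest a fix.",
    })
    reply = client.chat.completions.create(model="deepseek-chat", messages=messages)
    suggestion = reply.choices[0].message.content
    messages.append({"role": "assistant", "content": suggestion})
    print(f"--- attempt {attempt + 1} suggestion ---\n{suggestion}")
    # Applying the suggested patch is omitted here; Cline/Roo-Cline handle that part.
```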
Netanel Bezalel
And it is so easy to implement - it's exactly the same as the OpenAI API.
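For example, a minimal sketch with the standard OpenAI Python client pointed at DeepSeek's endpoint (assuming you have a DeepSeek API key; `deepseek-chat` is the hosted V3 model name in their docs):

```python
# Minimal sketch: the standard OpenAI client works as-is; you only swap the
# base URL and model name. Assumes a DeepSeek key in the DEEPSEEK_API_KEY env var.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.deepseek.com",
    api_key=os.environ["DEEPSEEK_API_KEY"],
)

resp = client.chat.completions.create(
    model="deepseek-chat",  # DeepSeek-V3 behind their hosted API
    messages=[{"role": "user", "content": "Write a Python function that reverses a string."}],
)
print(resp.choices[0].message.content)
```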
Mihai Cosma
I wonder if it would be possible to re-base Cascade onto DeepSeek as the base model instead of Llama.