Thanks! It's sad that this is the answer, but I understand. I wish someday we could access the base models of GPT-4 without the 'As an AI language model' guardrails!
That's not going to happen. But it's likely that StableLM 175B will rival GPT-4.
Also, you can finetune the base StableLM yourself on any consumer GPU with 8GB of VRAM in a couple of hours, and the result stays commercially licensed (using https://github.com/johnsmith0031/alpaca_lora_4bit).
You can even use the exact same dataset StabilityAI used. (Although there are better ones, with more GPT-4 data.)
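If you'd rather stay on the mainstream Hugging Face stack instead of that repo, the rough shape of a 4-bit LoRA finetune looks something like this. It's just a sketch: the checkpoint name, the dataset, and the hyperparameters are placeholders I picked for illustration, not what StabilityAI actually used.

```python
# Minimal 4-bit LoRA finetuning sketch with transformers + peft + bitsandbytes.
# Model, dataset, and hyperparameters below are illustrative assumptions.
import torch
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          BitsAndBytesConfig, Trainer, TrainingArguments)
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

model_id = "stabilityai/stablelm-base-alpha-7b"  # assumed base checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
tokenizer.pad_token = tokenizer.eos_token

# Load the base model quantized to 4-bit so it fits in ~8 GB of VRAM.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=BitsAndBytesConfig(
        load_in_4bit=True, bnb_4bit_compute_dtype=torch.float16),
    device_map="auto",
)
model = prepare_model_for_kbit_training(model)

# Attach small LoRA adapters; only these weights get trained.
model = get_peft_model(model, LoraConfig(
    r=8, lora_alpha=16, lora_dropout=0.05,
    target_modules=["query_key_value"],  # attention projection in GPT-NeoX-style models
    task_type="CAUSAL_LM"))

# An Alpaca-style instruction dataset -- swap in whichever instruct set you prefer.
data = load_dataset("tatsu-lab/alpaca", split="train")

def tokenize(example):
    # Concatenate instruction and response into one training sequence.
    text = f"{example['instruction']}\n{example['output']}"
    out = tokenizer(text, truncation=True, max_length=512, padding="max_length")
    out["labels"] = out["input_ids"].copy()
    return out

data = data.map(tokenize, remove_columns=data.column_names)

Trainer(
    model=model,
    train_dataset=data,
    args=TrainingArguments(
        output_dir="stablelm-lora",
        per_device_train_batch_size=1,
        gradient_accumulation_steps=8,
        num_train_epochs=1,
        learning_rate=2e-4,
        fp16=True,
        logging_steps=50,
    ),
).train()

model.save_pretrained("stablelm-lora")  # saves only the small LoRA adapter weights
```

The point is that only the adapter weights get trained and saved, which is why it fits on a single 8GB card; you merge or load them on top of the base model at inference time.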