There is a new AI text-to-video model out: give it a text prompt and it generates a short video clip. It does an OK job, but it clearly needs a lot more work before it can produce HD video; this is only the beginning. The online demo is also rarely available when I try to use it, presumably because inference at this scale needs a massive server farm of NVIDIA GPUs. An NVIDIA A100 Tensor Core GPU would be ideal for generating quality video, and the 80 GB model looks like the best value for high-resolution output, since generating video is far more compute- and memory-intensive than generating a single image. A home user with one or more very powerful GPUs could try running it locally.
See it here: https://huggingface.co/spaces/damo-vilab/modelscope-text-to-video-synthesis.
There is also a local version you can set up at home: https://github.com/deforum-art/sd-webui-modelscope-text2video
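If you would rather skip the web UI, the model can also be driven directly from Python with Hugging Face's diffusers library. This is a hedged sketch, not a tested recipe: it assumes you have diffusers, transformers, accelerate, and torch installed, that the checkpoint behind the demo is `damo-vilab/text-to-video-ms-1.7b`, and that roughly 16 GB of VRAM is enough for fp16 inference without offloading. The `load_plan` helper is my own illustration, not part of any library.

```python
# Sketch: ModelScope text-to-video via Hugging Face diffusers.
# Assumes: pip install diffusers transformers accelerate torch
# The model id and the 16 GiB VRAM threshold are assumptions.

def load_plan(vram_gib: float) -> dict:
    """Rough heuristic for loading options: fp16 weights always,
    CPU offload when the card has less than ~16 GiB of VRAM."""
    return {
        "dtype": "float16",
        "variant": "fp16",
        "cpu_offload": vram_gib < 16,
    }

if __name__ == "__main__":
    import torch
    from diffusers import DiffusionPipeline
    from diffusers.utils import export_to_video

    plan = load_plan(vram_gib=16.0)  # e.g. a Tesla T4 has ~15 GiB
    pipe = DiffusionPipeline.from_pretrained(
        "damo-vilab/text-to-video-ms-1.7b",
        torch_dtype=torch.float16 if plan["dtype"] == "float16" else torch.float32,
        variant=plan["variant"],
    )
    if plan["cpu_offload"]:
        pipe.enable_model_cpu_offload()  # trade speed for lower VRAM use
    else:
        pipe = pipe.to("cuda")

    result = pipe("a panda surfing a wave", num_frames=16)
    video_path = export_to_video(result.frames)  # writes an .mp4, returns its path
    print(video_path)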
See the nvidia-smi output below.
Tue Mar 21 14:35:47 2023
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 525.85.12    Driver Version: 525.85.12    CUDA Version: 12.0     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  Tesla T4            Off  | 00000000:00:04.0 Off |                    0 |
| N/A   69C    P0    31W /  70W |      0MiB / 15360MiB |      0%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+

+-----------------------------------------------------------------------------+
| Processes:                                                                  |
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|        ID   ID                                                   Usage      |
|=============================================================================|
|  No running processes found                                                 |
+-----------------------------------------------------------------------------+
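Before kicking off a render it is worth checking how much VRAM is actually free. A quick-and-dirty way, without touching any NVIDIA API, is to parse the Memory-Usage column of nvidia-smi output like the dump above. This is a sketch that assumes the standard "used MiB / total MiB" format; the function names are my own.

```python
# Sketch: pull "used / total" VRAM figures out of nvidia-smi text output.
# Assumes the standard "0MiB / 15360MiB" Memory-Usage column format.
import re


def parse_vram(smi_text: str) -> list[tuple[int, int]]:
    """Return (used_mib, total_mib) for each GPU found in nvidia-smi output."""
    return [
        (int(used), int(total))
        for used, total in re.findall(r"(\d+)MiB\s*/\s*(\d+)MiB", smi_text)
    ]


def fits(smi_text: str, needed_mib: int) -> bool:
    """True if any GPU listed has at least needed_mib of free memory."""
    return any(total - used >= needed_mib
               for used, total in parse_vram(smi_text))
```

On the T4 dump above, `parse_vram` returns `[(0, 15360)]`, so a checkpoint needing around 13 GB of VRAM would just squeeze in, while anything over 15 GB would not.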
You can get an NVIDIA Tesla T4 for around $5,000, which is far more affordable than an A100. It might work reasonably well, especially if coupled with a 4090.