TL;DR: looking for solutions to reduce CPU load on single-module patches. Currently at 48/512.
Hi, I purchased the MetaModule right when it came out, and it was one of the initial reasons I got into Eurorack. I immediately ran into CPU issues just trying to run single-module patches (back then it was the Clouds and Rings clones), and figured it was just because it was still very early in development, so I shelved it for months.
I broke it out again in early summer and was still having the same issues: a single module running at 48/512 overloading the CPU with just a minimal amount of external modulation.
After reading about all the new updates and everyone’s seemingly very stable patches, I decided to break it out again, and even 3D printed a case just for it and the Wi-Fi expander, figuring the CPU load issue would be a problem of the past. But I just loaded a patch containing the Black Noise Cosmos, Tides, and the CV funk Hammer clock, and I’m hitting max CPU again. I don’t get it; the example patches people are uploading are much more intense, and I really feel like I should be able to run just about any supported module at 48/512 without maxing out.
My very long-winded/roundabout question: can anyone think of anything that would cause this, or any solutions? Does completely wiping the module help? Any other suggestions? Is it possible the initial batches were built on slower hardware?
CPU load is highly dependent on the modules used and how they are patched (e.g. modulating some CV inputs/parameters may induce more load).
You can get an idea of heavy/light modules from the CPU load spreadsheet.
Also, I see you upped the block size; this will often not reduce CPU load by a huge amount. Basically, block size reduces overhead a bit and reduces per-block calculations, but that is very module dependent. At 48k the module still has to produce 48,000 samples per second, regardless of block size.
Increasing it tends to be better at ‘smoothing out’ CPU load, i.e. it gets rid of periodic glitches (aka overruns).
(Of course, reducing the sample rate will often dramatically reduce CPU load, at the cost of fidelity.)
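To make the block-size point concrete, here’s a back-of-envelope sketch. All the numbers (per-sample cost, per-block overhead) are made-up illustrative assumptions, not measured MetaModule figures; the point is only that the per-sample term scales with sample rate, while only the overhead term shrinks with bigger blocks.

```python
def cpu_load_percent(sample_rate, block_size,
                     per_sample_us=8.0, per_block_us=50.0):
    """Toy model of audio-thread CPU load.

    Total work per second = (per-sample DSP cost, fixed by sample rate)
                          + (per-block overhead, amortised by block size).
    per_sample_us / per_block_us are invented example costs.
    """
    blocks_per_second = sample_rate / block_size
    busy_us_per_second = (sample_rate * per_sample_us
                          + blocks_per_second * per_block_us)
    return busy_us_per_second / 1_000_000 * 100  # % of one second busy

# Quadrupling the block size twice barely moves the total,
# because the dominant per-sample term never changes.
for block in (32, 128, 512):
    print(block, round(cpu_load_percent(48_000, block), 1))
```

With these made-up costs the load only drops from ~46% at a block of 32 to ~39% at 512, whereas halving the sample rate would roughly halve the dominant term, which matches the advice above.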
tl;dr: there is no systemic bug or hardware issue.
Of course, that doesn’t mean optimisation isn’t possible, but even then I’d be amazed if it yielded huge improvements.
For me, I view this more as ‘working within the constraints’ of the hardware.
We have a huge choice of modules; some use more CPU, some less. With the right module selection a huge amount is possible, often with little or no compromise in results.
It’s different from creating patches on a Mac/PC, where we can throw anything and everything at it, as they have so much computational power.
The OG Mutable Instruments Clouds topped out at 32kHz; if you are using the Meta as a single-module emulator, perhaps reducing its sample rate to match will overcome your problem.
I don’t know of any possible explanation, but that doesn’t seem right.
I just loaded up a saved patch that has 2 Resonators, 2 Granulators (= Rings and Clouds), 1 QPLFO, 1 QCD, and a couple of utilities.
At 48/128 it is showing 74%.
It sounds weird to me too. I think you should post a simple patch that overloads for you, let us know what modulation you’re using when it overloads, and see if other people get the same result.
Under advice from etcetc, I also checked the screen connector because I was having intermittent screen blackouts. So I reseated that connector, wiped my SD and USB drives, and reinstalled the newest firmware and plugin set, and it seems like everything is running well now; I’m able to run patches without overloading the CPU. Not sure why the screen connector would cause the issue, or if it was something going on with the previous SD/USB firmware or plugin set, but I’m happy it’s working now. Thanks for all your suggestions and help.