Started making my own modules with AI.
Hope you find these modules useful.
The VCV version is also on my GitHub, but only the Mac ARM64 build for now; a PC version is coming soon.
Thanks.
MAD.
Thanks - can you share any details about what these are?
9 modules, from the GitHub manuals. I have not worked out what "Observer" does yet, with 8 track inputs and time and trigger parameters.
Brand | Module | Description
---|---|---
Madzine | ADGenerator | Three-track attack-decay envelope generator with dual processing modes
Madzine | Euclidean Rhythm | Euclidean rhythm module
Madzine | Maddy | Integrated sequencer combining swing clock generation, a 3-track Euclidean rhythm generator, and pattern-based CV sequencing
Madzine | Observer | 
Madzine | PPaTTTerning | Pattern-based CV sequencer with style and density control
Madzine | Pinpple | Ping filter synthesizer with dynamic FM modulation capabilities
Madzine | Q_Q | Three-track simplified envelope generator with real-time oscilloscope display
Madzine | SwingLFO | Dual-phase LFO with swing and shape control
Madzine | TWNC | Dual-track Euclidean rhythm generator for techno sequences
Sure, my modules are:
SwingLFO: LFO with swing mix, pulse width, and saw shape functions
Euclidean Rhythm: 3-channel Euclidean sequencer with per-channel clock div/mult and chain output (see the sketch after this list)
ADGenerator: 3-channel triggered AD env, or 3-channel follower with BPF
Pinpple: a drum sound based on Ripples with built-in noise and LPG-ish FM
PPaTTTerning: 5-step seq with different algorithms to create long patterns
MADDY: swing LFO + Euclidean + patterning
TWNC: 2 drum-sound synth with Euclidean seq and env outputs
Q_Q: S-curve decay trigger env designed for drum sounds
Observer: 8-channel scope
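For the Euclidean modules, the pattern generation is basically the standard "spread N hits as evenly as possible over M steps" idea. A minimal sketch of one common way to do it (a simple bucket accumulator; not necessarily how my modules implement it):

```cpp
#include <vector>

// Distribute `pulses` hits as evenly as possible across `steps` slots.
// Simple "bucket" accumulator method; it produces rotations of the
// classic Bjorklund/Euclidean patterns.
std::vector<bool> euclideanPattern(int steps, int pulses) {
    std::vector<bool> pattern(steps, false);
    if (steps <= 0 || pulses <= 0)
        return pattern;
    int bucket = 0;
    for (int i = 0; i < steps; i++) {
        bucket += pulses;
        if (bucket >= steps) {
            bucket -= steps;
            pattern[i] = true;   // a hit lands on this step
        }
    }
    return pattern;
}
```

For example, euclideanPattern(16, 5) places 5 hits roughly every 3 steps across a 16-step bar.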
Thanking you with at least 20 characters.
Thanks - looking good there!
Thanks for posting them. Which AI tools did you use to create them?
I use Claude. It's amazing because I have zero coding knowledge.
updated:
Removed oversampling function in TWNC
I'm pretty interested in AI tools, so I had a quick play with https://claude.ai last night.
It's pretty good for VCV Rack; I'm quite surprised that it's been trained on VCV Rack code. I'd have thought this was quite 'niche'.
As an experiment, I tried to get it to create code for another Eurorack module where I supplied the (much less popular) API, and frankly its results were dreadful, even though I gave it a repository (mine) with a huge number of examples.
It just seemed to take a very shallow look at the code, then made huge assumptions.
So obviously VCV Rack code (and, I noticed, VSTs) has had some deeper training done on the code base/documentation.
I also tried ChatGPT and got quite similar results to Claude, albeit with a slightly different approach to its 'first attempt': it starts simple and makes suggestions about what you could add iteratively.
Note: in both cases I didn't compile the code, I just did a quick code inspection, and the code looked viable.
In my experience AI usually generates compilable code, but it can be a bit buggy in edge cases, or rather 'unexpected' ones.
Anyway, it's kind of interesting, as a lot of VCV Rack code is pretty much 'boilerplate' stuff, so I can see why it works.
Intriguing. As a dev, I'm still playing with how AI can be used as a tool to augment the development process.
As an aside, perhaps others might want to share more details on how they do this, e.g. the queries you use, and also your general experience.
Note: I'm leaving aside the ethics of AI here, given a lot of musicians have deep concerns about AI creations being generated off the back of others' work, and it's not too different in the programming space.
I would be very interested in this kind of walkthrough!
Let me try to put it simply.
So I know zero coding. I gave Claude the VCV SDK site and asked it to go through it. Mostly everything happens in the terminal. Usually the AI couldn't achieve my goal, so I needed to find some references.
For example, my SwingLFO is actually one LFO creating two phases and mixing them together. My ADGenerator and the clock divider in Euclidean Rhythm are based on the DHE Stage curve and the Bogaudio RGate.
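Roughly, the two-phase idea looks something like this (a simplified sketch, assuming a plain sine shape and 0-1 swing/mix controls; not the actual SwingLFO code):

```cpp
#include <cmath>

// One phase accumulator, a second phase offset by the swing amount,
// and a crossfade between the two outputs.
struct TwoPhaseLfo {
    float phase = 0.f;

    // freq in Hz; swing and mix are 0..1 controls.
    float process(float sampleTime, float freq, float swing, float mix) {
        phase += freq * sampleTime;
        if (phase >= 1.f)
            phase -= 1.f;

        float phaseB = phase + 0.5f * swing;   // second phase, shifted by swing
        if (phaseB >= 1.f)
            phaseB -= 1.f;

        float a = std::sin(2.f * M_PI * phase);
        float b = std::sin(2.f * M_PI * phaseB);
        return (1.f - mix) * a + mix * b;      // mix the two phases together
    }
};
```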
It always works like this: I start with one simple function, debug until that function works, then add the next function. The thing is, you need to know which module has the reference you need and where things go wrong.
For example, I always need to tell it things like:
"The knob moves too slowly."
"I need to adjust the knob range; tell me where and how I should change the code."
"Can you explain how this knob works?" (This is very useful for checking that we are thinking about the same thing.)
BTW, I fixed my MADDY seq this morning, please try it out, thanks.
Interesting…
So I noticed both ChatGPT and Claude know about the VCV SDK; they don't need a reference for this.
So you can pretty much start with…
"Create me a VCV Rack module that implements a sine oscillator"
(and it seems to give valid code)
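For reference, the kind of skeleton that prompt produces looks roughly like this (a hand-written sketch against the Rack v2 API, not actual AI output; a real plugin also needs the model registration, a panel widget, and plugin.json):

```cpp
#include "plugin.hpp"  // standard Rack plugin template header

// Minimal sine oscillator: one frequency knob, one output.
struct SineOsc : Module {
    enum ParamId  { FREQ_PARAM, PARAMS_LEN };
    enum InputId  { INPUTS_LEN };
    enum OutputId { SINE_OUTPUT, OUTPUTS_LEN };
    enum LightId  { LIGHTS_LEN };

    float phase = 0.f;

    SineOsc() {
        config(PARAMS_LEN, INPUTS_LEN, OUTPUTS_LEN, LIGHTS_LEN);
        // ±3 octaves around C4, displayed in Hz
        configParam(FREQ_PARAM, -3.f, 3.f, 0.f, "Frequency", " Hz", 2.f, dsp::FREQ_C4);
    }

    void process(const ProcessArgs& args) override {
        // Exponential (1 V/oct style) pitch from the knob
        float freq = dsp::FREQ_C4 * std::pow(2.f, params[FREQ_PARAM].getValue());
        phase += freq * args.sampleTime;
        if (phase >= 1.f)
            phase -= 1.f;
        // ±5 V audio output, the usual Rack convention
        outputs[SINE_OUTPUT].setVoltage(5.f * std::sin(2.f * M_PI * phase));
    }
};
```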
Also, from previous experimenting with ChatGPT, I can say they are pretty well versed in DSP theory and how to implement algorithms.
One thing I noticed that's different about ChatGPT vs Claude:
Claude seemed to add more 'features' out of the box, i.e. functions you might expect on such a module but that could be considered 'optional',
whereas ChatGPT gives you something simple and just talks about features you might want to add.
However, being AI, you can then tell it to add the features.
What's nice here is that it describes the changes made and why; it's trying to be 'educational'.
This seems similar to how you describe using Claude.
Note: I only tried a few things, so this may or may not always be the case.
I will say the VCV Rack SDK is pretty easy to use, so I'm sure that after a few example changes explained by AI, people will quickly understand what's going on.
If anyone gets 'into this', you might also want to look into VS Code and Copilot.
VS Code is an easy-to-use development environment, and Copilot is an AI extension which generates/inspects code.
Given how well ChatGPT/Claude are doing, I'm pretty sure Copilot would do well too.
Its main advantage is that it works on code directly on your computer, so the workflow is more streamlined; it also takes into account any edits you make locally.
Anyway, very interesting to hear how a non-coder is using this stuff. Quite an exciting time.
P.S. I don't know if you'd prefer this discussion as a separate topic or not; if so, I'm sure Dann can split it for you. Your decision, your plugins.
I'm actually surprised people are into this topic, thanks for the replies.
I've actually only tried Claude for coding. I just like that it's usually humble. And yes, Claude did give me advice. For example, in my module 'PPaTTTerning', the step spread function 'Density' and the Jump mode were Claude's ideas.
The downside to AI coding, I think, is that it can't 'see' things. So something like UI layout takes many hours to adjust. I also remember it took me 3 hours to adjust the S-curve in my Q_Q envelope.
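The S shape itself can be as simple as running a linear decay through a smoothstep-style curve; a rough sketch of that idea (not the actual Q_Q code):

```cpp
#include <cmath>

// S-curve decay: linear 1 -> 0 ramp over `decayTime` seconds,
// reshaped by a smoothstep so it eases out of 1 and into 0.
float sCurveDecay(float t, float decayTime) {
    if (decayTime <= 0.f)
        return 0.f;
    float x = 1.f - t / decayTime;            // linear decay from 1 to 0
    x = std::fmax(0.f, std::fmin(1.f, x));    // clamp to [0, 1]
    return x * x * (3.f - 2.f * x);           // smoothstep gives the S shape
}
```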
Here or a separate topic, I'm fine with either.
I think AI coding of modules could be groundbreaking with respect to the development of modules for the MM. Perhaps @danngreen can think about adding a category to the forum dedicated to the topic. As with all things AI, it can be used for good (or not so good) outcomes. It would be nice to have some reasonable standards in place for the development or publishing of AI plug-ins, including making sure they are relatively 'ready for prime time' when published for public use and that documentation is provided regarding each plug-in's features and use. Perhaps all of that can be discussed and developed by the community in a dedicated section of the forum.
I did see that Andrew Belt has issued a strongly worded statement on AI in the VCV Community Forum that folks should be mindful of if they are publishing plugins in VCV (Generative AI content (text, audio, images, video) is banned - Forum Feedback - VCV Community).
Iām excited to see what happens in the AI space with respect to plug-in development, and if some reasonable standards are put in place it could be a very exciting development with respect to the use cases for the MM.
Cool, Iāll try these out!
Re: using AI: To be on our site, all plugins must meet minimum standards outlined in the SDK docs. Basically, they must have accurate metadata, meet licensing requirements, and must pass the CPU test without crashing (being run and timed at various block sizes). I'm thinking of adding some criteria about avoiding generic names for jacks/controls… and also extending the CPU load tests to test more param positions, but for now that's what we have.
There's also a tacit standard for CPU load numbers, e.g. if your module is similar to another existing module but yours uses 2x the CPU, then it's not likely to show up in many patches. I'd be curious to see how AI does with being asked to optimize (it might need to know about the hardware architecture, e.g. Cortex-A7 with NEON extensions).
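As an example of the kind of change an "optimize this" prompt tends to suggest for a small ARM core: swapping std::sin in the audio loop for a cheap polynomial approximation (a generic sketch, not a specific MetaModule recommendation; whether the accuracy trade-off is acceptable depends on the module):

```cpp
#include <cmath>

// Parabolic sine approximation, valid for x in [-pi, pi];
// callers must wrap the phase into that range first.
// With the correction pass the error is roughly 0.1% of full scale,
// usually fine for LFOs and simple voices.
inline float fastSin(float x) {
    const float B = 4.f / M_PI;
    const float C = -4.f / (M_PI * M_PI);
    float y = B * x + C * x * std::fabs(x);
    const float P = 0.225f;                   // correction factor
    return P * (y * std::fabs(y) - y) + y;
}
```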
Anyway, if a module meets those criteria then as far as we're concerned, it doesn't matter if AI helped or not, or if you copy/pasted code from Stack Overflow or from another module (assuming of course you have permission/license to use the code you used). I do agree with the sentiment of Andrew Belt's quote:
… though at least at this point, I don't take it so far as to officially ban unreviewed code. Regardless of where the code came from (even something you wrote yourself years ago), you should always review it before using it. That's just good practice.
For sure AI coding tools are great learning tools, and congrats @mmmmmmmadman for sticking with the process to get something working.
I would also love to see an in-depth hand-holding walkthrough on creating a module.
This aligns with one of the missions of the MetaModule: to make it easy for a wide range of people to create their own custom music-making hardware. Even if it's just for personal use, for someone with no/little coding experience to make their own custom instrument is pretty exciting.
updated:
Now my modules can run on Mac ARM64, Mac x64, and Windows x64 VCV Rack.
And if anyone in Berlin feels like trying to make a module with AI with me, I'd be happy to try and film a video to share.
updated:
TWNC CV in and MADDY clock source repaired.
I would also love to see an in-depth hand-holding walkthrough on creating a module.
2nd! I too would love to see a 'zero to hero' set of tutorials showing how to create modules for the MM.