
Photoshop combined with Firefly is exactly the rare kind of good tooling I'm talking about. In/outpainting proved to work for creatives in practice, and so it got added to Photoshop.

>The idea that somehow “AI isn’t art directable” is one I keep hearing, but I remain unconvinced this is somehow an unsolvable problem.

That's not my point. AI can be perfectly directable and usable, just not in the specific form DE3/MJ offer it. Text prompts alone don't carry enough semantic information to direct the output precisely, and the tools they do provide (img2img, basic in/outpainting) aren't enough for production work.
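
For context, this is roughly what "basic inpainting" amounts to: a text prompt plus a mask, and nothing else to steer the result. A minimal sketch with the diffusers library (model name and file paths are illustrative placeholders, not anyone's actual pipeline):

  # Mask-based inpainting: the only controls are a text prompt and a binary mask.
  import torch
  from PIL import Image
  from diffusers import StableDiffusionInpaintPipeline

  pipe = StableDiffusionInpaintPipeline.from_pretrained(
      "runwayml/stable-diffusion-inpainting", torch_dtype=torch.float16
  ).to("cuda")

  init = Image.open("scene.png").convert("RGB")   # source image (placeholder path)
  mask = Image.open("mask.png").convert("RGB")    # white pixels = region to repaint

  result = pipe(prompt="a red vintage car parked by the curb",
                image=init, mask_image=mask).images[0]
  result.save("scene_inpainted.png")

Everything about the repainted region rides on that one prompt string, which is exactly the semantic bottleneck.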

In contrast, Stable Diffusion has a myriad of non-textual tools around it right now - style/concept/object transfer of all sorts, live painting, skeleton-based character posing, neural rendering, conceptual sliders that can be created at will, lighting control, video rotoscoping, etc. And plugins for existing digital painting and 3D software leveraging all this witchcraft.
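
To make one of those concrete, skeleton-based posing is typically done by feeding a pose map through a ControlNet alongside the prompt. A rough sketch with diffusers and controlnet_aux (model IDs and the reference image are assumptions for illustration):

  # Skeleton-based character posing: an OpenPose skeleton extracted from a
  # reference image conditions the diffusion model, independently of the prompt.
  import torch
  from PIL import Image
  from controlnet_aux import OpenposeDetector
  from diffusers import ControlNetModel, StableDiffusionControlNetPipeline

  openpose = OpenposeDetector.from_pretrained("lllyasviel/ControlNet")
  pose_map = openpose(Image.open("reference_pose.png"))  # stick-figure skeleton image

  controlnet = ControlNetModel.from_pretrained(
      "lllyasviel/sd-controlnet-openpose", torch_dtype=torch.float16
  )
  pipe = StableDiffusionControlNetPipeline.from_pretrained(
      "runwayml/stable-diffusion-v1-5", controlnet=controlnet, torch_dtype=torch.float16
  ).to("cuda")

  image = pipe("a knight in ornate armor, studio lighting",
               image=pose_map, num_inference_steps=25).images[0]
  image.save("posed_knight.png")

The pose comes from the skeleton image, not from the words, which is the kind of directability a text prompt alone can't express.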

All this is extremely experimental and janky right now. It will be figured out in the upcoming years, though. (if only the community's brains weren't deep-fried by porn...) This is exactly the sort of tooling the industry needs to get shit done.



Ah ok, yes, I agree. How many years is really the million-dollar question. I've begun to act as if it's around 5 years, and sometimes I think I'm being too conservative.



