It looks more like it's re-skinning a room/space rather than trying different design ideas - all the elements remain where they are, but are restyled.
I think a big contribution of an interior designer is their ability to see potential in a space and reorganize it in a way that works better for a purpose / feels better to be in. This, in comparison, seems "skin deep."
But part of the value of a designer is they’d know what prompts to even give. They’d understand your needs, price points, and walk you through a space of directions.
I wonder if this is better suited as a tool for designers themselves to quickly create drafts to demonstrate their direction. But relying only on Stable Diffusion makes me worry that the kinds of products they'd want to leverage will be limited.
This looks to be Stable Diffusion with ControlNet. Since many of the original lines are kept, it's probably the Canny edge or HED preprocessor. That's why it looks so similar to the original room.
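For anyone curious, a rough sketch of what that kind of pipeline looks like: extract an edge map from the room photo, then condition generation on it so the structural lines survive. The helper below uses PIL's simple edge filter as a lightweight stand-in for the real Canny/HED preprocessors, and the model names in the comments are illustrative assumptions, not confirmed details of this app.

```python
import numpy as np
from PIL import Image, ImageFilter

def edge_control_image(photo: Image.Image, threshold: int = 64) -> Image.Image:
    """Binary edge map as a 3-channel conditioning image -- a lightweight
    stand-in for the Canny/HED preprocessors ControlNet pipelines use."""
    edges = np.array(photo.convert("L").filter(ImageFilter.FIND_EDGES))
    edges = np.where(edges > threshold, 255, 0).astype(np.uint8)
    return Image.fromarray(np.stack([edges] * 3, axis=-1))

# The generation step (too heavy to run here) would look roughly like:
#   from diffusers import ControlNetModel, StableDiffusionControlNetPipeline
#   controlnet = ControlNetModel.from_pretrained("lllyasviel/sd-controlnet-canny")
#   pipe = StableDiffusionControlNetPipeline.from_pretrained(
#       "runwayml/stable-diffusion-v1-5", controlnet=controlnet)
#   restyled = pipe("a scandinavian living room, photorealistic",
#                   image=edge_control_image(room_photo)).images[0]
```

Because the conditioning image pins down every strong edge, the output keeps the room's geometry and only the surfaces get restyled, which matches the "re-skinning" effect people describe above.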
I love how people will a) criticize OpenAI/etc. for restricting access to models and "aligning" their output, and b) immediately leap to shaming models that aren't "aligned".
You cannot just add a new window like that, or change the view you have. Also, the generated furniture doesn't actually exist, so how are you supposed to buy it?
Take the designs to a craftsman? If AI inspires a resurgence in craftspeople using their skills to create truly bespoke pieces, that'd be a very good thing.
For this app in particular, sure. But in general I expect a massive uptick in people playing around with telling image AIs "make a modern-looking table and chairs with sharp geometric designs and...", refining the concept, and taking that as an inspiration portfolio to a craftsman. I wouldn't be interested in paying a designer just to come up with ideas that I may not even like, but if I have something I know I like and a designer/craftsman can refine and realize it, that's a compelling proposition.
> I wouldn't be interested in paying a designer to just come up with ideas that I may not even like
The benefit of having a designer is that you can have input at every stage of the process. Pick some swatches out, pick some styles, pick a budget. You aren't just throwing money into the void and getting a human designed living space. Iteration is part of the process.
I'm sure AI will evolve and this will be a different conversation, of course.
Just because a process is iterative doesn't mean the result will be ideal, or even satisfactory. You're limited by the capabilities of the individual designer, and you'd better believe a decent one will want payment even if you sit down for a few hours and never arrive at a piece that really speaks to you.
If you can start the conversation with a tangible pictorial representation of what you want the end product to be, you're much better off.
Cool way to imagine a room in a different way, and get ideas for a mood board you can hand to a designer. Of course, most of the stuff it imagines does not actually exist, and could not be purchased off the shelf. You'd need a lot of custom cabinetry and furniture work, next to which hiring a designer in the first place would probably not be much of an expense.
It'd be nice if there was a way to say what is and is not changeable.
For example, I noticed it usually upgrades the view outside the windows to a beautiful vista, which is not something you can do without moving. Being able to request that other things not change, like the floor plan or the flooring, would be nice.
I like how the AI will randomly turn furniture into a dog, as if to say, "improve this room with a furry friend." :)
Please do! Adding in a Segment Anything Model UI to the app would be great. You can then click on the items you want removed and use the segments as inpainting masks for the generative model.
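To sketch the click-to-remove idea: Segment Anything returns a boolean mask for the clicked object, which can be dilated slightly (so the inpainting model can blend the borders) and converted into the white-on-black mask image Stable Diffusion inpainting expects. The helper below is hypothetical glue; the SAM and diffusers calls in the comments are the real libraries' entry points, but how they'd be wired together here is my assumption.

```python
import numpy as np
from PIL import Image

def mask_to_inpaint_image(mask: np.ndarray, dilate: int = 8) -> Image.Image:
    """Turn a boolean segmentation mask (e.g. from Segment Anything after a
    user click) into the white-on-black "L" mask image that Stable Diffusion
    inpainting pipelines expect. A small dilation helps the model blend the
    borders of the edited region."""
    m = mask.astype(np.uint8)
    if dilate:
        # cheap square-kernel dilation via shifted ORs, avoiding an
        # OpenCV dependency
        h, w = m.shape
        padded = np.pad(m, dilate)
        out = np.zeros_like(m)
        for dy in range(-dilate, dilate + 1):
            for dx in range(-dilate, dilate + 1):
                out |= padded[dilate + dy : dilate + dy + h,
                              dilate + dx : dilate + dx + w]
        m = out
    return Image.fromarray(m * 255).convert("L")

# Upstream/downstream calls (too heavy to run here) would look roughly like:
#   from segment_anything import SamPredictor, sam_model_registry
#   predictor = SamPredictor(sam_model_registry["vit_b"](checkpoint=...))
#   predictor.set_image(np.array(room_photo))
#   masks, scores, _ = predictor.predict(point_coords=np.array([[x, y]]),
#                                        point_labels=np.array([1]))
#   from diffusers import StableDiffusionInpaintPipeline
#   pipe = StableDiffusionInpaintPipeline.from_pretrained(
#       "runwayml/stable-diffusion-inpainting")
#   edited = pipe("empty wall", image=room_photo,
#                 mask_image=mask_to_inpaint_image(masks[0])).images[0]
```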
Leveraging multiple models here (Segment Anything is a good candidate) seems like a must to build a defensible product. Using just SD with basic control net and maybe some additional finetuning makes it very cloneable. If you leverage other models you can do things like identify what elements of the room must be preserved (walls, windows) and what can be safely removed or edited. This could allow you to generate a simulated image of the room without furniture as a base. Derive a 3d projection to allow the user to place basic 3d furniture stubs within the space. Then use those stubs to create subject masks with the furniture-less base to render a final room.
Tried it on my basement and the generated picture wasn't even close to relevant. I think the issue was that I have a ton of junk in my basement and the generated image had no idea about what was junk and what was important in the room. It closed off a window, made my ceiling taller, and added a giant window in a place where it's just not possible.
Cool! Just tried it out with my partner (in the process of designing their living room in a new place).
UX suggestion: let me select a new style without having to re-upload the photo. We wanted to check out a bunch of styles on the same photo, but doing so took 10 clicks instead of just 1!
Yes, certainly. The image generation process uses a variant of the text-to-image model Stable Diffusion. The frontend collects requests and sends them to a two-layered backend: the first layer determines the appropriate prompt to send to the worker, while the second layer calls the image generation API. Once the image is generated, it is sent back to the first layer and then returned to the client.
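A minimal sketch of that two-layer flow. The function names, the style-to-prompt table, and the worker interface are all illustrative guesses, not the app's actual code:

```python
# Layer 1 owns prompt construction; layer 2 (passed in as `generate`) wraps
# the actual image-generation API and returns image bytes.

STYLE_PROMPTS = {
    "modern":       "a modern interior, clean lines, neutral palette",
    "scandinavian": "a scandinavian interior, light wood, cozy textiles",
}

def build_prompt(room_type: str, style: str) -> str:
    """Layer 1: turn the client's request into a text prompt."""
    base = STYLE_PROMPTS.get(style, "a tastefully designed interior")
    return f"photo of {room_type}, {base}, interior design, photorealistic"

def handle_request(room_type: str, style: str, photo_bytes: bytes, generate):
    """Layer 1 entry point: build the prompt, hand it to the layer-2
    worker, and relay the generated image back to the client."""
    prompt = build_prompt(room_type, style)
    return generate(prompt=prompt, image=photo_bytes)
```

Keeping prompt construction in its own layer means styles can be added or tuned without touching the worker that talks to the generation API.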
Really cool. My girlfriend is into interior design and I've been showing her projects like this as they come along: in her view these are some of the best results yet. A few first impressions though:
- They need to add a filter to the uploaded images (or at the very least curate the front page), maybe some kind of classifier that determines if people are in them. This kid [1] is all over the examples if you click "load more". Absolutely hilarious, but I'm sure that's not what they're going for.
- The model loves to change "boring" exteriors into scenic vistas. Brick wall? -> Cityscape. Trees? -> Ocean view. Yes, that does improve the room, but I'm not sure moving your room to beachfront property or up 40 stories is practical for most people.
- The model seems to have a bias towards making rooms darker. Dark/moody scenes are usually not how designers choose to show off their work, or what most people want.
Hi, thanks for the feedback! An NSFW image filter will be added to the pipeline, and the upcoming version will let you modify results using prompts.
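One way such a filter could slot into the pipeline, gating uploads before they ever reach the generator. This is only a sketch: `classify` stands in for whatever NSFW/person classifier ends up being used, and the label names and threshold are assumptions.

```python
def accept_upload(photo_bytes: bytes, classify, threshold: float = 0.5):
    """Return (ok, reason). `classify` is a stand-in for a real model that
    maps image bytes to a dict of label -> probability, for labels such as
    'nsfw' and 'person'. Rejected uploads never reach the generator."""
    scores = classify(photo_bytes)
    for label in ("nsfw", "person"):
        if scores.get(label, 0.0) >= threshold:
            return False, f"rejected: {label} detected"
    return True, "ok"
```

Filtering for "person" as well as "nsfw" would also address the front-page curation issue raised above.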