Run CLIP on iPhone to search photos (mazzzystar.github.io)
397 points by mazzystar on Feb 7, 2023 | 134 comments



I love this app, particularly because of the frustration I had with Apple Photos and how bad its search is compared to Google Photos. Playing around with it, here are some notes:

- The layout on my 2020 iPhone SE is a bit messed up; the first picture overlaps the search bar, which is a bit frustrating.

- I am not sure the app uses geolocation. Searching for Paris, it identified some pictures with typical Parisian architecture, but not all. Is it actually possible to include geolocation data? (EDIT: OK, read your other comment about it. Regarding servers, the app is already 300 MB; I think you could hardcode a list of, say, 10k places (e.g. a circle of 5 km radius around the lat/long center of Paris, a circle of 100 km around the lat/long center of a country, or multiple circles).)

- I very much like the privacy-conscious no-network-connection aspect.

- PLEASE (if possible) include a link to the original picture in my Photos app, so the workflow would be: I search for a picture -> I open it in the Photos app, where I can see the other ones taken around the same time. If that is not possible (which I could imagine is the case), your workaround of displaying the date and time of the picture is probably the next best thing.

Thanks for making this app!


Thanks for your review. I'm new to iOS, so some of the values for the UI layout are hard-coded, and I have only tested it on my iPhone 12 mini and the iPhone 14 Pro simulator. Thanks for telling me about the display problem on the iPhone SE; I will try to fix it in the next version.

Regarding geolocation, latitude and longitude are available in the photo metadata, but converting them to specific cities requires access to a reverse-geocoding service, which means an internet connection is required, and that is not what I would like to see.

Jumping to albums from the app is not allowed, but you can find the 'i' icon at the bottom of the photo detail page; tap it and you can see the date information.


Some more field feedback:

The app works beautifully and the layout looks great on the latest 12.9" iPad Pro, including in the new iPadOS windowing where you can resize the window to arbitrary sizes.

Stage Manager: https://support.apple.com/guide/ipad/move-resize-and-organiz...

The app also indexed 160,000 photos in a tiny window while running other things (Safari, Teams, Outlook, etc.)

On the overall use case, I found the results are on point, far better for descriptive photo queries. It only takes about 9 seconds to find good results from the 160K photos.


Glad to hear that. I only have 35,000 photos myself, and a search takes about 3 seconds (on an iPhone 12 mini). I will consider speed optimizations for search in a later version.


You don't need that; check out this implementation, for example: https://github.com/richardpenman/reverse_geocode/


Thank you for the link!


Out of interest, are you using SwiftUI or UIKit? With SwiftUI, the layout you have right now should be very easy to write and will work on all displays; if you are not using it, that could be worth looking into.

> but converting them to specific cities requires access to a geolocation conversion service

I made an edit to my original comment: you could store certain areas (circle with radius / polygon) and check if the given coordinates fall in that area, but it's not something that is crucial for this app to be useful.

> but you can find the 'i' icon at the bottom of the photo detail page, click it and you can see the date information

Yes, I saw that; it's quite useful. A shame that Apple restricts jumping directly into Photos.


1. Layout: I'm using SwiftUI, but I'm having trouble with `Form` (you can see that the shadow below the search box extends into the search results area). I will try to solve these problems.

2. Geolocation: Yes, I have this plan too: store a large mapping table of major cities and then find the closest known city with some calculations.

3. Date information: Haha, I think so too.


I have a SQLite database created from OpenStreetMap with 6M places, all geohashed, if you want to add it to the system.


I'm a little afraid that the geographic data file would be too large and further increase the size of the app, so I'll try the simple but imprecise way first; if that doesn't work I'll come to you, thanks!


It is large, but you could probably download it post-install if someone wants place names for their photos.


But the most important property of this app is that it doesn't touch the network lol, so it seems necessary to think of other ways.


That's why I landed on a SQLite database too :). It could be drastically reduced by filtering out unneeded locations. 6M is a lot.


I searched for "dog" and it found dogs, including in cartoons, but it also tagged photos of myself as a dog. Not sure if it's due to the facial hair or some other trait I share with dogs.


It is using CLIP embeddings and showing you nearest neighbors, not tagging them or identifying anything in the photo.


Update: the UI layout has been made flexible in v1.2.0.


I did this last year as a proof of concept. Glad someone is doing it more seriously.

https://twitter.com/getrememberwhen

I recently added faces to it and have it set up for geo searches, but too little time to work on it.

You can't link to photos in the Photos app, sadly.


Wow! There are people in the world trying to do almost exactly the same thing as me. I'll try your TestFlight version; thanks for letting me know that "remember when?" exists!

At this time last year I had just had the idea, but I was quickly overwhelmed by work. It wasn't until I quit my job recently that I had full time for development. If possible, we may work on something together in the future :)


The latest version uses too much memory so some people have issues with it getting killed. Haven't gotten around to fixing the issue though.


The iPhone SE overlap issue has been fixed in the new version (v1.2.0). I have submitted it for review; you can expect to see this version within 24 hours.


Nice work! I've made a similar demo as a web app [1] and obtained a few orders of magnitude faster performance: around 50 ms for 1.6M images on a single laptop core. I have limited experience with Core ML, but there are a few tricks that could probably help you too. If you feel like hacking on it together, shoot me an email :) (in profile)

[1] https://paulw.tokyo/post/real-time-semantic-search-demo/


Thank you, I took a brief look and it's fantastic! I'll take a closer look tomorrow! (It's late at night here.) Would love to work with you if possible :)


What did you use as the index structure for the nearest neighbor search? Did you use a library like FAISS or Annoy?


No, I just calculate the cosine similarity one by one in the simplest way.
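
Roughly like this; a minimal sketch (not the app's actual code) that assumes the embeddings are already L2-normalized, so the dot product equals the cosine similarity:

    import Accelerate

    // Brute-force scan: score the query against every photo embedding,
    // then keep the k best. With normalized vectors, dot == cosine.
    func topMatches(query: [Float], photoEmbeddings: [[Float]],
                    k: Int = 20) -> [(index: Int, score: Float)] {
        let scored = photoEmbeddings.enumerated().map { (i, embedding) in
            (index: i, score: vDSP.dot(embedding, query))
        }
        return Array(scored.sorted { $0.score > $1.score }.prefix(k))
    }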


Very nice! I immensely appreciate the "no network connections whatsoever" approach, and search quality looks useful so far.

I have a problem, however: the app only sees a fraction of my images. I do not use iCloud, but back up and sync locally with macOS.

Photos.app/sync is configured to copy & delete photos from iOS to Mac, then sync them back through a "all photos of the last two years" smart album.

Those photos appear in iOS Photos, but seem to be invisible to Queryable - it only sees those photos taken after my last backup & sync. Directly after backup and sync, it sees no photos at all.

I think I've seen that once before in another app, so that is probably a bug in iOS/macOS and nothing you can fix, but I thought I'd let you know anyway.


Thank you for letting me know! I will check it out.


Please do take a look at this. It's how I sync and store photos as well. Still happy to pay and support your development!


I'll take a look, but I don't necessarily have the ability to solve it (will try)


If you need logs, more info, someone to test - drop me a line at tent_optics_0d@icloud.com.


I don't buy apps, but this was an insta-purchase.

I like Apple's photo search; I use it daily and can't complain much about it (except that it thinks "chicken" only refers to the fried variety).

Queryable can find live chickens, so I'm happy already.

One thing I'd really need, though, is the ability to open the picture in the Photos app after it finds one.


As soon as I realized what this was, I instantly dropped $2 to get it. Excellent work!


This is very very good.

I use Google Photos on my iPhone just so that I can do searches like this on my library. (And so that I can auto-share photos with my family; half of them are on Android.)

On some queries, this is much better than that. Which is amazing, given that it runs locally.

It still doesn't have the face recognition and the geographical information that Google Photos has.


Thank you for liking it. Face recognition involves more models. As for geolocation, you can actually get latitude and longitude from photos, but converting them to specific cities and neighborhoods requires access to geocoding servers, which would force the app to use the network, and that is not what I would like to see.


For geolocation, maybe consider this workflow: the user searches for a city in the iPhone Maps app (by Apple), taps the "Share" button, then "Copy to Clipboard", then pastes this location into your app; you grab the longitude & latitude from that and let users specify a radius. Bam! Complicated... but no network needed.


Someone else has commented with a list of cities which could be included offline: https://news.ycombinator.com/item?id=34690693

But your workflow is such a hassle that I doubt many people would do it. A more realistic flow (without the list of places above) would be to show an image of the world map, allow the user to zoom in, and let them drop a pin on the map. They can drop a pin near some pixels labeled Paris, but the app doesn't need to know that it's Paris, it just needs to translate the pixel coordinates of the pin (relative to the image) to geographical coordinates, and look for images close to these coordinates.
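
For what it's worth, if the world-map image is an equirectangular projection, the pixel-to-coordinate conversion is just two lines of arithmetic. A minimal sketch (the function name is mine):

    import CoreLocation

    // Map a pin's pixel position on an equirectangular world map to
    // geographic coordinates: x spans 180°W..180°E, y spans 90°N..90°S.
    func coordinate(ofPinAtX x: Double, y: Double,
                    imageWidth: Double, imageHeight: Double) -> CLLocationCoordinate2D {
        let longitude = (x / imageWidth) * 360.0 - 180.0
        let latitude = 90.0 - (y / imageHeight) * 180.0
        return CLLocationCoordinate2D(latitude: latitude, longitude: longitude)
    }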


Potentially the app could link out to a hosted webpage that allows selecting a location using a decent map and search UI, with network capability (but without requiring it in app)

Then on selecting a location you present a button that then jumps you back into the app via a universal link (passing in lon/lat)


Would it be possible to have some sort of offline database to query against? Or maybe that would make the app too heavy. It could be an "opt-in" download.


Maybe an offline mapping table plus a bit of distance calculation can roughly solve this problem, just not very precisely for cities.


Why not? How many cities do you think there are?

Including a dump of just lat/lon/radius->city from OpenStreetMap is probably kilobytes. Including towns and features might be low megabytes.


Maybe you're right, I'll try.


Mapbox added offline map support a few years back in their SDKs. I found a couple of Medium articles on it, but no code to share. Wondering if there might be any Mapbox FOSS replacements that offer this?


My current idea is roughly: store the latitude and longitude of the world's major cities in a txt file, load it, and then estimate the distance between each city and the current photo's coordinates. Not sure about the space taken and the time consumed.
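
A minimal sketch of that idea, assuming one "name,lat,lon" row per city in the bundled file; the haversine formula gives the great-circle distance, and a linear scan over ~10k cities is effectively instant:

    import Foundation

    // One row of the bundled city table: "name,latitude,longitude".
    struct City {
        let name: String
        let lat: Double
        let lon: Double
    }

    // Great-circle (haversine) distance in kilometers.
    func haversineKm(lat1: Double, lon1: Double,
                     lat2: Double, lon2: Double) -> Double {
        let r = 6371.0 // mean Earth radius, km
        let dLat = (lat2 - lat1) * .pi / 180
        let dLon = (lon2 - lon1) * .pi / 180
        let a = sin(dLat / 2) * sin(dLat / 2)
              + cos(lat1 * .pi / 180) * cos(lat2 * .pi / 180)
              * sin(dLon / 2) * sin(dLon / 2)
        return 2 * r * asin(min(1, sqrt(a)))
    }

    // Linear scan over the whole table; no index structure needed.
    func nearestCity(toLat lat: Double, lon: Double, in cities: [City]) -> City? {
        cities.min {
            haversineKm(lat1: lat, lon1: lon, lat2: $0.lat, lon2: $0.lon)
                < haversineKm(lat1: lat, lon1: lon, lat2: $1.lat, lon2: $1.lon)
        }
    }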


If I were you, I would not try to add geolocation; you won't match Google anyway, and it will take you way too much time.

But it's up to you.


Thank you for making it a paid app instead of ad-supported. Got my copy!


In fact, I would have preferred a free + in-app purchase model, to at least let users try it first. But that requires app networking.


Why no networking? I guess Apple requires authors to jump through extra hoops to prove that nothing personal is being sent out?


They do; they also require that any server you connect to is IPv6-capable, because they run all their testing on an IPv6-only network.

It's just more hoops to jump through, so simply avoiding it is best if you can.


Makes sense.


This is really cool!

The page starts "I built an app called Queryable, which integrates the CLIP model on iOS to search the Photos album OFFLINE" - just to clarify, the Apple Photos default search runs offline too (and does the indexing offline). Google Photos search needs a connection, Apple Photos doesn't.


Thank you for clarifying. Emphasizing 'offline' was not meant as a comparison with other products, but to dispel readers' privacy concerns.


Nice blog post.

I wonder if it's possible to speed up the search with something like https://github.com/google-research/google-research/tree/mast...

Also kind of surprising that something like this is not officially supported already! In my book, that means this is a Good Idea.


I have the same intention as you. I found ScaNN's search super fast when I was working on a retrieval-based model. But I'm not sure about the feasibility of porting it to iOS, especially since the app size would grow with two models stacked, and the current speed is tolerable.


My friends work on the ScaNN team!

ScaNN doesn't use a second ML model, it's just an efficient way to store a bag of vectors. But if the software is too heavyweight for you, you don't necessarily need all of the vector quantized multi-level trees and bit twiddling tricks, you can implement 20% of the work for 80% of the speed gain.

Here are a few super simple approaches:

- Random projection: instead of doing the search in 1024-dimensional space, randomly project your vectors down to 16 dimensions and do the search in that space instead. Objects that are far away in this smaller subspace are at least that far in the original space, so you can use this heuristic to prune most of the dataset away; then, you can rank the closest items using the full nearest-neighbor search to get exact results.

Dead simple, lossless heuristic; the speedup factor is (old dimensionality) / (new dimensionality). (A minimal sketch follows after this list.)

- Locality-sensitive hashing: in addition to storing the vector representation, store its hash. This can be quite simple, e.g. random projection LSH converts the vector into a series of bits, according to which side of a random hyperplane that vector falls on; see https://www.pinecone.io/learn/locality-sensitive-hashing-ran... Unfortunately, this will be "lossy" - some images will be missed if they fall into different hash bins, and far images may be hashed to the same value if the region w/ equivalent hash is thin/narrow/oddly shaped.

More complicated, lossy results, can search unlimited images in constant time.

- Multi-tree lookup: break the space down into a KD-tree and search that instead. Complicated, and it stops working in fairly high dimensions (certainly don't use this on 64-dimensional vectors or higher)

ScaNN integrates most of these techniques, plus a few more arcane approaches that depend on processor-tuned heuristics. The gory details are available in the team's ICML2020 paper, see http://proceedings.mlr.press/v119/guo20h/guo20h.pdf
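
To make the random projection approach concrete, here is a minimal Swift sketch that assumes the projection matrix is generated once at index time and reused for every query (the names are illustrative):

    import Accelerate

    // Project all embeddings down to `d` dimensions once at index time;
    // search in the small space to get candidates, then re-rank the
    // candidates with full-dimensional cosine similarity.
    struct RandomProjection {
        let rows: [[Float]] // d random rows, each of the original dimensionality

        init(from originalDim: Int, to d: Int = 16) {
            // Uniform random entries; Gaussian is the textbook choice,
            // but either works for candidate pruning.
            rows = (0..<d).map { _ in
                (0..<originalDim).map { _ in Float.random(in: -1...1) }
            }
        }

        func project(_ vector: [Float]) -> [Float] {
            rows.map { vDSP.dot($0, vector) }
        }
    }

At query time, project the query vector, scan the 16-dimensional copies to keep, say, the top few hundred candidates, then re-rank just those with the full vectors.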


I recall trying to integrate it into my SDK (a C++ SDK) without luck. I also remember trying to write an article on how to integrate my SDK with Vertex AI vector search; it was so complex I would have needed to write a book, while Milvus and OpenSearch each took just a blog post... UX is important.


I really appreciate your reply! The methods you mention are great; I can see at first glance that the "random projection" method will work very well. I will investigate the other methods you describe. Thank you again for your great ideas!


I tried integrating hnswlib (header-only C++) with my app and it wasn't trivial. It probably isn't necessary unless you are searching millions of vectors or have very low latency requirements.


What do you use to index image vectors? Does a "retrieval-based model" not require an index?


The "retrieval based model" refers to https://github.com/CompVis/latent-diffusion#retrieval-augmen..., which uses ScaNN to train a knn embedding searcher.


Excellent app! Bought it just from seeing your demo :) Still indexing 35k of 64k photos. Is it able to search for photos "like this"? So when I open / find a photo, is there a function that says "find photos like this"? :)


It's a brilliant idea to find images by text and then find other similar images by image. It's not supported now, but I'll try.


Thank you for considering it :) I know I have hundreds of photos of one scene or a device or an object, taken across several years, but iOS Photos is not able to find similar objects in photos, just faces. It would be very useful to be able to find those similar photos :)


Feature added. In the latest version (v1.2.1) of Queryable, you can find more similar photos from a specific photo result. Have fun : )


Wow, that's awesome! Just found all the renovation photos of our apartment taken across weeks. Thank you very much for implementing it :)


This is awesome! Could you share more details on how you’re storing the image embeddings and performing the KNN search? Is there an on-device vector database for iOS?


Thanks for your attention. I did not use any database; I stored the computed embeddings as object files. When the user opens the app, the program preloads them and turns them into `MLMultiArray`. When the user searches, these `MLMultiArray`s are traversed and the similarity is calculated for each one.
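
Loading a stored embedding back into an `MLMultiArray` might look like this; a minimal sketch that assumes raw little-endian Float32 files (the app's actual on-disk format may differ):

    import CoreML
    import Foundation

    // Read one photo's embedding from disk into an MLMultiArray.
    func loadEmbedding(at url: URL, dimension: Int = 512) throws -> MLMultiArray {
        let data = try Data(contentsOf: url)
        precondition(data.count >= dimension * MemoryLayout<Float32>.size)
        let array = try MLMultiArray(shape: [NSNumber(value: dimension)],
                                     dataType: .float32)
        data.withUnsafeBytes { raw in
            let floats = raw.bindMemory(to: Float32.self)
            for i in 0..<dimension {
                array[i] = NSNumber(value: floats[i])
            }
        }
        return array
    }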


Congrats on the launch! I appreciate the simplicity of the design and your privacy-consciousness. Purchased right away. iOS desperately needs this.

I was similarly frustrated with Photos and worked on a similar idea. My first prototype used CLIP-as-service running on macOS, which is quite fast but meant for building web services. I eventually got the concept running on iOS using CLIP via Core ML but still have some problems to solve. I've faced many of the challenges described in this post. Feel free to reach out if you want to compare notes. Best of luck!

By the way, I encountered an issue on my first load. Sorry, I can't send a screenshot at the moment. In the auth flow, I granted permission for only a subset of photos (about 200), and now I see "Total -1 Photos need to be indexed". So maybe there's something wrong with that case. Also, the user should be able to update permissions from Settings after the first time, in case they want to allow more or less access in the future.


"described in this post" I'm interested in your posting, is there a link?

Also, Queryable doesn't seem to support granting permissions to only some albums at the moment, and while I know that sounds silly, I really don't currently deal with this issue. You can uninstall and then reinstall, grant full photo access (if you want), and try again.


That makes sense and thanks for the solution. And sorry, my comment was not written well. By "post" I meant "thread". If I write anything up I'll be sure to share with you. Congrats again, this is impressive!


This worked really well for me. Example search “snowman with hair dryer” found this cartoon (which I have locally on my phone). It wasn’t the first hit but was in the top 5 or so.

https://www.allposters.com/-sp/Carl-No-New-Yorker-Cartoon-Po...

Nice work!


Yes, my initial purpose was also just to find a certain photo, so the exact ordering is probably not the most important thing. The reason for enlarging the first photo is that sometimes a very funny result ranks first, which users can screenshot and share.


Kudos for the project! I wonder why Apple themselves can't do something like this. iOS Photo app search is abysmal.


Big companies are always a little slower; I believe Apple will integrate something like this into Photos in iOS 17/18.


Works great! Would love to have the feature to find similar photos, something like this: https://rom1504.github.io/clip-retrieval/


Now in the latest version (v1.2.1), you can find more similar photos from a specific photo result. Have fun : )


Will try to add that feature in the next version :)


Bought the app. Tells me it doesn’t run on iPhone XS. Any particular reason for this? Any chance you can or will make it work?


From the Q&A on the same page:

3. Any requirements for the device?

    iOS 16.0 or above
    iPhone 11 (A13 chip) or later models


Sorry to have wasted your time. The reason for not supporting the iPhone X series is not known to me for now (users reported that they have problems running it); please request a refund, and I will get back to you if I fix this issue in the future.


I bought it as well, knowing that it would not work on my XS. I figure that once I upgrade my phone I'll want Queryable, and 2 bucks now is worth saving the trouble of trying to find the application later.

Anyway, I want to report a minor bug: the text displayed on the screen saying that the XS isn't supported runs off the left and right margins of the screen rather than wrapping. The displayed text is "ble does not support iPhone X/Xr/Xs, please req".


This probably comes from a quirk: the iPhone X simulator is not detected as an iPhone X, so that line was never actually displayed in my testing. Thank you, it will be fixed in the next version.


I bought it as well, saw the 'no XS' message, and downloaded the app on my iPad (Pro, 12.9", A12Z). Indexing worked fine, but when I search for anything, I just see a large black square. Any reason for not supporting iPadOS?

Edit: I saw further down that only A13 and up are supported, even though I did not get a warning about running on an A12. Bummer.


Sorry for wasting your time, I feel a little guilty hearing this, please request a refund.


No worries, finally got it working on my partner's iPhone 13. We have a family account, so the purchase was not wasted.


Thank you for letting me know about this.


Don't worry about a refund. It's fine. But if you ever do get it working, it would be great if you could ping me.


I've bought it - I would also buy a desktop version if you're interested in making one.


I use Mac Catalyst to convert the iOS app directly to a Mac app, so the Mac (only tested on M series) is currently supported. But the UI may not look great (sorry, I haven't had time to optimize it), and it only supports searching for photos in the "Photos" app.


Thank you! I can confirm it doesn't work on Intel Macs. But it does indeed work on my M1 laptop. This is going to be a huge timesaver.


It does actually work on Intel Macs, albeit very slowly. I left the process running in the background and my computer kept locking up. Once I realised what was causing the lock-ups, I checked the process and it had indexed only a very small number of the photos.


Thank you for letting me know.


This is a great app. I've also been looking for a better image search for iPhone.

The only thing that I think could be improved: allow loading the original images in the search results. If you are using iCloud to store the images, then only the small cached versions are returned.


In the latest version, you can allow the app to use the network in order to download iCloud photos, but this will only happen when: 1. the photo is indeed stored in iCloud; 2. your search results include this photo; 3. you go to the photo detail page and tap the 'Download from iCloud' icon.

Only then will Queryable request network permission in a pop-up window. After granting it, kill the app and reopen it, and it will automatically download from iCloud.


Yeah you are actually right, thanks. Didn't notice it.


Can it use the person names on the iPhone? I mean, the iPhone already does a splendid job recognizing faces. If the app could access that and combine the scene description with persons and other properties of the photos, it would be fantastic.


This would require an additional model for face recognition, but even if this feature were added, it would still require the user to input 'who is this', which is probably better suited for Apple to do itself.


The poster is asking if you can access the Apple face matching results on this, perhaps via the contacts API.


Ah, I get it! Will give it a go.


This is awesome! Took maybe 5 minutes (didn't time it, sorry) to build the index for 20k photos on my 14 Pro, and about two seconds to execute a search.

Once a photo is found, is it possible to jump to that photo in the Photos app?


Thanks for your feedback. But jumping to the Photos app is not allowed by Apple. You can find the photo in your albums by date via the 'i' icon at the bottom of the photo detail page, but I think maybe in the future it will be possible to display other photos taken around the same time directly, based on the date information.


Would it be possible to add an option to not search photos in the Hidden album?


Currently it uses a "smart album" to get all photos; I will check whether users could exclude specific albums.


When you search for something, and click on a photo, it would be nice if the search query was visible while looking at the photo (I'm taking screenshots of the app and sending them to people, so the context helps!)


Sounds like a good idea! I'm new to iOS and will try to add this feature in the next release :)


Bought it. It’s very good. Indexed my 15,000 photos in a few mins (iPhone 14). Search seems fast, a few seconds only.

Am I missing a way to jump from the photo preview to the photo in the Photos app?


Glad to hear your feedback! However, Apple does not allow third-party apps to jump to the Photos app.


How much space does this take ~per 1000 images?

Does this work with a terabyte of offloaded iCloud images? Does it generally work with offloaded images?


The embedding data should be no more than 10 MB (for example, at 512 Float32 values per photo, 1,000 photos come to about 2 MB). I didn't understand the second question; the app will resize each image to 224×224 and then process it.


Say on my 64 GB iPhone I have stored 50 GB of images, but in iCloud I have stored 1,000 GB of images. I can access them seamlessly from the Photos app, but does your app analyse those pictures? It would lead to a download of 1,000 GB of pictures as well.


Your question is answered at the end of the post in the Q&A section, "What if my pictures are stored in iCloud?". The answer is yes.


That's great. Can I have a Linux version with the same simple UI, please? Would paid plugins for darktable use be OK?


Any chance of a macOS version, and especially a console version?


I used Mac Catalyst to convert the iOS app directly to a Mac app, so the Mac is currently supported. But I only tested it on my M2 MacBook Air (the M series should be fine), and the current version only supports searching photos in the "Photos" app; it may support searching pictures on the whole hard disk in the future.


I'd pay for an (Intel) Mac version that can index an arbitrary folder of images.

It's something I've been playing with myself off and on, but at a much poorer level than your implementation.


Android?


Sorry, I'm an algorithm engineer who just learned SwiftUI development, so I don't know anything about Android development, but I think Android should have a deep learning framework similar to Apple's Core ML.


oh okay no worries


More examples of commercially significant fair use! Keep it up!

Or would anyone like to argue that this tool is a violation of someone's rights because their copyright-protected works were used to train the CLIP models, and that they should be compensated?


You're kind of arguing against a straw man.

The people who don't like these models don't like the ones that are poised to replace artists. They are not really worried about things like CLIP grouping images together.

I do believe that makes their copyright argument slightly bad faith. The only valid argument in this context would be that they are angry that Stable Diffusion is redistributing their images, because training on their images clearly hasn't been an issue for at least a decade.


That is of course the point I am making.

I don't see how it is a straw man at all. CLIP is an integral part of Stable Diffusion and is where all of the artists' names are embedded. It has the exact same issues with fair use as any other model trained on data without permission.

The courts have already made a distinction between the tool and the outputs of the tool, starting with Sony v Universal.

The SD defense will absolutely use programs such as this iOS CLIP app to show “commercially significant fair use”.

Someone can’t say that a model is fair use when used for natural language search of their own photos but then that same model is not fair use when used to create a new image. The model is either fair use or not. The outputs are either infringing or not, and this assessment is not based on the tool used whatsoever.


This is disappointing. I just purchased the app and it opened, but I got an error: https://imgur.com/a/6MKVqq4

Edit - I saw in the "Settings" screen that there is no support for the iPhone XS and that I need to request a refund. I've done that now.

Luckily Apple makes it super easy to get refunds!

I'm excited to try the app once it starts working on my device.


Sorry to waste your time; I don't know how to block the iPhone X/XR/XS and other devices with chips older than the A13.


Do you specifically need the A13? There's a UIRequiredDeviceCapabilities entry that goes in your Info.plist for "iphone-ipad-minimum-performance-a12", but not one for exactly the A13…

https://developer.apple.com/documentation/bundleresources/in...
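
For reference, that capability is declared in Info.plist like so (an A12 floor, which would still let the XS/XR through):

    <key>UIRequiredDeviceCapabilities</key>
    <array>
        <string>iphone-ipad-minimum-performance-a12</string>
    </array>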


Yes, I didn't know I could set "iphone-ipad-minimum-performance-a12" when I first developed it, and I was not allowed to add this restriction in a later version (Apple holds that it would prevent some users who had already downloaded the app from using it). Even that restriction would still let the iPhone XS/XR (A12) through, so I had to warn those users inside the app instead.


I don't know if your app just runs slowly on the A12 and lower or if it doesn't run at all. But if it's the latter, perhaps you could detect the device model on start and show an alert explaining the situation to the users?
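
A minimal sketch of such a check, reading the hardware model identifier via uname (the identifier list below is illustrative, not a complete table of sub-A13 devices):

    import UIKit

    // Read the hardware model identifier (e.g. "iPhone11,2" for the XS).
    func modelIdentifier() -> String {
        var systemInfo = utsname()
        uname(&systemInfo)
        return withUnsafeBytes(of: &systemInfo.machine) { raw in
            String(decoding: raw.prefix(while: { $0 != 0 }), as: UTF8.self)
        }
    }

    // Show an alert on unsupported hardware instead of failing silently.
    func warnIfUnsupported(on viewController: UIViewController) {
        let unsupported: Set = ["iPhone10,3", "iPhone10,6",  // X
                                "iPhone11,2", "iPhone11,8"]  // XS, XR (examples)
        guard unsupported.contains(modelIdentifier()) else { return }
        let alert = UIAlertController(
            title: "Device not supported",
            message: "Queryable requires an A13 chip (iPhone 11 or later).",
            preferredStyle: .alert)
        alert.addAction(UIAlertAction(title: "OK", style: .default))
        viewController.present(alert, animated: true)
    }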

I would also consider putting this information at the very top of the description to try to avoid bad reviews.

---

One more piece of feedback: I struggled to find the App Store link in your blog article. I would definitely recommend using Apple's well-known marketing badge[0]; you can still enhance it with your brand if you want to.

Congrats on the app and good luck!

[0] https://developer.apple.com/app-store/marketing/guidelines/


Wow so if I understand correctly: you’ll need to allow downloads from those phones forever, because of a small number of initial downloads? That’s quite an App Store policy


Apps usually introduce requirements based on software versions, which eventually excludes older hardware.


It seems a little strange that it doesn't work on the A12; it could be a performance issue. If the UI doesn't update for a set amount of time, the app will be killed. This can be solved by moving the offending blocking code to a background thread and adding an activity indicator on the main thread until the job is done.


For some users it crashes while building the index; others build the index normally, but then the search results are the same for any search term.


I can't remember what the timeout value is, but say the A12 puts some users over the limit, due to library size for example. That could be the issue.

(I'm sorry, I haven't downloaded the app, so I'm guessing at your UI; I just know it's a fairly common mistake.) If you're just putting up a dialog saying "building index", running the code to build the index on the same thread, then dismissing once done, it will lock out UI updates, eventually leading to a crash (I think the timeout is 30 secs).

Running code on a high-priority background thread with a callback to the main thread is fairly trivial on iOS: there's the fine-grained control of GCD, Combine for reactive code, and the new async/await pattern.

A UIActivityIndicatorView can be used to show activity without knowing progress.
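
A minimal sketch of that pattern with GCD (buildIndex() stands in for the app's real indexing routine):

    import UIKit

    // Placeholder for the app's real CLIP indexing work.
    func buildIndex() { /* long-running work */ }

    // Run indexing off the main thread so the watchdog never sees a
    // blocked UI; a spinner keeps the user informed in the meantime.
    func startIndexing(spinner: UIActivityIndicatorView,
                       completion: @escaping () -> Void) {
        spinner.startAnimating()
        DispatchQueue.global(qos: .userInitiated).async {
            buildIndex()
            DispatchQueue.main.async {
                spinner.stopAnimating()
                completion()
            }
        }
    }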


Thank you for letting me know about this case, I will check it out :)


No worries, I will use this as an excuse to buy a new iPhone 14 Pro or similar. Been needing an app like this for years.


The website clearly states the requirements:

- iOS 16.0 or above

- iPhone 11 (A13 chip) or later models


That's not how it works on mobile platforms. Developers usually set the targets correctly so that incompatible devices can't install the app, rather than relying on users to read some text on some website (not the App Store where they install it).


While true, is there a way to achieve this?

AFAIK you cannot set arbitrary compatible devices. You can only set UIRequiredDeviceCapabilities, which are quite limited[0]. There is a required capability for `iphone-ipad-minimum-performance-a12`, but not for A13.

And also if you already have the app in the App Store, you can only relax these requirements. You cannot add new ones as that could prevent users who already bought the app from launching it.

BTW the minimum requirements are listed in Queryable's App Store description as well, although I would consider putting them on the very top.

[0] - https://developer.apple.com/documentation/bundleresources/in...



