Don't forget to have it search http://stackoverflow.com in the various PHP tags. Also, having it search the various framework sites' articles, question sites, documentation, etc. would be very useful as well.
EDIT - For those who can't get the page to load, here are my results:
* grapheme_strstr
(PHP 5 >= 5.3.0, PECL intl >= 1.0.0) grapheme_strstr — Returns part of haystack string from the first occurrence of needle to the end of haystack.
string grapheme_strstr( string $haystack, string $needle[, bool $before_needle = false] )
* grapheme_stristr
(PHP 5 >= 5.3.0, PECL intl >= 1.0.0) grapheme_stristr — Returns part of haystack string from the first occurrence of case-insensitive needle to the end of haystack.
string grapheme_stristr( string $haystack, string $needle[, bool $before_needle = false] )
* iconv_strrpos
(PHP 5) iconv_strrpos — Finds the last occurrence of a needle within a haystack
int iconv_strrpos( string $haystack, string $needle[, string $charset = ini_get("iconv.internal_encoding")] )
* iconv_strpos
(PHP 5) iconv_strpos — Finds position of first occurrence of a needle within a haystack
int iconv_strpos( string $haystack, string $needle[, int $offset = 0[, string $charset = ini_get("iconv.internal_encoding")]] )
Very nice fade-in effect when results are shown - soothing and not too jarring.
I'd suggest that when no results are found, a message is displayed to that effect. There is currently no feedback to tell whether the system is still loading, broken, or has simply found no results.
Suggestion: it appears that the searches sometimes stack up in a queue if I type fast (i.e. if I type a long sentence quickly, the results keep flashing different content many times, playing catch-up). I looked through the JS quickly and I see you are trying to prevent this with runningRequest1/2, so maybe it's the short delay in the .fadeTo() after the GET callback. Try moving the runningRequest = false into the fadeTo callback.
Also, .fadeTo() has a default queue of its own.
You could also chain .stop() in front of the .fadeTo() call to clear any queued animations.
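A sketch of both fixes together, assuming a runningRequest flag like the one in your JS (the selectors, URL, and variable names here are made up for illustration, not taken from your code):

```javascript
// Illustrative jQuery sketch: clear the animation queue with .stop()
// and only release the in-flight flag once the fade has finished.
var runningRequest = false;

$('#search').on('keyup', function () {
  if (runningRequest) return;          // a request is still in flight
  runningRequest = true;

  $.get('/search', { q: $(this).val() }, function (html) {
    $('#results')
      .stop(true, true)                // drop any queued fades first
      .html(html)
      .fadeTo(200, 1, function () {
        runningRequest = false;        // release only after the fade ends
      });
  });
});
```

With the flag released in the fade callback, a burst of keyups can't start a second request while the previous fade is still queued.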
Here's another: you're searching on every keyup, which creates lots of overhead. Google's implementation is more complex. It appears they search immediately as you type the first few characters on each keyup/down, then once the word goes over a few characters it switches to querying every couple of seconds, or until you hit the spacebar. They also seem to have another timer that fires if you pause for a second or two.
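For illustration, the "query at most every couple of seconds" part can be sketched as a leading-edge throttle (the injectable clock is an assumption added so the behavior can be tested without real timers, not part of Google's actual implementation):

```javascript
// Leading-edge throttle: the wrapped function fires immediately on the
// first call, then at most once per `interval` ms after that.
// `now` is injectable so the behavior can be tested without real timers.
function throttle(fn, interval, now = Date.now) {
  let last = -Infinity;
  return function (...args) {
    const t = now();
    if (t - last >= interval) {
      last = t;
      fn.apply(this, args);
    }
  };
}
```

The leading edge matters here: the very first keystroke still queries immediately, and only the follow-up keystrokes get rate-limited.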
If it's just you using this, I would say don't give a crap. But if you plan to let others use it and it gains any userbase at all, you might consider a similar approach, for obvious reasons.
You may want to put in something that only searches if they haven't pressed a key for some number of milliseconds.
I would assume that most of us type at a fairly constant rate, so maybe check each keystroke to see how long it's been since the last one; if 500 ms have passed, go ahead and fire off the search.
Try using http://code.google.com/p/jquery-debounce/ - it creates a throttle/debounce function for you, so the search would be triggered at most once every X ms, no matter how often the event fires. It's super useful.
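If you'd rather avoid the dependency, a minimal debounce is only a few lines. This is a sketch of the same idea that library implements; the injectable scheduler is my addition so it can be tested without real timers:

```javascript
// Debounce: the wrapped function only fires after `wait` ms with no
// further calls, so a burst of keyups produces a single search.
// `schedule`/`cancel` default to real timers but can be swapped for tests.
function debounce(fn, wait, schedule = setTimeout, cancel = clearTimeout) {
  let timer = null;
  return function (...args) {
    if (timer !== null) cancel(timer);  // a new keystroke resets the clock
    timer = schedule(() => {
      timer = null;
      fn.apply(this, args);
    }, wait);
  };
}
```

Wired up as e.g. `$('#search').keyup(debounce(doSearch, 500))` (selector and handler names hypothetical), the search fires 500 ms after the last keystroke - exactly the behavior described above.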
Fast. I wanted to do something like this for code-completion, but with tagclouds. Still do in fact.
On searchforphp.com, function names are search-completed alphabetically. Could you sort them by usage instead, such as by token frequency in a corpus of actual PHP source code?
Actually, I was looking into doing that. I just need to get a large enough corpus to crunch on. I will probably download a few of the larger CMSes and frameworks and do this.
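For what it's worth, once you have a corpus the ranking step itself is simple. A sketch (the function name, tokenizer, and sample data below are made up for illustration, not the site's real pipeline):

```javascript
// Rank completion candidates by how often each identifier appears in a
// corpus of source code; ties fall back to alphabetical order.
function rankByUsage(candidates, corpus) {
  const counts = new Map();
  // Naive identifier tokenizer: runs of [A-Za-z_][A-Za-z0-9_]*
  for (const tok of corpus.match(/[A-Za-z_][A-Za-z0-9_]*/g) || []) {
    counts.set(tok, (counts.get(tok) || 0) + 1);
  }
  return [...candidates].sort(
    (a, b) => (counts.get(b) || 0) - (counts.get(a) || 0) || a.localeCompare(b)
  );
}
```

A real tokenizer would want to skip strings and comments, but even this crude version would push strpos ahead of the obscure str_* functions.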
Yes, it's something I need to fix. I'm thinking for the very first search I'll display a "loading" image, because after that the results fade out a little when you start searching again.
The speed is probably due to the site being hammered. It's only a small VPS and probably not coping very well. Thanks for the feedback though!
Sorry everyone about the problems with load. It should be somewhat more responsive now. It was down to me leaving KeepAlive on in the httpd.conf for Apache.
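For reference, the relevant httpd.conf directives look like this (the values shown are illustrative; tuning KeepAliveTimeout down is a common alternative to turning keep-alive off entirely):

```apache
# Disable persistent connections so each request frees its worker quickly
KeepAlive Off

# ...or keep it on but reclaim idle workers faster:
# KeepAlive On
# KeepAliveTimeout 2
# MaxKeepAliveRequests 100
```

With prefork Apache on a small VPS, long keep-alive timeouts tie up an entire worker per idle connection, which is why turning it off helps under load.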
Neat idea, especially to a fellow non-IDE-using full-time PHP coder such as myself. That said, I tried the first search that came to mind:
"cast to string"
And didn't really get any useful results. First result was for JSON, and the link from php.net was for stripos.
I actually did this with Google Code for a while, but removed it due to speed issues. I will have a look at the GitHub API though and see if it is any faster.
This is simple and much like google.com, which I like: no advertisements or wordy introductions, and the main function is front and center. The UI could use a bit of a makeover, but I think you've got something functional and should keep expanding the functionality before investing time in the UI, since it seems you're the only developer at the moment.
What storage solution do you use for indexing?
And perhaps I'm getting penalized by latency since I'm in Shanghai, but the response time seems pretty slow.
Thanks! UI is not my strength, but that's all I could come up with.
The indexing is very much a hybrid consisting of Sphinx, MySQL full-text search, and a custom solution in Python and PHP.
As for the speed, the site is being hammered... since it's only on a little VPS it's not coping too well at the moment. I am located in Sydney, Australia and it's a little slow for me right now too. Usually it's not too bad though.
Yes, I have. Most of the people I work with are asking for a .NET one, so I am planning on doing that first, but yes, Python is on the radar and mostly done.
Thanks! I would really appreciate feedback as I improve things. I'm pushing almost daily updates at this point, and just need to know whether they are useful or not.
Fair enough. I will be aiming to improve relevance for functions since that seems to be the main feedback.
One of those cases where the way I search for functions is not the way everyone else does. I now have a lot of data to allow me to really improve things. The function relevance is very high on the list now.
Can I ask how you are "getting" the data? Are you compiling it locally (I would assume via scraping/APIs) and putting it into a full-text search engine, or are you hitting various services (via APIs) for every search?
Love the site -- its simplicity and bareness have a certain charm.