Wiki.js (wiki.js.org)
381 points by akandiah on July 21, 2020 | 199 comments



There's one feature I need in a wiki platform that I haven't seen so far. When I write text in its WYSIWYG editor, I need the ability to paste in an image (a screenshot I just grabbed, say) and have it automatically uploaded and embedded into the text. Does this support something like that?


Just tested it for you.

It seems Wiki.js actually supports it; however, it requires you to first click "Insert Assets", and once that modal displays [0], you can paste into the page and it will be uploaded.

Not too different from Jira, really. But I feel this feature would be improved if pasting into the editor itself yielded the same result.

[0] https://i.imgur.com/o04m6on.png


MediaWiki VisualEditor supports it: https://www.mediawiki.org/wiki/Help:VisualEditor/User_guide#...

"You can upload images from a tab in the media dialog, or by dragging and dropping a file into the editor, or by pasting an image from your clipboard. [...] The image will be inserted into the page when you are done."


Now, it is fair to complain that VisualEditor is difficult to get running on your own local MediaWiki instance, as it requires a suite of local Node.js services (RESTBase and Parsoid) that aren't part of the core PHP platform.

That said, this is about to change! The upcoming Mediawiki 1.35 is supposed to move Parsoid into core PHP, and so VisualEditor is going to become a lot more default-accessible. :D


This. Thousand times this.

I have been waiting for an online wiki with the usability of Apple Notes that I use locally on all my Apple devices. It works like a charm except that I cannot make it public.

This is why I often take notes rather than write blog posts on my website. If wiki software were as easy and drag-and-drop as Apple Notes, I'd just take notes and they would turn into publicly available wikis!

I am yet to find that tool. I would happily pay for such a tool, with one breaking condition: it must be self-hostable. I will not write my content into something like Medium or Notion where I don't own my content.



Kontxt (https://kontxt.io) lets you write inline highlights, comments, polls, etc, on digital content with permission-based sharing. You can share full articles with your responses on it (example: https://www.kontxt.io/proxy/https://www.independent.co.uk/li...) or a summary view (example: https://www.kontxt.io/document/d/CChEtxTRf9lt3IZhuTSw1zES8Dg...). Soon we'll add a WYSIWYG editor so you can write your own content and reference content from other annotated sources.

Is this the type of functionality you're looking for if it could be self-hosted? How much would you be willing to pay for such a tool?


This is technically feasible. The latest HTML5 APIs (quite well adopted) allow copy-paste from the OS and drag-and-drop. There is quite a bit of server-side and JavaScript work to implement, but it is feasible.
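
A minimal browser-side sketch, assuming an editor element, a hypothetical /upload endpoint that stores the file and returns its URL, and a hypothetical insertImage() helper that writes the markup at the cursor:

    editor.addEventListener('paste', async (event) => {
      for (const item of Array.from(event.clipboardData.items)) {
        if (!item.type.startsWith('image/')) continue;   // only handle pasted images
        const form = new FormData();
        form.append('file', item.getAsFile(), 'pasted-image.png');
        const res = await fetch('/upload', { method: 'POST', body: form });
        const { url } = await res.json();                // server replies with the stored asset's URL
        insertImage(url);                                 // e.g. inserts ![](url) at the cursor
      }
    });

The server side is then just an ordinary multipart upload handler plus whatever asset bookkeeping the wiki already does.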


I will announce something like this soon:

- UI: Joplin.

- AGPL, fully self-hostable.

- You own your content (because Joplin).

- Choose among free templates, or create your own.

- Templates will be similar or compatible to Hugo; still TBD.

Optional for paying customers:

- Sync via WebDAV to my service.

- Custom domain.

- Backups, etc.

Once this starts generating money, I am planning to spend some of it to fund e2e per-folder encryption in joplin.


Though it's more of a building block, Editor.js supports this: https://editorjs.io


XWiki:

* CKEditor: https://extensions.xwiki.org/xwiki/bin/view/Extension/CKEdit...

* Syntax: https://www.xwiki.org/xwiki/bin/view/Documentation/UserGuide...

Actually, any wiki page can define a Class and how to display that class and/or instances of it:

* https://www.xwiki.org/xwiki/bin/view/Documentation/DevGuide/...

* https://extensions.xwiki.org/xwiki/bin/view/Extension/App%20...

The Script Macro is useful to make some dashboards ( https://extensions.xwiki.org/xwiki/bin/view/Extension/Script... )

I've deployed this for the internal documentation inside a company I worked for (MediaWiki was a no-go even with a visual editor).

For each new feature, I developed inside a clean new wiki, then exported the changes once I was sure everything was okay. That makes it much easier to upgrade to a new XWiki version.


Outline (https://www.getoutline.com) supports this functionality.


Oh wow, this looks great! I was pleasantly surprised that this is open-source and can be self-hosted too.


Note that Outline's definition of "open source" diverges quite a bit from the one that most people have in mind when they hear the words "open source".

From the text of Outline's license [1]:

    Notice

    The Business Source License (this document, or the “License”) is not an Open
    Source license. However, the Licensed Work will eventually be made available
    under an Open Source License, as stated in this License.
I think it is more accurate to say that Outline will be open source, rather than Outline is Open Source.

[1]: https://raw.githubusercontent.com/outline/outline/develop/LI...


Since you can write arbitrary JS in extensions, there's no reason why that couldn't be implemented by a library.

Since I just got into MediaWiki and wrote my first extension (finally, a dark mode that works), I'll see if this can be implemented. Perhaps with https://www.mediawiki.org/wiki/API:Upload.
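
A rough gadget/extension-side sketch of what that could look like (untested; assumes the mediawiki.api module is loaded so mw.Api is available, and that uploads are enabled on the wiki):

    const api = new mw.Api();
    document.addEventListener('paste', (event) => {
      for (const item of Array.from(event.clipboardData.items)) {
        if (!item.type.startsWith('image/')) continue;
        api.upload(item.getAsFile(), {
          filename: 'Pasted-screenshot-' + Date.now() + '.png',
          ignorewarnings: true
        }).done((result) => {
          // result.upload.filename can then be inserted as [[File:...]] wikitext
        });
      }
    });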


Oh, it does work for MediaWiki, you just need the "Add Image" widget open in the editor.


Confluence supports this, although it's not free!


Not sure if relevant, but Notion does support this feature.


This is exciting. A compelling FOSS alternative to Atlassian Confluence was sorely needed.

MediaWiki has some UX and RBAC challenges that make it difficult to scale to large organizations.


Since it's AGPL it will probably never end up in the same commercial use cases as Confluence does.

Google has its reasoning written down by its legal department: https://opensource.google/docs/using/agpl-policy/

It boils down to 'not worth the risk, do not use'.


OT: as the sole copyright owner of a web application I wrote, can I license it under the AGPL for the general public, and license it under different terms for clients who would prefer not to use AGPL software (and possibly pay for it)? Is dual licensing allowed by the AGPL?


Yes. But keep in mind that the copyright of any contributions you might get will be held by the author of that contribution, not by you. The contributor might license that contribution under the AGPL only, and thereby put a stop to your little scheme. You would have to either get a license from the contributor compatible with your dual-licensing scheme, or simply require all contributors to sign the copyright over to you, which is most commonly done via a so-called Contributor License Agreement (CLA). However, CLAs are rightly disparaged for being unfair, in that they let you, and only you, profit from and monopolize what might become a community effort. A requirement of a CLA will most certainly discourage many, if not most, potential contributors from contributing.


I knew about this; that's why I said "sole copyright owner". The goal is to put the code out there and restrict others from modifying it without publishing their changes to their users. But I want to keep the option of licensing it differently for specific (B2B) clients. It's not true open source / free software in spirit, as I don't plan to accept outside contributions.


You seem to be assuming that you are the only person who will ever contribute meaningfully to the software. But if, as you probably also hope, the software becomes widespread and successful, there will most probably arise at least one popular third-party extension/modification. And you will not be able to incorporate this addition into the product which you distribute to your B2B customers; this will make your version of the software inferior to the free version.


The other option is that you could simply pay community members some fair portion of the B2B revenue in exchange for signing the CLA. If you're the primary contributor, it's fair to take most of the revenue for yourself, but revenue sharing with major contributors from the community is also fair. Could be difficult tax-wise though? IDK.


The most difficult thing might be to have all parties agree to what, exactly, would constitute a fair arrangement. Free software where nobody gets directly paid is one thing which is easy to understand, but if money gets involved, things tend to get ugly, and nothing destroys a budding community faster than bitter infighting.


Don't let the other commenter dissuade you. There are many successful open source AGPL projects that have CLAs which are not widely disparaged. There's no shame in owning your creation, nor is it immoral to ask contributors to assign their copyright to you. To each their own. Good luck!


If you're the sole copyright owner, you can offer as many different licenses as you want, and no copyleft license can interfere with that.


That's not how licensing works. If you write the software, you can release it under any license(s) you want. The license states the terms under which you'll let others use the software.


That's what I thought but wasn't sure. Thank you (and sibling comments) for your comment.


Yes, you can licence it however you want because you are the copyright holder. See mongodb. IANAL. Licences are to restrict those who do not hold copyright.


> Licences are to restrict those who do not hold copyright.

On the contrary, licenses give permission to those who do not hold copyright. Without any license, only the copyright holder has any rights to copy or modify the work in any way (except for fair use). A license gives permissions to non-copyright holders to do things which would otherwise be illegal under copyright law. A license can never restrict what anyone would otherwise be allowed to do, since it is not a contract.


That's not relevant for using wiki.js.


Is the suggestion here that, say I was in the business of selling proprietary software, by running Wiki.js and making it accessible to the public that the copyleft license would then apply to my proprietary software?


If you package it with your software, then that might be the case unless you had negotiated other licencing terms first.

Simply hosting it with your information in would not have any such effect, but many commercial entities avoid anything xGPL just-in-case. In this case perhaps because they see a time that they might later want to package and distribute documentation that is in the wiki without converting it to something else first.

There is an extra concern with AGPL that does not exist with GPL specifically because of its key difference. AGPL applies to hosting the software and making it available not just distributing a compiled form. Some interpret this as meaning that if it is hosted on the same server, or in the same site, as other software then that other software becomes AGPL licensed too. I doubt anyone would enforce this interpretation but the possibility is enough to put off those who create proprietary software.

> making it accessible to the public that the copyleft license would then apply to my proprietary software?

Not just the public. Anyone you give access to; so for non-public hosted proprietary software you could be beholden to giving those users access to the code under the AGPL in situations where the AGPL applies. This will be a complete blocker for many creators of proprietary, or otherwise non-*GPL licensed, software.

[if the above makes me sound against AGPL rest assured that I am not - I in fact might end up using it at least initially (at least until I decide upon which of the more proprietary-friendly options to use) for some near-future projects]


So for complete clarification:

If I hosted Wiki.js, say on a subdomain, unrelated to a commercial application's code at all, but linked to it (e.g. the "docs" section of navigation), would I need to disclose

a) the source code of the wiki.js subdomain only, and any modifications we make to it

b) the source code of wiki.js and the other proprietary app BOTH ?


A, unless you package and distribute the code of wiki.js in some way with the application - but it sounds like you are talking about a hosted only solution so that won't be an issue.

If you allow self-hosting of your proprietary code by your users and it refers to your documentation server, then this would still not trigger the AGPL for the other code - you are not distributing wiki.js or parts of it. If you allow self-hosting of the other code and include wiki.js for a local copy of your documentation, then the AGPL might trigger.

Linking here means software linking, not referencing your own content displayed by the software. Think of it as the difference between including chunks of a paper in your own paper, and saying "as discussed in This Paper About This Thing (Him, Her, et al., 2013)...". Or, as in this case where the content is yours, referencing your own paper.

This potential confusion with the word "linking" (as natural languages are fairly dynamic beasts) is why people are sometimes overly fearful of AGPL and GPL (and often even LGPL) software.


Linking as in binary linking: yes (option b). Linking as in HTML A href link: no, only option a.

The problem is that unless you are installing your servers by hand, you end up re-packaging wiki.js, and then your own packaging code and the modifications to wiki.js in the build artefact are supposed to be public as well.


I don't understand what you mean about packaging code.

Do you mean to distinguish between setting up an EC2 instance or digitalocean droplet type of thing versus using ansible/chef/etc orchestration among the servers? Implying you would have to publish the configs for all your servers?

That would make no sense, both from a practical perspective (publicly disclose server configs? those can include sensitive information), and a theoretical perspective (what does that have to do with the source code at all?).


> large organizations

Interestingly Amazon just finished a multi-year effort to migrate off MediaWiki internally to comply with an infosec mandate that PHP is banned company wide.


Source?


Quick search gives me an HN comment from 2014, without any sources either... https://news.ycombinator.com/item?id=7440811


Why is PHP banned?


"Some"? The fact that you have to manually access the database backend in order to change a password is a joke, among many.


If it's your own password, you can change it from the web interface. If it is someone else's password, you can send a password reset email from the web interface.

Actually setting someone else's password to a specific value does require running a command-line script (not the same as going into the DB). In my view that is a reasonable security-convenience trade-off.

In any case, I would assume a large org would use a single-sign-on extension and not MediaWiki's native user management, which would make MW's password management moot.

Disclaimer: am mediawiki developer


There's a bunch of ways which don't involve manual database access? https://www.mediawiki.org/wiki/Manual:Resetting_passwords
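
For instance, the bundled maintenance script (username and password here are placeholders):

    php maintenance/changePassword.php --user="SomeUser" --password="newSecretValue"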


I'm looking to launch an internal wiki and Wiki.js came out on top for my requirements:

- easy to use for technical and non-technical staff alike: multiple editing options

- third party authentication: really comprehensive offering

- quality search: comprehensive internal and third party search offering

- ease of maintenance: largely everything is built-in, so no module/dependency maintenance headaches

- user management: solid user/group management system

With internal tools you need things to stick, and fast. As much as I am fond of mediawiki, the editing experience is a barrier to usage for many. And the extension ecosystem, while rich and diverse, is just more of a liability than a single installation. A quality search is also really important to adoption, so having options there is great.

I'd been using Docsify on a small scale with authentication through GitLab to edit, GitLab CD to build and Cloudflare Access to secure the front end. It works really well, but the lack of user management and the editing experience mean that it's time to move on.

It would be great to hear if this is a case of the grass always being greener on the other side.


I'm not impressed. Wiki.js is supposedly "built with performance in mind", but its documentation wiki [1] is much slower than any DokuWiki site I could find [2]. It also requires JavaScript to be enabled in the web browser.

[1]: https://docs.requarks.io/

[2]: https://www.dokuwiki.org/


I find it really frustrating that every piece of software nowadays claims to be "blazing fast" or "built for performance", usually with no benchmarks to back it up. Makes it really hard to tell at a glance what the strengths of a project actually are. I honestly would be very grateful if a project up and said "we're not the fastest, but we trade performance for a simpler codebase and easier extensibility. If you need to do some-performance-intensive-task, try other-package instead".


I also don't like their theme choice; in particular, the "Table of Contents" is fixed and wasteful, and combined with the navigation column it uses about 40% of the screen width. I can't concentrate on the content because the other half is distracting.


docs.requarks.io, which is said to be using Wiki.js, straight up doesn't load without Javascript, and even with Javascript enabled it's a multi-page application that just feels slower browsing page to page than your average 10-year-old mediawiki install (probably also heavier on the backend).

Who exactly is asking for slower software?


> Wiki.js, straight up doesn't load without Javascript

I'm sorry but it's right there in the name


Because it uses Node.js in the backend. MediaWiki is written in PHP, but it does not require you to have PHP running in the client to work.

As much as we could argue about whether no-JS support really matters in 2020, the fact remains that having to load Vue.js and have it parse and render the frontend on the client is not really "lightweight", especially when the most popular competing products pre-render on the backend.

It wouldn't honestly be that much of an issue if it was a SPA (and it must do SPA-ish things already if it uses Apollo) so you just load the frontend once and it'd load other pages asynchronously, but nope. Every link is a full page reload, with Vue having to re-do everything every time.

It just reeks of modern tech used in an old fashioned way, which ends up with the performance penalties from both.

Go browse any other wiki, see how much faster and smoother the experience is.


Well, sometimes not being an SPA while still using Vue.js (or any SPA tech) might be a good idea, if it were scoped correctly. Sometimes you need MORE interaction on a specific page. Unfortunately, the whole wiki feels sluggish, as you already said, because they do so much dumb stuff.

I would say using JS without having a no-JS version is OK, if done correctly.


It also requires a browser that supports HTML for some reason, instead of being an API you communicate with through CURL POST requests.


I was hoping for a tty interface over telnet with plain ascii output.


Other technologies are slow and insecure, and I have disabled them on my system. Too much memory usage.

Please ensure that if you mention your service or system on HN it works over telnet and outputs plain ascii.


I was excited for a moment when wiki.js.org rendered properly without Javascript but similarly disappointed when docs.requarks.io only showed a fairly typical white page.

I also noticed that it feels slow page to page. Thinking it might be an issue with my Firefox configuration, I opened the page in a fresh profile and it's still slow; but if you open it in Chrome, page-to-page navigation becomes almost instant and is more comparable to MediaWiki. So maybe this particular performance issue on Firefox can be resolved, but it does seem like a worse end-user experience compared to MediaWiki.


Yeah, I've noticed that in Firefox, Wiki.js appears to blank between each page (this doesn't happen for me in other browsers). It is very disruptive and has prevented me from taking further interest in the platform, sadly.


Was excited about seeing an open alternative to Gitbook.

But yes, not loading without Javascript is a showstopper.


> Running on the blazing fast Node.js engine

Is Node.js that blazing-fast ?


It certainly can be, depending on what you compare it against and how you measure.

It is based on V8 which in many (caveat: far from all) benchmarks comes out as the fastest JS engine, so by that definition in the realm of JS powered components it is pretty quick.

node.js solutions will often perform better than common configurations of other options too - Apache+PHP to pick one example out of the air. Then again, depending on the code other configurations of PHP might outperform Node.


Surprisingly, yes. It has had billions of dollars of optimizations and research poured into it over the years and can really fly. Of course you can write slow Node code just like you can in any language, but a single well-written Node instance can handle a lot of traffic.


Billions of dollars? I have no beef with Node, but that seems like a lot. Could you elaborate?


We needed a documentation solution at work. My coworker had some experience with Wiki.js. What sold me on it was that you can use markdown and it can keep itself synced with a git repo.

This gives us plain text files that are tracked in a repo. It uses the user as the author, so now I can "code review" edits to our wiki.

The content of the wiki is easily cloned by cloning the git repo. It is markdown in folders, so if Wiki.js dies at some point I could write a pandoc script to turn it into web pages again, though you do lose all of the cool UI features.
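
Something like this would do for the pandoc part (a rough sketch; pandoc's -s flag emits standalone HTML pages, styling is up to you):

    # convert every markdown page in the cloned repo into a standalone HTML page
    find . -name '*.md' | while read -r f; do
        pandoc -s "$f" -o "${f%.md}.html"
    done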


I used a Wiki for a long time. But I try to minimise maintenance ("foist it upon others"). I also try to resist the enthusiasm for Rube Goldberg machines and for installing bad tooling (such as PHP).

Now that git has become ubiquitous, I prefer git with a self-hosted git-daemon instance. git, grep, awk, and sqlite make a strong set of tools for knowledge curation.
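
A rough sketch of that kind of setup (paths are placeholders):

    git daemon --reuseaddr --export-all --base-path=/srv/notes &   # read-only git:// access to the note repos
    grep -rin "some topic" /srv/notes/wiki/                        # full-text search across the notes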

edit: minor grammar fix


Unnecessarily bashing PHP is soo 2005. Like javascript, it can be written poorly due to its loose roots. Also just like javascript, it is a very different language now.

That said, DokuWiki is pretty decent to get up and running quickly.


I use PHP in my day job. It is by far the worst language I've ever used. Nothing else even comes close. The language and ecosystem are so full of footguns that you are bound to shoot yourself eventually. The OpenSSL implementation will silently truncate the key [1][2] without even giving a warning. The cURL lib, in 2020, still hasn't implemented a get_curl_opt function. Sure you could wrap it if you're writing everything, but the reality is I have to work in this nightmare ecosystem that just uses raw curl. Every == comparison is still a potential security hole due to PHP's insane (and inconsistent) typecasting behaviors. Sometimes a number gets cast to a string, but a string gets cast to a number if you use it as an array key. WTF? Do I have to wait another 15 years for PHP to become a halfway decent language?

[1] https://github.com/WP2Static/wp2static/pull/506

[2] https://stackoverflow.com/questions/55062897/decrypt-aes256-...


> The language and ecosystem are so full of footguns

"footgun" is an excellent term. Here's an earlier use on HN, 2010:

https://news.ycombinator.com/item?id=1904960


I've wondered whether the term directly evolved from this sort of joke list that was popular to pass around on the early internet: http://www.personal.psu.edu/sxt104/program1.html


I assume it evolved not long after guns existed, and likely grew in the military :)


"Shooting yourself in the foot," sure, but if you look up "footgun" it seems to only be programming slang. What I was suggesting with the above comment was that this sort of list might have popularized the metaphor of shooting yourself in the foot in programming, which was then was subjected to hacker-style word manipulation.


Ah fair, yep, that's probably right.


> Like javascript, it can be written poorly due to its loose roots.

I understand that bad code can be expressed in every language. But there is bad tooling too.

PHP clearly has a lucrative place in the world. But it remains a significant threat vector.

Yes, even in 202x. I leave others to discuss why this is the case. I won't install PHP on a workstation just to run a Wiki. ^_^


I wonder why it's AGPL and not dual-licensed or some different GPL. As it is right now it's dead in the water for any commercial usage unless you're manually installing the thing on a manually installed server somewhere (which you probably aren't).

With automation you'd build images based on their images but run via your own CI/CD with your own security scans and any additions you might need (like additional logging infrastructure). Doing that is not possible with AGPL.


At least for me, the fact that it is purely licensed under the AGPL and that the copyright is owned by multiple people makes me far more comfortable with using it. It's a guarantee that the project will remain open source so I don't have to worry about suddenly being in a situation where I have to migrate away because the company or person decided that they don't want to have this be free and open source software anymore.

I guess, to a certain extent, that's because I'm an individual, not a company, and one that tends to open source pretty much everything they write. This is the same licensing that I use for pretty much all my projects (AGPL with no CLA).


> It's a guarantee that the project will remain open source

What are you talking about? They can change the license to a closed one from a certain version in the future.


> They can change the license to a closed one from a certain version in the future.

You're right if and only if by "they" you mean every copyright holder whose contributions would exist in the future version (including, say, the contributions of the very person you're responding to). But if by "they" you mean the project leaders acting without the cooperation of everyone who holds copyright, then that's a no.


It's easy to tell it's almost one person's work if you look at the level of integrity the software has.

https://github.com/Requarks/wiki/graphs/contributors

Main guy commiting 600k lines and the second most committed guy 450 lines.

So yeah, it wouldn't take him a whole lot of time if he really wanted to change the license by removing all the others' commits and rewriting it by himself.

Also, what does the AGPL have to do with keeping the project open source?


> Main guy commiting 600k lines and the second most committed guy 450 lines.

> So yeah, it wouldn't take him a whole lot of time if he really wanted to change the license by removing all the others' commits and rewriting it by himself.

Good point. I may try to get involved in its development as well to spread that out a little more.

> Also, what does the AGPL have to do with keeping the project open source?

The AGPL requires that any code linked to it also be distributed with an AGPL license (the difference from the GPL being that hosting over a network counts as distribution). Every part of the project is technically linked to itself so if someone makes a change, no one else can use that change in the project under a different license.


For future versions, not past versions.


I don't think any of the mainstream open-source licenses allow you to retroactively revoke or change the license.


If they are the sole copyright owners (no external contribution) or have CLAs, they can for any future version of the software. It is not uncommon; it is just hard, as most projects don't have a CLA that allows this.


> If they are the sole copyright owners (no external contribution)

That's not the scenario gary-kim laid out; you've failed to satisfy the constraints in the premise.


Read better, I'm not replying to him. I agree with him.


Reply better.

I know who you're replying to. The premise that gary-kim laid out is still the relevant context. The hypothetical you're laying out, on the other hand, is not relevant, it's at odds with that premise (not in "agree[ment] with him"), and it's derailing the thread. (Which is the same reason your "future versions, not past versions" is downvoted, for that matter.)


Right. _they_ were not meant to be Wiki.js, but ANY open source project. And it was meant to indicate the same subject as _any_ in this reply:

> I don't think _any_ of the mainstream open-source licenses allow you to retroactively revoke or change the license.

My reply is in this context, not in the parent's parent which you mean as _relevant_ context. If I wanted to include the parent context the reply would be more specific. This was a direct reply to a specific message. This is a very normal way to reply on the internet, HN is not special.


Try rewriting history if you want, but the thread goes off-topic as soon as mekster suggests the project change the license, and your reply there only feeds into it. And it still doesn't explain how you can claim that your comment was meant to "agree" with gary-kim's.

> My reply is in this context, not in the parent's parent

The parent's parent at that point is... your comment, "For future versions, not past versions," which was off-topic.

> This is a very normal way to reply on the internet

Indeed, it's common for people to lose the plot in the comments section and then get defensive (and smug) while being wrong, e.g.:

https://news.ycombinator.com/item?id=23250829


> mekster suggests the project change the license

When have I suggested the project change the license??? I only said it is possible.


[flagged]


That's HN's anti-flamewar mechanism kicking in to get people to slow down and stop firing off comments from the hip.


Not sure why I'm getting downvoted, but this does not change the fact that yes, licenses can be changed. You disliking it does not make it wrong.


> With automation you'd build images based on their images but run via your own CI/CD with your own security scans and any additions you might need (like additional logging infrastructure). Doing that is not possible with AGPL.

None of that is impossible with AGPL.


> As it is right now it's dead in the water for any commercial usage

So what? If companies need a certain software, they can pay for it. I remember a time when FOSS was not about providing companies with free work, quite the opposite indeed.


It is presented as being enterprise-ish, for organisations (of which a lot are commercial) and presents features usually useful for large companies. That's what.

This isn't about good/bad or something like that, just an odd presentation that doesn't seem to be in line with the license. There is nobody to pay here to use this stuff because you still won't be able to integrate it without also sharing internal IP.

There are plenty of organisations that would happily pay what they'd normally pay Atlassian to use Wiki.js but they can't because they don't want to share any of their own code. This is also why license guides like the one from google explicitly bans all AGPL software because it's not worth the risk.


So it's working as intended then.

It's a bit weird to comment on this as if it's an oversight or unintended downside. Suppose you keep going into someone's house and they don't want you to, so they do something to dissuade you (like putting locks on their doors). You then complain that you can't get in. Their likely response? "Well, yeah..."


These companies make changes to Atlassian code? You're conflating internal or public use with derivative works or service offerings. You clearly misunderstand licensing.


I'd at least expect a demo, and the homepage to be running on said wiki.


There's a link to a demo in their README: https://docs.requarks.io/demo


Their demo link there didn't work when I tried it, it's looking for the non-existent master branch. The docs site itself (https://docs.requarks.io/) is running on Wiki.js, though.


I've been using Wiki.js for several months now on a production project (self hosted). It has worked nearly flawlessly for me so far. No complaints. Setup was a piece of cake too.


Just a minor nitpick. If there isn't an actual file called "wiki.js" that is self-contained, I would prefer it be called WikiJS instead of "Wiki.js" to avoid confusion. In general when I see ".js" I expect to see a single file I can import that does something useful to my code.


Can I ask what exactly these numbers are: "15M+ Installations" on your home page?

I mean, that's a very large number. Is this for all of the open-source installs, or am I misunderstanding something else?


I'm assuming that's for every single download, including testing. I know that when I use a service for the first time, I do a few installs while getting used to the software and its configuration. I'd assume they'd have no other way to verify installs (assuming there is no telemetry).


Wikis are also useful for note-taking, I'm using Wiki.js to document a D&D campaign to have some canonical reference of what actually happened in past sessions.


Same here, it's been a great asset to have available to document stuff in my homebrew world and have it reference other stuff. Being able to link my wiki out to my players for their own use is very handy.


Why did you pick Wiki.js for campaign management as opposed to something like Notion?


Big fan of Notion here for documentation, task management, and general brain dumps. The only worry I have is that one day they go away and I have to migrate off; how much of a pain would that probably be? It feels locked-in, but maybe I am wrong.


I keep a v2 instance running on my Windows laptop solely for taking notes and keeping need-it-eventually information organized.

It's just enough added structure and functionality to make the whole body of notes more useful, without having to learn a formal system or adopt someone else's idea of what my note hierarchy should look like.


Can we please not have SPAs eat wikis, too? Text-only content does not need... (checking...) 6.3 MB of JavaScript to display (checking...) 3.3 KB of text. Blank pages with JavaScript disabled, or in non-mainstream browsers, are a really terrible experience for content so plainly simple to display.


Wikis also don't need (or indeed, even permit) cumbersome Git and PR-based workflows just to get changes into the "wiki". Better for it to be a single-page app that actually implements a wiki, than to provide a service that doesn't actually support wikis but has no qualms about throwing the word around anyway.


Is it possible to output a static-file-based wiki? (i.e. some static HTML/CSS/JS?)


I've used the 1.x release, and the Mongo requirement was always a bit of a pain. I think v2 fixes that, but I haven't yet upgraded. Does anyone have feedback on v1 vs v2?


Looks like they stopped using MongoDB: https://blog.requarks.io/the-switch-to-rdbms/


I was looking into open source knowledge / wiki base solutions recently, and I found https://www.getoutline.com/ to be the most usable.


10 bucks a month though...


For the hosted version.

You can host your own for free: https://github.com/outline/outline


then you have to pay your host


I've been using 2.x since Jan and have really liked it. I'm using it in docker with postgres iirc for a small team for infrastructure documentation. Very markdown friendly and gets the job done while looking nice.


In case anyone was wondering, the dependencies are:

    Node.js 10.12 or later

    MySQL, MariaDB, PostgreSQL, MSSQL or SQLite3
Is it possible to install and run all of these as a non-root user?


Even better, you can run these in Docker as non-root. Security wise, while this wouldn't make your app itself more secure, it would insulate your host OS from getting infected. I just checked and they even have one-liner Docker commands that do just this:

    docker run -d -p 8080:3000 --name wiki --restart unless-stopped -e "DB_TYPE=postgres" -e "DB_HOST=db" -e "DB_PORT=5432" -e "DB_USER=wikijs" -e "DB_PASS=wikijsrocks" -e "DB_NAME=wiki" requarks/wiki:2
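
If you also want the process inside the container to drop root, docker run's --user flag is the usual way (the uid:gid below is just an example, and whether the image tolerates an arbitrary uid is worth testing):

    docker run -d -p 8080:3000 --user 1000:1000 --name wiki ... requarks/wiki:2   # same DB_* flags as above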


Absolutely. For Node, check nvm : https://github.com/nvm-sh/nvm

And choose SQLite.
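
For example, once nvm itself is installed (everything lands under ~/.nvm, so no root is needed):

    nvm install --lts    # downloads a Node.js build into ~/.nvm
    nvm use --lts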


I think you only need 1 DB, not all of them.


I notice a common theme: this wiki and Outline both have a 3-pane window layout. Is there any Bootstrap template for it?


I like the way the documentation is laid out, does that conform to a standard?


Does anyone have a comparison to BookStack? (www.bookstackapp.com)


For a personal wiki, nothing beats the simplicity of TiddlyWiki.


I love this platform; I will suggest it to all my friends and clients :)


Why is there a Linux Tux logo next to macOS?



Slightly related PSA:

Everyone should consider running a wiki locally just for yourself. It's like being able to organize your brain. I just got into it two days ago and basically spent the whole weekend dumping things into it in a way I can actually browse and revisit, like the short stories I'd written, spread out across Notes.app and random folders.

You don't need to run WAMP, MySQL, Apache, phpmyadmin or anything. Here are the steps for someone, like me, who hadn't checked in a while:

0. `$ brew install php` (or equiv for your OS)

1. Download the wiki folder and `cd` into it

2. `$ php -S localhost:3000`

3. Visit http://localhost:3000/install.php in your browser

I tried DokuWiki at first (has flat file db which is cool). It's simpler, but I ended up going with MediaWiki which is more powerful, and aside from Wikipedia using it, I noticed most big wikis I use also use it (https://en.uesp.net/wiki/Main_Page). MediaWiki lets you choose Sqlite as an option, so I have one big wiki/ folder sitting in my Dropbox folder symlinked into my iCloud folder and local fs.

Really changing my life right now. The problem with most apps is that they just become append-only dumping grounds where your only organizational power is to, what, create yet another tag?

My advice is to just look for the text files scattered around your computer and note-taking apps and move them into wiki pages. As you make progress, you will notice natural categories/namespaces emerging.

I just wish I started 10 years ago.


I did start similar things over 10 years ago. Where I am these days is just text files (markdown) nested into folder structures. I've found this the most sustainable over quite a few years and it's been super useful. The main thing is: do whatever, as long as you find it easy to sustain.


This is what finally replaced Google Keep for my shopping list and then eventually everything else. I use Markor and Syncthing on my phone, and a standard text editor on my various computers. It is super nice especially to be able to organize the directory using standard file management tools, search using grep and friends, etc. There is something to be said for simplicity.


+1 for Markor, also available on fdroid


I use markdown too, in a git repo - mainly so I can access them from anywhere, but versioning is occasionally useful too.


Exactly, I wanted something more structured. The filesystem doesn't give me any organizational tools that I want.

My point is to show how easy it is to set up (I used to always equate PHP with having to get a whole WAMP stack online) and thus how easy it is to try for yourself.


If you use vscode you should look into the dendron add-on.


> I tried DokuWiki at first (has flat file db which is cool). It's simpler, but I ended up going with MediaWiki which is more powerful, and aside from Wikipedia using it, I noticed most big wikis I use also use it (https://en.uesp.net/wiki/Main_Page). MediaWiki lets you choose Sqlite as an option, so I have one big wiki/ folder sitting in my Dropbox folder symlinked into my iCloud folder and local fs.

Or you can just use Zim, which is a cross-platform desktop app that does not need any setup and simply saves files as text files in markdown: https://zim-wiki.org


There definitely are a lot of options to check out (one of the reasons I went with MediaWiki). I recommend playing with them to get a feel for where they diverge from one another. You may find features in some that you decide are necessary for you that a marketing list of bullet points can't get across.


> I just got into it two days ago

This is the rub. I started a TiddlyWiki last year and stuck with it for several months, but now it has fallen by the wayside as too cumbersome.


This is the problem with a good percentage of advice and life/career hacks on HN -- they're always spoken of in the early, idealist stage. Before the reality has set in. Before the downsides have made their presence known.

I've been in a number of firms with wiki knowledge systems. In 100% of the cases it was a wasteland of derelict knowledge that had been abandoned and was usually much more destructive than beneficial.

No one was going to undertake the process of keeping it up to date, and at the same time the emergent organization/structure of information was constantly evolving, and wikis are terrible at evolving with that unless you literally have people whose sole job is making templates deciding on the ontology, etc.

Similarly, countless people have tried to organize their lives into tools like wiki. And in the early days it seems magical. I suspect the failure rate would be somewhere barely under 100% at the one month mark.


I don't really understand your takeaway except for a cynical "bah, nothing ever works so don't try." Or maybe it annoys you that you didn't see some trite, platitudinal disclaimer in my post about how life is all about trade-offs and what works for Bob might not work for Alice.

It's like you're about to tell me that exercising doesn't pay off because it's hard to stick with a strategy. "Heh, let's see if he's still doing pushups in a year."

You don't seem to realize you're just describing literally all systems. How organized is everyone's filesystem and ~/Documents folder? It's pure chaos with the only sweet release being that you might not carry it over when you upgrade computers and get to start from scratch.

Will I be maintaining my localhost wiki in a year? I don't know. But it's worth a shot. After two days it's already 1000x more organized than even my best efforts so far.

Is it for everyone? Nothing is for everyone.

But your comment seems to suggest that you think the alternative to <organization strategy> is organized data which obviously isn't the case.

What you will realize is that there is no perfect one-size-fits-all strategy. All you can do is try things and see if they work for you, and see if you stick with them years later.

So, for today, I recommend trying some localhost wiki options in your battle against chaos. If it doesn't work for you, so what?


The point isn’t to be cynical and say nothing works, it’s to look at what works over the long term (at least >2 years).

The only thing I’ve ever heard of working on this time scale is plain text files. Maybe with some tool over the top to make it easier to manage than just with an editor, but in the end just plain text.


The skepticism is not unwarranted. Borrowing your exercise example, it would be like saying you do X pushups every day and it changed your life, and then saying you've been doing that for a couple days. Wikis are infamous for not working out on the long term.


I have 10 years experience of editing wikis. I just never thought to start one for myself until two days ago.

If there's something so fundamentally different about running a wiki for yourself on localhost vs a collaboration like Wikipedia or UESP, then why not put some skin into the game and make that point? That sounds like an interesting topic.

I don't even understand the "skepticism". MediaWiki is one of the ubiquitously used platforms in the world via Wikipedia. Nothing about my post hinges on you taking my word, the point was that it will take you a few minutes to get it running yourself, so just try it.

Sure, maybe I came off a little strong by saying that it changed my life. But I have enough life experience to realize when I've encountered something big for me. And being able to organize some of my "lost causes" in a couple days has already made an impact on my daily workflow in a massive way, like for the first time in my life, I feel like I have a grip on my digital existence. I could go into more detail if anyone actually cared the same way I could tell you how moving to Mexico City changed my life after just two days.

Maybe you would walk away from that convo saying that my bar is too low to be using that phrase. Fair enough. But logging into a throwaway to trash it with adolescent glee is a practice in the least charitable interpretation, not someone who wants to have an honest conversation. "Um, there's no way that Mexico City changed your life in just two days" just seems like a nonstarter to me, and rather combative.

If you're skeptical, why not ask how it supposedly changed my life, and we go from there? I glossed over the details of that in my post because, well, that wasn't the point of my post. I just kinda reject this modern attack-dog culture on the internet where you supposedly have to couch everything you say in a front-loaded defense lest someone finds a way to attack you for it instead of probing for more info on it before reaching their conclusion, especially when it's a negative one.

My MediaWiki folder is almost 100gb large and I've been putting a lot of work into regaining control of everything I've built, digitally, in 10 or more years. Yes, it has already changed my life. Though this thread is already far too derailed with walls of text to have that convo here, I think.

Just try it and make your own decision. That was my point from the very first post.



>"I don't really understand your take-away except for a cynical "bah, nothing ever works so don't try." Or maybe it annoys you that you didn't see some trite platitudinal disclaimer in my post about how how life is all about trade-offs and what works for Bob might work for Alice."

The take away is clearly "things that work in the small often don't work in the large. If you're taking life advice, take it from someone who has done something for a while and had a reasonable experience.".

Someone crowing about their two day experience is...well...

This applies to all sorts of similar enthusiastic advice. Intermittent napping. Standing desks. The Dvorak keyboard. Drinking your coffee with butter. New diets. Going without the internet. Meditating. Taking up karate. Working in parks. Whatever. There's a lot of wisdom and knowledge among them, but it isn't coming from the guy who just started.

If someone is giving a sales pitch for a lifestyle change based upon a tiny experience, they are often doing it because they think converting others makes it more real/more likely to yield the change they want. It doesn't work that way.

Try things. Try lots of things. Save evangelizing until you maybe have a real experience?


Ah, so worthless cynicism and condescension after all. I see why you made a novelty account for this. But I wonder if you see the irony in promoting yourself into this role of inquisitor?

I get it, you're mad that I admitted I only have two days of experience while telling people how to get started in one of the most important software platforms in the world (it's how Wikipedia works) -- I'm not exactly making new sounds on this. Maybe you're fine with my install instructions, but I went too far (for your tastes) when I said it was promising so far. And you thought this behavior needed to be called out by your "Dunning-Kruger in the wild" meme account.

I don't think I've misinterpreted the situation, I just find it a bit sad and I wonder how much you think you've added to the discussion. That you run that account, you must think: "quite a bit!" I'll leave that one up to the audience.


>"Ah, so worthless cynicism and condescension after all. I see why you made a novelty account for this. But I wonder if you see the irony in playing this sort of inquisitor?"

You are irrationally hostile. My comments are not for your edification or service, and this is a shared platform where multiple people are reading and deriving value, each comment kicking off different thoughts and conversations. If you take this so incredibly personally, that's a you problem.

Your internalization and repeated attacks are bizarre. But you do you.


I agree with this for the most part.

I've thought about knowledge management a lot over the last 20 years, since I built a Wiki/bug tracker system (this was before anything except Bugzilla existed).

I think knowledge management systems can work if the "management" side is a side effect of their use.


Absolutely, if management is entirely invested in it, and it becomes a mandatory part of the process (e.g. updating the wiki is a part of a release or new build), it can be a critical resource. The problem is that the organic nature of wikis leads people to believe that it will just emerge, in the same way that the relatively unstructured Wikipedia eventually became a critical resource.

But citing wikipedia is often folly. The man hour to output ratio of wikipedia is absolutely enormous. It is an extraordinarily inefficient process that works because there are millions of people moving, structuring, contributing, making templates, rewording, reorganizing, etc. Eventually greatness emerged.


I agree. The biggest difference that came to mind when I read your comment is that a personal wiki doesn't have the economy of scale that Wikipedia has. With Wikipedia, the effort of hundreds to organize everything can benefit millions. With a personal wiki, the dynamic is different. I 100% agree that there's a real risk of a 'honeymoon period' giving a false sense of ROI, and I've fallen victim to early enthusiasm about various organization strategies that ended up not lasting more than a month. I've resorted to a to-do list and a chronological work log as mechanisms that require little organizational investment but yield many of the benefits of a more sophisticated system.


I ran my own personal wiki in like 2004 - 2008. I'm not sure too cumbersome is the word I'd use. I just never found it that useful.

Today, though, there are Google Docs, Keep, Notes, GitHub Gists, GitHub itself, and many other places where I can easily store notes and access them from anywhere. No reason to set up a wiki and have to maintain it myself.


For me, the problem after 10+ years is how disjointed all of the data becomes and how much data you generate across the web. I have stuff that means something to me scattered across every service and every account.

When you start to plan how to move all of your stuff under one umbrella, the solution starts to sound a lot more like a wiki on paper, I think. Even if you move all this stuff to your filesystem, I think you still need a layer over it to manage it all -- or at least I did.

Of course, it's not the only answer. And I admit I have been contributing to wikis like Wikipedia and UESP for a decade now and the jump to a personal wiki was a no brainer.

But I wonder, what solution would you consider for this "disjointed data" problem? Do you just not see it as a problem? One of the first things I did when I stood up a personal wiki was to log into ancient google accounts to exfiltrate ancient google docs that I'm glad I found again.


I know what you mean. I've made comments over the years to Slashdot, Soylent News, HackerNews, Disqus, WordPress blogs, Wikis, various other special sites, and so on and would like to see that all together and be archived -- especially when sites disappear. It's sad that people made all these centralized web services (often to make money by getting between people and their data) and thus displaced a lot of email instead of people making email better. There is a lot to be said for a local email system like Thunderbird as a knowledge base that goes back for decades. That is true even if email tools could be better if they were more generalized or if they had easier ways to publish stuff from email to the web and ingest stuff from the web back into to the local system. Some related ideas by me from 2015 which I am still working towards on-and-off in my spare time: https://pdfernhout.net/thunderbirds-are-grow-manifesto.html

My latest experiment (in Mithril/HyperScript/Tachyons/Node.js) integrates a file browser, markdown viewer and editor, and email viewer (although it is all still very rough): https://github.com/pdfernhout/Twirlip15

But ultimately what we probably need more than tools are simple and popular standards for encoding information that can be linked together. Email (in MIME format) is one such standard but it is fairly complex. Maybe a JSON schema or RDF schema for linked information might help with that. Or something like tags or RDF triples embedded in Markdown -- something I started playing with in Twirlip15 (inspired in part by Foam).

Code for a foam-like "Ideas" app using markdown, triple parsing, and Cytoscape: https://github.com/pdfernhout/Twirlip15/blob/aa75ed1be5dc4a7...

And one example file: https://github.com/pdfernhout/Twirlip15/blob/792b067c30c7846...


Btw, TiddlyWiki is very, very simplistic and limited. I've used it for years for throwaway note-taking yet have a hard time equating it to the other "full" wikis.

It just became an append-only log for me with very limited organizational power. Though I do like it for anything just long enough where a single .txt file doesn't cut it. Tiddly is great for that case because it encapsulates the common task of jumping between the same sections over and over -- the real downside of a large file. But you aren't alone in finding it's not so great on a larger scale.

So if you did like the idea of a wiki but weren't diddly with the Tiddly, might be worthwhile to check out something like DokuWiki or MediaWiki.


https://tiddlyroam.org/ is another alternative, but Tiddly* kind of feels like a dead end. The big upside is the lack of dependencies (just a file), but when the wiki grows you soon have a 70 MB html file that kills your browser. I know there's the node.js version too, but then I might as well use MediaWiki (even more portable).


I'm getting back into the Zettelkasten note-taking technique which is like a tech-agnostic wiki system (it was originally implemented using physical cards). I originally tried a personal wiki but it was just more tech overhead than I cared to deal with. Plain markdown files with zettelkasten are doing it for me now. Zettlr adds some nice features like adding an easy ability to tag files (just hashtags in the markdown) and search files by tags. It all feels more freeform/lightweight than a wiki server.


I've been running wikis for the last 20 years, but I too use static files for note-taking; a wiki is not for everything IMHO. Zettelkasten just feels like a low-tech wiki to me. Is it really that great, or is it just the simplicity that makes you a convert?


I admittedly have not been using it long enough for it to come into its own (when the connections should start to do more heavy lifting). But my impression so far is that the "low tech wiki" description is essentially accurate, although since I keep all my files in one folder, I find it easier to get an overview of what topics I've already covered. I guess there's probably a page for this in most wiki systems as well, but I'm not a wiki power-user.

The reason I'm a convert is that it seems like the best of both worlds between raw note-taking and a wiki. The advantage over raw note-taking is the links that enable you to "crawl" its entirety. The advantage over a wiki is that it's tech-agnostic and you can do it however works best for you.

On the other hand, a wiki may be better for someone who wanted to embed media in their notes (such as audio recordings).


I started writing a master's thesis in a MediaWiki a long time ago. It did not work for me, maybe for lack of keeping a proper index. It was a data graveyard. On the other hand, so is my whole home folder...


I've been using https://www.zettlr.com/ for a while and can't live without it.

I created a mashup of Zettelkasten + bullet journaling + a linking system based on tagging and IDs that models the fact that knowledge is both hierarchical and associative - i.e. fractal.


Just to throw my hat in the ring here:

My co-founder and I have been building a hosted version of this[1] for the last two years, because we recognized that while self-hosted wikis work great for techie people, there are a lot of other people whom that label doesn't fit.

So we've been working to create a collaborative knowledge-base platform built around some key concepts:

1. Built around cards rather than documents, which allows for a lot of interesting and flexible features. Such as...

2. Granular sharing – on Supernotes, you can share an entire collection of cards, or you can share one card at a time. We also have recently introduced[2] a "friends" feature that allows you to quickly drag-and-drop cards onto your friends to share with them.

3. Multi-parent nesting – there is no folder-style filesystem on Supernotes, we allow you to nest cards inside of each other. On top of this, we allow for this nesting to be multi-parent, so different users can fit the same cards into their own unique structure (effectively a collaborative / personalized version of symbolic links).

4. Public vs. private tags – cards can be tagged with public tags that everyone sees, but can also be tagged privately with only tags that you can see. This same idea is reflected across the platform, where we want the underlying content to be the same for everyone but want to allow users to personalize the metadata/structure to suit their own workflow.

5. Focus on speed – we have spent a lot of time making Supernotes speedy quick, and try to make it faster every time we release a new feature.

Anyway that is the rough idea. The goal of Supernotes is to be a sort of data-layer where you can keep all these compartmentalized pieces of content (as cards) and then mix-and-match at will to create very simple or very complex stores of knowledge. We also want you to be able to embed these pieces of content elsewhere (say in a Notion document or on your blog) with as little effort as possible (not quite there yet, but will be soon).

[1] https://supernotes.app/

[2] https://supernotes.app/changelog


I like the points you mention here - I took a look at your site and was glad to see pricing is easy to find. I myself want to self-host, as I don't trust the cloud - but if I were to try your version of this thing, I would lean toward the $300 one-time payment, as the free tier (20 notes) wouldn't be worth the time - and I hate monthly payments and non-owned code.

I did a scan of the FAQ and ended up on the docs, where I searched for export. I was pleasantly impressed with the entry, which showed an export option along with text and videos showing how to do this.

For me, I probably won't be spending $300 on this when I can wiki or WordPress for free... but if I weren't so jaded about SaaS and the cloud, I would be persuaded to check out your thing if you had something like 'export, backup' on the front page - bonus if it was 'import/export markdown or similar files'.

I'd feel less worried about vendor lockin, holding data hostage, what happens when you go bankrupt, etc

The heading fonts on your privacy page are a little wonky in my browser (Firefox). Also, given that you are UK-based and sharing data with the EU and outside the EU, I'd only save info if it were encrypted... not sure if that is a thing, but if so I would make 'privacy built in' a big thing on the front page.

Just my two cents, trying to help. I'm sure 98% of those who may use your service are not as sensitive to the same things I am - so this is not a critique saying it's bad, just some random thoughts as I took a look.


Thanks for taking the time to write down your thoughts!

Data ownership is pretty important to us, even though we are only offering a hosted solution, which is why we explicitly say as much in our T&Cs[1]. But yep, we want to make export / backup of your data as easy as humanly possible. The hard part generally is that there are a number of features that exist on Supernotes which just don't exist elsewhere, so even when you use the export feature it is hard to guarantee we can export it in a format that is useful to you.

That is part of the reason we are doing our best to openly document our API[2] so that you can interact with your own content in whatever way you wish (including importing content from wherever or exporting to wherever). Obviously this requires some coding, but we're hoping the community[3] will share any tools they build on top of the API with each other.

Unfortunately E2EE is not quite there yet, as it makes it much more difficult to facilitate sharing, and it's also a bit of a problem for a knowledge base if a user loses their private keys and you have to tell them "sorry, we can't get your content back – it's all gone". But this is definitely something we are working towards – it just takes some time to nail the UX. Since we are definitely never going to sell your data or anything (as per the T&Cs), it's actually better for us if it's E2EE, as then it's just one less liability for us from a data protection perspective.

[1] https://supernotes.app/terms/

[2] https://api.supernotes.app/docs/

[3] https://community.supernotes.app/


I've been using Obsidian for note taking recently, and as much as I really enjoy it - having hypermedia would be quite useful too. I can paste images in Obsidian, but it tends to put the pasted image in the root folder, and I can't display it inline with my notes.

EDIT: VisualEditor, the de facto standard for pasting things like screenshots into your articles, seems to be a pain to install. Got my local env up and running, though. Will report back on success with this extension.


I may have misunderstood, but I'm using images in Obsidian without any trouble. Create a folder inside Obsidian, right click it, choose "Use as attachment folder" and then drag and drop / paste images into your documents however you like. It saves to this folder and automatically generates the markup when you put an image in.


TIL. Just tested this out with JPGs, PDFs, text files and screenshots.

Given the complexity of setting up MediaWiki properly, I think I'm going to keep using Obsidian.


I mentioned it elsewhere[1], but conveniently VisualEditor should be getting easier to install in an upcoming Mediawiki release (probably 1.35, which is coming next), because of a bunch of parser centralization work. Fingers crossed, but the hassle that you just went through shouldn't be needed soon.

[1]: https://news.ycombinator.com/item?id=23907291


For those of you looking for an update - I gave up. The steps for installing the extension were long and confusing on Mac, and then I found out Obsidian can use different folders for inline items. I wish you could preview it within the page, but that's OK.


I use Zim for this, backed by Dropbox. It's just text files, and Zim is just an editor, not a server or anything like that.

If I ever tire of keeping a personal wiki for whatever reason, all of the content I've built up in it will remain organized as files within directories.


Came here to say this. I've been using Zim for a little over 2 years, now, and it's become an every-day planning/journaling/notebook/project habit. Lightweight and doesn't get in your face with complexities or demands.

I highly recommend the 'Backlinks' plugin to improve the wiki functionality; leaves Roam standing in the dust for personal use.


+1 for Zim+Syncthing, been using it for almost 2 years now. The only gripes I might have are the near-useless search and the lack of proper support for code blocks.


I almost went with MediaWiki, but ended up with DokuWiki. The fresh install of MediaWiki is 154 MB (!) and it's not exactly lightweight. DokuWiki is 10.9 MB and all content is saved in plain text files. Very attractive.

However, backlinks are not possible without hacks. A wiki without backlinks is kind of lame and I could very well use my good old plain text files.

Have you run into trouble when updating MediaWiki, or is it smooth sailing? SQLite is not mentioned here: https://www.mediawiki.org/wiki/Download


What do you mean by backlinks are not possible? Do you mean in DokuWiki? Each page has a "What links here" button that shows backlinks - that feature is builtin.


I also wanted to get into creating a local 'wiki' or knowledge base, but sadly I haven't hit the sweet spot yet. My requirements are:

- Future-proof (at least not just a one-man project)
- Fast search over all information
- Fast creation of quick notes (inbox)
- Mobile iOS client

Currently I am stuck with Notion, which has a great 'database' concept that is fun to use. Sadly it's too slow. If I want to take a quick note on the go ("Google for M6x40 screws"), it takes me 10-20 seconds with Notion.

I don't even mind paying for such service...


MediaWiki is certainly future proof and has lots of inertia, including a large API surface area, a large ecosystem, and the backing of a deep-pocketed benefactor. An extremely underrated/largely unknown project is Miraheze[0] which is a non-profit that provides free MediaWiki instances to people.

I wish proper wikis hadn't gone by the wayside. (I think it has a lot to do with MediaWiki's default skin being out of style, and people not realizing they can change it.) Most of all, I wish open source projects would stop dumping a bunch of Markdown in a repo somewhere and calling it a "wiki". They're not even close to comparable.

My two biggest complaints about MediaWiki are 1. PHP, and 2. no well-supported way to opt in to a different syntax like Markdown or AsciiDoc or pretty much anything that isn't MediaWiki-flavored wikitext.

0. https://miraheze.org/


Which "wiki folder" do you refer to? Are you talking about Wiki.JS, or some other wiki engine? I see you mention MediaWiki, are your simple steps for it?


When you click to download DokuWiki, MediaWiki, etc., you are downloading an archive that unpacks into a folder containing the application as well as an install.php script.

My steps work for all 5 of the wikis I tried before I settled on MediaWiki (though I don't necessarily recommend it to everyone). The install.php script might be in some subfolder, but the website instructions will tell you.

Neither DokuWiki nor MediaWiki (via sqlite) needed to have an external DB running, though some wikis do depend on MySQL.

It was just a quick summary to show how easy it is; e.g. PHP has a built-in web server these days.
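
For anyone who hasn't tried it, running a PHP wiki locally with that built-in server looks roughly like this (the port and folder are arbitrary, and the install script's exact location varies by wiki):

  cd dokuwiki            # or whichever wiki folder you unpacked
  php -S localhost:8080  # then open e.g. http://localhost:8080/install.php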


I'm building my own wiki engine for my website. For now it only turns [words] into

  <a href="words">[words]</a>
meaning out-of-the-box support for arbitrary hrefs:

  [/absolute_wiki_links]
  [relative_wiki_links]
  [https://external.links]
  [mailto:email@adress.es]
and more!
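
For the curious, a transform like that can be a one-line regex replace; here's a minimal sketch of the idea (not my actual implementation, and it skips HTML escaping):

  // Turn [target] into <a href="target">[target]</a>; anything between the
  // brackets (relative paths, absolute paths, full URLs, mailto:) becomes the href.
  function linkify(text: string): string {
    return text.replace(/\[([^\]]+)\]/g, '<a href="$1">[$1]</a>');
  }

  // linkify("See [docs/setup] and [https://external.links]")
  // => 'See <a href="docs/setup">[docs/setup]</a> and
  //     <a href="https://external.links">[https://external.links]</a>'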


Vimwiki also works very well.


If you're going to run it locally why not just use Apple Notes app?

If using a web app, it would be better to run it on a $5 server, so if you want to type in something while you're outside with just your phone, you can do that also.


I've been using Notes.app (and every other note-taking app) for years. It has zero organizational power. I don't think anyone would use it if it weren't for the fact that it comes on macOS/iOS and that it's synced by default.

I have thousands of notes in Notes.app across every subject. Moving them into my wiki (categorizing them, linking them) was one of the first things I did, and one of the best things I've done. I had all sorts of stuff in there: stories I've written, lists of things, texts my father sent me, 4 different documents where I had written down birthday/xmas ideas for my girlfriend that I never remembered to check.

These mapped very nicely to pages and categories on my wiki. I even have a page for my girlfriend (globally available on my sidebar) that now has a === Gift ideas === subheader.

One day you just might decide Notes.app is not cutting it for you and that you want better organization. Maybe you won't. I'm in my 30s and didn't do it til now.

I have mine running from Dropbox, so my other computers always have it synced. The real issue is mobile access. It's not something I care about right now but making it internet addressable is certainly something I could do in the future.


I started using Wiki.js over a year ago to maintain documentation related to system admin duties.

We run this in a Docker container with an SQLite database and back up the database daily to another server.

The private and public pages feature fits perfectly to our use case. We show system information, how-to guides and rules on the public pages and manage sysadmin documentation with restricted access.
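
For anyone curious what a backup like that can look like, here's a rough sketch using SQLite's online backup command plus rsync; the paths, file names, and host below are placeholders, not our exact setup:

  # nightly cron job on the wiki host (illustrative paths)
  sqlite3 /srv/wikijs/data/wiki.sqlite ".backup '/srv/wikijs/backups/wiki.sqlite'"
  rsync -a /srv/wikijs/backups/wiki.sqlite backup-host:/backups/wiki-$(date +%F).sqlite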


If it doesn't run on MediaWiki, I'm not into it.


DokuWiki used to be much better for my use cases: easier to hack on and make plugins for, with more built-in functionality and fewer dependencies on top of that, since it stores the pages as flat files instead of using a DB.


Hmm. I'll go check it out.

The main thing I'm worried about with other wiki software (including Wiki.js) is whether it's compatible with gadgets, userscripts and all of the other neat tools already available.

It doesn't have to be MediaWiki, or even a distant relative of it. It just has to work with them.


Why?


I mean, I appreciate the effort put into building this, but external tools like AWB, and userscripts/gadgets (plus a host of other goodies), can't be used globally if a wiki runs on completely different software. Almost every wiki on the net uses MediaWiki for good reason.

I will be happy if this Wiki.js platform does have compatibility with these features, though.


I guess the same reasoning would apply to WordPress, PHP and MySQL then (user scripts, penetration etc). Would that be your first choice when setting up a new web page?


> Almost every wiki on the net uses MediaWiki for good reason.

It is unclear to me whether you are referring to statistics or gut feeling here. Would you mind clarifying?


By 'MediaWiki' I'm referring to the wikis and wiki farms that use MediaWiki or a variant of it. This includes:

- Wikimedia

- FANDOM

- Gamepedia

- Miraheze

FANDOM is the most massive wiki farm, with over 360,000 wikis as of 2016[1], which I'd estimate is at least 60% of the total number of wikis on the net, and it is 88th in the Alexa rankings.[2] FANDOM is a wiki powerhouse, and you bet it uses MediaWiki.

Excluding WikiHow, I have never seen a wiki not use MediaWiki. As one of the guys that hops across many different wikis and wiki farms doing automated work, I cannot stress this enough.

[1] Brandon Rhea, FANDOM VP of Growth https://community.fandom.com/wiki/Choosing_Fandom?diff=next&... (dated June 14, 2016)

[2] Alexa.com https://www.alexa.com/siteinfo/fandom.com#section_traffic (dated ~21 July 2020)


> Excluding WikiHow

WikiHow is using mediawiki (or at least a fork of it) https://src.wikihow.com/

But there certainly exist other wikis and wiki-like projects that don't. MDN and the OWASP wiki are prominent examples that moved away from mediawiki. I think mediawiki has most of the mass-collaboration market, but there is much more competition in the open-source project documentation niche (which people often use wikis for) and the corporate knowledge base market.

P.s. for the interested, mediawiki has statistics at https://pingback.wmflabs.org/#unique-wiki-count (opt-in) and https://wikiapiary.com/wiki/Main_Page (based on web crawling)


Gamepedia was bought by Fandom too, FYI.


They're apparently working hard to upgrade and merge their diverging MediaWiki codebases; some details in this very interesting blog post: https://community.fandom.com/wiki/User_blog:MisterWoodhouse/...


Neat, I actually sold a wiki to Curse pre-Gamepedia and worked there for a short while. Not very happy to see a single company gobble up so much of the online gaming community.


Stats, clearly.


Kontxt (https://kontxt.io) could be a perfect inline communication and engagement layer to enhance wikis and docs with inline highlights, comments, polls, @mentions, page navigation, shareable deep links, and permission-based sharing.


Did... did kontxt.io write this?

Oh wait yeah.


Hello emiliovesprini! You are correct. That's actually why the username "kontxt" was specifically selected. Decided to share here because people exploring the wiki and documentation space might find it useful. Best regards fellow code creator.



