It seems they actually support this in Wiki.js; however, it requires you to first click "Insert Assets", and once that modal displays, you can paste into the page and it will be uploaded.
Not too different from JIRA, really. But I feel this feature would be improved if pasting into the editor itself yielded the same result.
"You can upload images from a tab in the media dialog, or by dragging and dropping a file into the editor, or by pasting an image from your clipboard. [...] The image will be inserted into the page when you are done."
That said, this is about to change! The upcoming MediaWiki 1.35 is supposed to move Parsoid into core PHP, so VisualEditor is going to become much more accessible by default. :D
I have been waiting for an online wiki with the usability of Apple Notes, which I use locally on all my Apple devices. It works like a charm except that I cannot make it public.
This is why I often take notes rather than write blog posts on my website. If wiki software were as easy and drag-and-drop as Apple Notes, I'd just take notes and they would turn into publicly available wikis!
I have yet to find that tool. I would happily pay for such a tool, with one deal-breaking condition: it must be self-hostable. I will not write my content into something like Medium or Notion where I don't own my content.
Is this the type of functionality you're looking for if it could be self-hosted? How much would you be willing to pay for such a tool?
- ui: joplin.
- agpl, fully self-hostable.
- you own your content (because joplin).
- choose among free templates, or create your own.
- templates will be similar to or compatible with hugo, still tbd.
Optional for paying customers:
- sync via webdav to my service.
- custom domain.
- backups, etc.
Once this starts generating money, I am planning to spend some of it to fund e2e per-folder encryption in joplin.
* CKEditor: https://extensions.xwiki.org/xwiki/bin/view/Extension/CKEdit...
* Syntax: https://www.xwiki.org/xwiki/bin/view/Documentation/UserGuide...
Actually any wiki page can define a Class and how to display this class and/or instances of a Class:
The Script Macro is useful to make some dashboards ( https://extensions.xwiki.org/xwiki/bin/view/Extension/Script... )
I've deployed this for the internal documentation inside a company I worked for (MediaWiki was a no-go even with a visual editor).
For each new feature, I developed inside a clean new wiki, then exported the changes once I was sure everything was okay. That way it is much easier to upgrade to the new XWiki version.
From the text of Outline's license :
The Business Source License (this document, or the “License”) is not an Open
Source license. However, the Licensed Work will eventually be made available
under an Open Source License, as stated in this License.
Since I just got into MediaWiki and wrote my first extension (finally a dark mode that works), I'll see if this can be implemented. Perhaps with https://www.mediawiki.org/wiki/API:Upload.
MediaWiki has some UX and RBAC challenges that make it difficult to scale to large organizations.
Google has some motivations written down by their legal department: https://opensource.google/docs/using/agpl-policy/
It boils down to 'not worth the risk, do not use'.
On the contrary, licenses give permission to those who do not hold copyright. Without any license, only the copyright holder has any rights to copy or modify the work in any way (except for fair use). A license gives permissions to non-copyright holders to do things which would otherwise be illegal under copyright law. A license can never restrict what anyone would otherwise be allowed to do, since it is not a contract.
Simply hosting it with your information in would not have any such effect, but many commercial entities avoid anything xGPL just-in-case. In this case perhaps because they see a time that they might later want to package and distribute documentation that is in the wiki without converting it to something else first.
There is an extra concern with AGPL that does not exist with GPL specifically because of its key difference. AGPL applies to hosting the software and making it available not just distributing a compiled form. Some interpret this as meaning that if it is hosted on the same server, or in the same site, as other software then that other software becomes AGPL licensed too. I doubt anyone would enforce this interpretation but the possibility is enough to put off those who create proprietary software.
> making it accessible to the public that the copyleft license would then apply to my proprietary software?
Not just the public. Anyone you give access to, so for non-public hosted proprietary software you could be beholden to giving them access to the code under the AGPL in situations where the AGPL applies. This will be a complete blocker for many creators of proprietary, or other non-*GPL-licensed, software.
[if the above makes me sound against AGPL rest assured that I am not - I in fact might end up using it at least initially (at least until I decide upon which of the more proprietary-friendly options to use) for some near-future projects]
If I hosted Wiki.js, say on a subdomain, unrelated to a commercial application's code at all, but linked to it (e.g. the "docs" section of navigation), would I need to disclose
a) the source code of the wiki.js subdomain only, and any modifications we make to it
b) the source code of wiki.js and the other proprietary app BOTH
If you allow self-hosting of your proprietary code by your users and it refers to your documentation server, then this would still not trigger the AGPL for the other code - you are not distributing Wiki.js or parts of it. If you allow self-hosting of the other code and include Wiki.js for a local copy of your documentation, then the AGPL might trigger.
"Linking" here means software linking, not referencing your own content displayed by the software. Think of it as the difference between including chunks of a paper in your own paper versus saying "as discussed in This Paper About This Thing (Him, Her, et al., 2013)...". Or, as in this case where the content is yours, referencing your own paper.
This potential confusion with the word "linking" (as natural languages are fairly dynamic beasts) is why people are sometimes overly fearful of AGPL and GPL (and often even LGPL) software.
The problem is that unless you are installing your servers by hand, you end up re-packaging Wiki.js, and then your own packaging code and the modifications to Wiki.js in the build artefact are now supposed to be public as well.
Do you mean to distinguish between setting up an EC2 instance or digitalocean droplet type of thing versus using ansible/chef/etc orchestration among the servers? Implying you would have to publish the configs for all your servers?
That would make no sense, both from a practical perspective (publicly disclose server configs? those can include sensitive information), and a theoretical perspective (what does that have to do with the source code at all?).
Interestingly, Amazon just finished a multi-year effort to migrate off MediaWiki internally to comply with an infosec mandate that PHP is banned company-wide.
To actually set someone else's password to a specific value does require running a command-line script (not the same as going into the db). In my view that is a reasonable security-convenience trade-off.
In any case, I would assume a large org would use a single-sign-on extension rather than MediaWiki's native user management, which would make MW's password management moot.
Disclaimer: am mediawiki developer
- easy to use for technical and non-technical staff alike: multiple editing options
- third party authentication: really comprehensive offering
- quality search: comprehensive internal and third party search offering
- ease of maintenance: largely everything is built-in, so no module/dependency maintenance headaches
- user management: solid user/group management system
With internal tools you need things to stick, and fast. As much as I am fond of mediawiki, the editing experience is a barrier to usage for many. And the extension ecosystem, while rich and diverse, is just more of a liability than a single installation. A quality search is also really important to adoption, so having options there is great.
I'd been using Docsify on a small scale with authentication through GitLab to edit, GitLab CD to build and Cloudflare Access to secure the front end. It works really well, but the lack of user management and the editing experience mean that it's time to move on.
It would be great to hear if this is a case of the grass always being greener on the other side.
Who exactly is asking for slower software?
I'm sorry but it's right there in the name
As much as we could argue about whether no-js support really matters in 2020, the fact remains that having to load Vue.js and have it parse and render the frontend on the client is not really "lightweight", especially when the most popular competing products pre-render on the backend.
It wouldn't honestly be that much of an issue if it was a SPA (and it must do SPA-ish things already if it uses Apollo) so you just load the frontend once and it'd load other pages asynchronously, but nope. Every link is a full page reload, with Vue having to re-do everything every time.
It just reeks of modern tech used in an old fashioned way, which ends up with the performance penalties from both.
Go browse any other wiki, see how much faster and smoother the experience is.
I would say using js without having a no-js version is ok, if done correctly.
Please ensure that if you mention your service or system on HN it works over telnet and outputs plain ascii.
Also noticed that it feels slow page to page. Thinking it might be an issue with my Firefox configuration, I opened the page in a fresh profile and it's still slow; but if you open it in Chrome, page-to-page becomes almost instant and more comparable to MediaWiki. So maybe this particular performance issue on FF can be resolved, but it does seem like a worse end-user experience compared to MediaWiki.
Is Node.js that blazing-fast?
It is based on V8 which in many (caveat: far from all) benchmarks comes out as the fastest JS engine, so by that definition in the realm of JS powered components it is pretty quick.
node.js solutions will often perform better than common configurations of other options too - Apache+PHP to pick one example out of the air. Then again, depending on the code other configurations of PHP might outperform Node.
This gives us plain text files that are tracked in a repo. It uses the user as the author, so now I can "code review" edits to our wiki.
The content of the wiki is easily cloned by cloning the git repo. It is markdown in folders, so if Wiki.js dies at some point I could write a pandoc script to turn it into web pages again, though you do lose all of the cool UI features.
Now that git has become ubiquitous, I prefer git with a self-hosted git-daemon instance. git, grep, awk, and sqlite make a strong set of tools for knowledge curation.
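That toolkit can be sketched as a couple of shell helpers; the function names and the `~/notes` layout here are illustrative assumptions, not something from the comment:

```shell
# Full-text search across git-tracked Markdown notes,
# case-insensitive, with file names and line numbers.
notes_grep() {
  grep -rni --include='*.md' -- "$1" "${2:-$HOME/notes}"
}

# Quick awk summary: how many notes live under each top-level topic folder.
notes_stats() {
  find "${1:-$HOME/notes}" -name '*.md' |
    awk -F/ '{ count[$(NF-1)]++ } END { for (d in count) print count[d], d }'
}
```

From there, `git log -p` on the same directory gives you edit history for free, and sqlite can index anything grep gets too slow for.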
edit: minor grammar fix
That said, DokuWiki is pretty decent to get up and running quickly.
"footgun" is an excellent term. Here's an earlier use on HN, 2010:
I understand that bad code can be expressed in every language. But there is bad tooling too.
PHP clearly has a lucrative place in the world. But it remains a significant threat vector.
Yes, even in 202x. I leave others to discuss why this is the case. I won't install PHP on a workstation just to run a Wiki. ^_^
With automation you'd build images based on their images but run via your own CI/CD with your own security scans and any additions you might need (like additional logging infrastructure). Doing that is not possible with AGPL.
I guess, to a certain extent, that's because I'm an individual, not a company, and one that tends to open source pretty much everything they write. This is the same licensing that I use for pretty much all my projects (AGPL with no CLA).
What are you talking about? They can change the license to a closed one from a certain version in the future.
You're right if and only if by "they" you mean every copyright holder whose contributions would exist in the future version (including, say, the contributions of the very person you're responding to). But if by "they" you mean the project leaders acting without the cooperation of everyone who holds copyright, then that's a no.
The main guy has committed 600k lines and the second most active committer 450 lines.
So yeah, it wouldn't take him a whole lot of time if he really wanted to change the license by removing all the others' commits and rewriting it by himself.
Also, what does AGPL have to do with keeping the project open source?
> So yeah, it wouldn't take him a whole lot of time if he really wanted to change the license by removing all the others' commits and rewriting it by himself.
Good point. I may try to get involved in its development as well to spread that out a little more.
> Also, what does AGPL have to do with keeping the project open source?
The AGPL requires that any code linked with it also be distributed under the AGPL (the difference from the GPL being that making the software available over a network counts as distribution). Every part of the project is technically linked to itself, so if someone makes a change, no one else can use that change in the project under a different license.
That's not the scenario gary-kim laid out; you've failed to satisfy the constraints in the premise.
I know who you're replying to. The premise that gary-kim laid out is still the relevant context. The hypothetical you're laying out, on the other hand, is not relevant, it's at odds with that premise (not in "agree[ment] with him"), and it's derailing the thread. (Which is the same reason your "future versions, not past versions" is downvoted, for that matter.)
> I don't think _any_ of the mainstream open-source licenses allow you to retroactively revoke or change the license.
My reply is in this context, not in the parent's parent, which you mean as the _relevant_ context. If I had wanted to include the parent context, the reply would have been more specific. This was a direct reply to a specific message. This is a very normal way to reply on the internet; HN is not special.
> My reply is in this context, not in the parent's parent
The parent's parent at that point is... your comment, "For future versions, not past versions," which was off-topic.
> This is a very normal way to reply on the internet
Indeed, it's common for people to lose the plot in the comments section and then get defensive (and smug) while being wrong, e.g.:
When have I suggested the project change the license???
I only said it is possible.
None of that is impossible with AGPL.
So what? If companies need a certain piece of software, they can pay for it. I remember a time when FOSS was not about providing companies with free work, quite the opposite indeed.
This isn't about good/bad or something like that, just an odd presentation that doesn't seem to be in line with the license. There is nobody to pay here to use this stuff because you still won't be able to integrate it without also sharing internal IP.
There are plenty of organisations that would happily pay what they'd normally pay Atlassian to use Wiki.js, but they can't because they don't want to share any of their own code. This is also why license guides like the one from Google explicitly ban all AGPL software: it's not worth the risk.
It's a bit weird to comment on this as if it's an oversight or unintended downside. Suppose you keep going into someone's house and they don't want you to, so they do something to dissuade you (like putting locks on their doors). You then complain that you can't get in. Their likely response? "Well, yeah..."
I mean, that's too large a number. Is this across all open-source software, or am I misunderstanding something?
It's just enough added structure and functionality to make the whole body of notes more useful, without having to learn a formal system or adopt someone else's idea of what my note hierarchy should look like.
You can host your own for free: https://github.com/outline/outline
Node.js 10.12 or later
MySQL, MariaDB, PostgreSQL, MSSQL or SQLite3
docker run -d -p 8080:3000 --name wiki --restart unless-stopped -e "DB_TYPE=postgres" -e "DB_HOST=db" -e "DB_PORT=5432" -e "DB_USER=wikijs" -e "DB_PASS=wikijsrocks" -e "DB_NAME=wiki" requarks/wiki:2
And choose SQLite.
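For reference, a SQLite variant of that command might look something like this; the `DB_FILEPATH` variable and the `/data` volume path are my reading of the Wiki.js docs, so double-check against the official install guide for your version:

```shell
# Wiki.js 2 with SQLite: no separate database container needed.
# The named volume keeps the database file across container restarts.
docker run -d -p 8080:3000 --name wiki --restart unless-stopped \
  -e "DB_TYPE=sqlite" \
  -e "DB_FILEPATH=/data/db.sqlite" \
  -v wiki-data:/data \
  requarks/wiki:2
```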
Everyone should consider running a wiki locally just for yourself. It's like being able to organize your brain. I just got into it two days ago and basically spent the whole weekend dumping things into it in a way I can actually browse and revisit, like the short stories I'd written, spread out across Notes.app and random folders.
You don't need to run WAMP, MySQL, Apache, phpmyadmin or anything. Here are the steps for someone, like me, who hadn't checked in a while:
0. `$ brew install php` (or equiv for your OS)
1. Download the wiki folder and `cd` into it
2. `$ php -S localhost:3000`
3. Visit http://localhost:3000/install.php in your browser
I tried DokuWiki at first (it has a flat-file db, which is cool). It's simpler, but I ended up going with MediaWiki, which is more powerful; aside from Wikipedia using it, I noticed most big wikis I use also use it (https://en.uesp.net/wiki/Main_Page). MediaWiki lets you choose SQLite as an option, so I have one big wiki/ folder sitting in my Dropbox folder, symlinked into my iCloud folder and local fs.
Really changing my life right now. The problem with most apps is that they just become append-only dumping grounds where your only organizational power is to, what, create yet another tag?
My advice is to just look for the text files scattered around your computer and note-taking apps and move them into wiki pages. As you make progress, you will notice natural categories/namespaces emerging.
I just wish I started 10 years ago.
My point was to show how easy it is to set up (I used to always equate PHP with having to get a whole WAMP stack online), and thus how easy it is to try for yourself.
or you can just use Zim, which is a cross-platform desktop app that does not need any setup and simply saves files as text files in markdown: https://zim-wiki.org
This is the rub. I started a tiddlywiki last year, and stuck with it for several months, but now it has fallen to the wayside as too cumbersome.
I've been in a number of firms with wiki knowledge systems. In 100% of the cases it was a wasteland of derelict knowledge that had been abandoned and was usually much more destructive than beneficial.
No one was going to undertake the process of keeping it up to date, and at the same time the emergent organization/structure of information was constantly evolving, and wikis are terrible at evolving with that unless you literally have people whose sole job is making templates deciding on the ontology, etc.
Similarly, countless people have tried to organize their lives into tools like wiki. And in the early days it seems magical. I suspect the failure rate would be somewhere barely under 100% at the one month mark.
It's like you're about to tell me that exercising doesn't pay off because it's hard to stick with a strategy. "Heh, let's see if he's still doing pushups in a year."
You don't seem to realize you're just describing literally all systems. How organized is everyone's filesystem and ~/Documents folder? It's pure chaos with the only sweet release being that you might not carry it over when you upgrade computers and get to start from scratch.
Will I be maintaining my localhost wiki in a year? I don't know. But it's worth a shot. After two days it's already 1000x more organized than even my best efforts so far.
Is it for everyone? Nothing is for everyone.
But your comment seems to suggest that you think the alternative to <organization strategy> is organized data which obviously isn't the case.
What you will realize is that there is no perfect one-size-fits-all strategy. All you can do is try things and see if they work for you, and see if you stick with them years later.
So, for today, I recommend trying some localhost wiki options in your battle against chaos. If it doesn't work for you, so what?
The only thing I’ve ever heard of working on this time scale is plain text files. Maybe with some tool over the top to make it easier to manage than just with an editor, but in the end just plain text.
If there's something so fundamentally different about running a wiki for yourself on localhost vs a collaboration like Wikipedia or UESP, then why not put some skin into the game and make that point? That sounds like an interesting topic.
I don't even understand the "skepticism". MediaWiki is one of the most ubiquitously used platforms in the world via Wikipedia. Nothing about my post hinges on you taking my word; the point was that it will take you a few minutes to get it running yourself, so just try it.
Sure, maybe I came off a little strong by saying that it changed my life. But I have enough life experience to realize when I've encountered something big for me. And being able to organize some of my "lost causes" in a couple days has already made an impact on my daily workflow in a massive way, like for the first time in my life, I feel like I have a grip on my digital existence. I could go into more detail if anyone actually cared the same way I could tell you how moving to Mexico City changed my life after just two days.
Maybe you would walk away from that convo saying that my bar is too low to be using that phrase. Fair enough. But logging into a throwaway to trash it with adolescent glee is a practice in the least charitable interpretation, not someone who wants to have an honest conversation. "Um, there's no way that Mexico City changed your life in just two days" just seems like a nonstarter to me, and rather combative.
If you're skeptical, why not ask how it supposedly changed my life, and we go from there? I glossed over the details of that in my post because, well, that wasn't the point of my post. I just kinda reject this modern attack-dog culture on the internet where you supposedly have to couch everything you say in a front-loaded defense lest someone finds a way to attack you for it instead of probing for more info on it before reaching their conclusion, especially when it's a negative one.
My MediaWiki folder is almost 100GB and I've been putting a lot of work into regaining control of everything I've built, digitally, over 10 or more years. Yes, it has already changed my life. Though this thread is already far too derailed with walls of text to have that convo here, I think.
Just try it and make your own decision. That was my point from the very first post.
The takeaway is clearly "things that work in the small often don't work in the large. If you're taking life advice, take it from someone who has done something for a while and had a reasonable experience.".
Someone crowing about their two day experience is...well...
This applies to all sorts of similar enthusiastic advice. Intermittent napping. Standing desks. The Dvorak keyboard. Drinking your coffee with butter. New diets. Going without the internet. Meditating. Taking up karate. Working in parks. Whatever. There's a lot of wisdom and knowledge among them, but it isn't coming from the guy who just started.
If someone is giving a sales pitch for a lifestyle change based upon a tiny experience, they are often doing it because they think converting others makes it more real/more likely to yield the change they want. It doesn't work that way.
Try things. Try lots of things. Save evangelizing until you maybe have a real experience?
I get it, you're mad that I admitted I only have two days of experience while telling people how to get started in one of the most important software platforms in the world (it's how Wikipedia works) -- I'm not exactly making new sounds on this. Maybe you're fine with my install instructions, but I went too far (for your tastes) when I said it was promising so far. And you thought this behavior needed to be called out by your "Dunning-Kruger in the wild" meme account.
I don't think I've misinterpreted the situation, I just find it a bit sad and I wonder how much you think you've added to the discussion. That you run that account, you must think: "quite a bit!" I'll leave that one up to the audience.
You are irrationally hostile. My comments are not for your edification or service, and this is a shared platform where multiple people are reading and deriving value, each comment kicking off different thoughts and conversations. If you take this so incredibly personally, that's a you problem.
Your internalization and repeated attacks are bizarre. But you do you.
I've thought about knowledge management a lot over the last 20 years, since I built a Wiki/bug tracker system (this was before anything except Bugzilla existed).
I think knowledge management systems can work if the "management" side is a side effect of their use.
But citing Wikipedia is often folly. The man-hour-to-output ratio of Wikipedia is absolutely enormous. It is an extraordinarily inefficient process that works because there are millions of people moving, structuring, contributing, making templates, rewording, reorganizing, etc. Eventually greatness emerged.
Today, though, Google Docs, Keep, Notes, GitHub Gists, GitHub itself, and many other places let me easily store notes and access them from anywhere. No reason to set up a wiki and have to maintain it myself.
When you start to plan how to move all of your stuff under one umbrella, the solution starts to sound a lot more like a wiki on paper, I think. Even if you move all this stuff to your filesystem, I think you still need a layer over it to manage it all -- or at least I did.
Of course, it's not the only answer. And I admit I have been contributing to wikis like Wikipedia and UESP for a decade now and the jump to a personal wiki was a no brainer.
But I wonder, what solution would you consider for this "disjointed data" problem? Do you just not see it as a problem? One of the first things I did when I stood up a personal wiki was to log into ancient google accounts to exfiltrate ancient google docs that I'm glad I found again.
My latest experiment (in Mithril/HyperScript/Tachyons/Node.js) integrates a file browser, markdown viewer and editor, and email viewer (although it is all still very rough): https://github.com/pdfernhout/Twirlip15
But ultimately what we probably need more than tools are simple and popular standards for encoding information that can be linked together. Email (in MIME format) is one such standard but it is fairly complex. Maybe a JSON schema or RDF schema for linked information might help with that. Or something like tags or RDF triples embedded in Markdown -- something I started playing with in Twirlip15 (inspired in part by Foam).
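As a purely illustrative sketch of the embedded-triples idea (the `:: subject predicate object` line syntax is invented for this example and is not Twirlip15's actual format), pulling such triples out of Markdown is a one-liner in awk:

```shell
# Extract lines of the form ":: subject predicate object" from Markdown files,
# leaving the surrounding prose untouched.
extract_triples() {
  awk '/^:: / { sub(/^:: /, ""); print }' "$@"
}
```

The resulting triples could then be loaded into sqlite or handed to a graph viewer like Cytoscape.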
Code for a foam-like "Ideas" app using markdown, triple parsing, and Cytoscape: https://github.com/pdfernhout/Twirlip15/blob/aa75ed1be5dc4a7...
And one example file: https://github.com/pdfernhout/Twirlip15/blob/792b067c30c7846...
It just became an append-only log for me with very limited organizational power. Though I do like it for anything just long enough where a single .txt file doesn't cut it. Tiddly is great for that case because it encapsulates the common task of jumping between the same sections over and over -- the real downside of a large file. But you aren't alone in finding it's not so great on a larger scale.
So if you did like the idea of a wiki but weren't diddly with the Tiddly, might be worthwhile to check out something like DokuWiki or MediaWiki.
The reason I'm a convert is that it seems like the best of both worlds between raw note-taking and a wiki. The advantage over raw note-taking is the links that enable you to "crawl" its entirety. The advantage over a wiki is that it's tech-agnostic and you can do it however works best for you.
On the other hand, a wiki may be better for someone who wanted to embed media in their notes (such as audio recordings).
I created a mashup of Zettelkasten + bullet journaling + a linking system based on tagging and IDs that models the fact that knowledge is both hierarchical and associative - i.e. fractal.
My co-founder and I have been building a hosted version of this for the last two years, because we recognized that while self-hosted wikis work great for techie people, there are a lot of other people whom that label doesn't fit.
So we've been working to create a collaborative knowledge-base platform built around some key concepts:
1. Built around cards rather than documents, which allows for a lot of interesting and flexible features. Such as...
2. Granular sharing – on Supernotes, you can share an entire collection of cards, or you can share one card at a time. We also recently introduced a "friends" feature that allows you to quickly drag-and-drop cards onto your friends to share with them.
3. Multi-parent nesting – there is no folder-style filesystem on Supernotes, we allow you to nest cards inside of each other. On top of this, we allow for this nesting to be multi-parent, so different users can fit the same cards into their own unique structure (effectively a collaborative / personalized version of symbolic links).
4. Public vs. private tags – cards can be tagged with public tags that everyone sees, but can also be tagged privately with only tags that you can see. This same idea is reflected across the platform, where we want the underlying content to be the same for everyone but want to allow users to personalize the metadata/structure to suit their own workflow.
5. Focus on speed – we have spent a lot of time making Supernotes speedy quick, and try to make it faster every time we release a new feature.
Anyway that is the rough idea. The goal of Supernotes is to be a sort of data-layer where you can keep all these compartmentalized pieces of content (as cards) and then mix-and-match at will to create very simple or very complex stores of knowledge. We also want you to be able to embed these pieces of content elsewhere (say in a Notion document or on your blog) with as little effort as possible (not quite there yet, but will be soon).
I did a scan of the FAQ and ended up on the docs and searched for export.
I was pleasantly impressed with the entry which showed an export option along with text and videos showing how to do this.
For me, I probably won't be spending 300 on this when I can run a wiki or WordPress for free... but if I was not so jaded about SaaS and cloud, I would be persuaded to check out your thing if you had something like "export, backup" on the front page, and bonus if it was 'import/export markdown or similar files'.
I'd feel less worried about vendor lock-in, holding data hostage, what happens when you go bankrupt, etc.
the heading fonts on your privacy page are a little wonky in my browser (firefox) -
being that you are in the UK and sharing data with the EU and outside the EU - I'd only save info if it was encrypted.. not sure if that is a thing; if so, I would make 'privacy built in' a big thing on the front page.
my two cents in trying to help - I'm sure 98% of those who may use your service are not as sensitive to these things as I am, so this is not a critique saying it's bad, just some random thoughts as I took a look.
Data ownership is pretty important to us, even though we are only offering a hosted solution, which is why we explicitly say as much in our T&Cs. But yep, we want to make export / backup of your data as easy as humanly possible. The hard part generally is that there are a number of features that exist on Supernotes which just don't exist elsewhere, so even when you use the export feature it is hard to guarantee we can export it in a format that is useful to you.
That is part of the reason we are doing our best to openly document our API so that you can interact with your own content in whatever way you wish (including importing content from wherever or exporting to wherever). Obviously this requires some coding, but we're hoping the community will share any tools they build on top of the API with each other.
Unfortunately E2EE is not quite there yet. It makes sharing much more difficult to facilitate, and it's a bit of a problem for a knowledge base if a user loses their private keys and you have to tell them "sorry, we can't get your content back – it's all gone". But this is definitely something we are working towards – it just takes some time to nail the UX. Since we are definitely never going to sell your data or anything (as per our T&Cs), it's actually better for us if it's E2EE, as it's one less liability from a data protection perspective.
EDIT: VisualEditor, the de facto standard for pasting things like screenshots into your articles, seems to be a pain to install. Got my local env up and running though.. Will report back on success with this extension.
Given the complexity of setting up MediaWiki properly, I think I'm going to keep using Obsidian.
If I ever tire of keeping a personal wiki for whatever reason, all of the content I've built up in it will remain organized as files within directories.
I highly recommend the 'Backlinks' plugin to improve the wiki functionality; leaves Roam standing in the dust for personal use.
However, backlinks are not possible without hacks. A wiki without backlinks is kind of lame, and at that point I could just as well use my good old plain text files.
Have you run into trouble when updating MediaWiki, or is it smooth sailing? SQLite is not mentioned here: https://www.mediawiki.org/wiki/Download
- future proof (at least not only a one man project)
- Fast search over all information
- Fast creation of quick notes (inbox)
- Mobile iOS client
Currently I am stuck with Notion, which has a great 'database' concept that is fun to use. Sadly it's too slow. If I want to take a quick note on the go, like "Google for M6x40 screws", I need 10-20 seconds with Notion.
I don't even mind paying for such service...
I wish proper wikis hadn't gone by the wayside. (I think it has a lot to do with MediaWiki's default skin being out of style, and people not realizing they can change it.) Most of all, I wish open source projects would stop dumping a bunch of Markdown in a repo somewhere and calling it a "wiki". They're not even close to comparable.
My two biggest complaints about MediaWiki are 1. PHP, and 2. no well-supported way to opt in to a different syntax like Markdown or AsciiDoc or pretty much anything that isn't MediaWiki-flavored wikitext.
My steps work for all 5 of the wikis I tried before I settled on MediaWiki (though I don't necessarily recommend it to everyone). The install.php script might be in some subfolder, but the website instructions will tell you.
Neither DokuWiki nor MediaWiki (via sqlite) needed to have an external DB running, though some wikis do depend on MySQL.
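As a sketch of what an external-DB-free install can look like: MediaWiki ships a command-line installer that can target SQLite. The wiki name, admin account, password, and paths below are made-up examples; check the installer's `--help` for the exact flags on your version.

```shell
# Hypothetical non-interactive MediaWiki install using SQLite,
# so no MySQL/PostgreSQL server is needed. All names/paths are examples.
cd mediawiki                  # your unpacked MediaWiki directory
php maintenance/install.php \
  --dbtype=sqlite \
  --dbpath="$PWD/data" \
  --pass=changeme \
  "MyWiki" "Admin"            # wiki name and admin account
```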
It was just a quick summary to show how easy it is. e.g. PHP has an embedded server these days.
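For instance, a fresh MediaWiki checkout can be served for local testing with nothing but PHP's built-in development server. A rough sketch (the directory name is an example, and this server is explicitly not meant for production):

```shell
cd mediawiki-1.35       # example directory of an unpacked MediaWiki release
php -S localhost:8080   # PHP's built-in dev server; open http://localhost:8080
```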
If using a web app, it would be better to run it on a $5 server, so if you want to type in something while you're outside with just your phone, you can do that also.
I have thousands of notes in Notes.app across every subject. And moving them into my wiki (categorizing them, linking them) was one of the first things I did. And one of the best things I've done. I had all sorts of stuff in there: stories I've written, lists of things, texts my father sent me, 4 different documents where I had written down birthday/xmas ideas for my girlfriend that I never remembered to check.
These mapped very nicely to pages and categories on my wiki. I even have a page for my girlfriend (globally available on my sidebar) that now has a === Gift ideas === subheader.
One day you just might decide Notes.app is not cutting it for you and that you want better organization. Maybe you won't. I'm in my 30s and didn't do it til now.
I have mine running from Dropbox, so my other computers always have it synced. The real issue is mobile access. It's not something I care about right now but making it internet addressable is certainly something I could do in the future.
We run this in a Docker container with an SQLite database and back up the database daily to another server.
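A nightly backup along those lines might look like the following (the container name, database path, and destination host are all hypothetical; SQLite's `.backup` command takes a consistent snapshot even while the wiki is running):

```shell
# Hypothetical nightly SQLite backup from a Docker container to another
# server. Adjust container name, paths, and destination to your setup.
docker exec wiki sqlite3 /data/wiki.sqlite \
  ".backup /data/backup.sqlite"                    # consistent snapshot
docker cp wiki:/data/backup.sqlite "/tmp/wiki-$(date +%F).sqlite"
scp "/tmp/wiki-$(date +%F).sqlite" backup@otherhost:/srv/wiki-backups/
```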
The private and public pages feature fits perfectly to our use case. We show system information, how-to guides and rules on the public pages and manage sysadmin documentation with restricted access.
The main thing I'm worried about with other wiki software (including Wiki.js) is whether it's compatible with gadgets, userscripts and all of the other neat tools already available.
It doesn't have to be MediaWiki, or even a distant relative of it; it just has to work with them.
I will be happy if this Wiki.js platform does have compatibility with these features, though.
It is unclear to me whether you are referring to statistics or gut feeling here. Would you mind clarifying?
FANDOM is the largest wiki farm, with over 360,000 wikis (as of 2016), which I'd estimate accounts for at least 60% of the total number of wikis on the net, and it ranks 88th on Alexa. FANDOM is a wiki powerhouse, and you bet it uses MediaWiki.
Excluding WikiHow, I have never seen a wiki that doesn't use MediaWiki. As one of the people who hops across many different wikis and wiki farms doing automated work, I cannot stress this enough.
 Brandon Rhea, FANDOM VP of Growth https://community.fandom.com/wiki/Choosing_Fandom?diff=next&... (dated June 14, 2016)
 Alexa.com https://www.alexa.com/siteinfo/fandom.com#section_traffic (dated ~21 July 2020)
WikiHow is using MediaWiki (or at least a fork of it) https://src.wikihow.com/
But there certainly exist other wikis and wiki-like projects that don't. MDN and the OWASP wiki are prominent examples that moved away from MediaWiki. I think MediaWiki has most of the mass-collaboration market, but there is much more competition in the open-source project documentation niche (which people often use wikis for) and the corporate knowledge base market.
P.S. for the interested, MediaWiki has statistics at https://pingback.wmflabs.org/#unique-wiki-count (opt-in) and
https://wikiapiary.com/wiki/Main_Page (based on web crawling)
Oh wait yeah.