If the Amyloid hypothesis is wrong, why does this drug seem to work so well?
I think everybody should humble themselves a bit and recognize that we don't fully understand the mechanisms of Alzheimer's yet. It's a complicated disease, and a lot of the techniques we use to understand it are still in development.
(Full disclosure: I have been involved in MRI and PET imaging arms of industry-funded trials such as this one).
First off, there's no question that amyloid plays a role in the development of AD.
It is, however, becoming increasingly clear that it does not play the primary causative role. Its exact role is currently unclear: it's likely both a secondary neurotoxic agent and an epiphenomenon. It's also non-specific (i.e. amyloid deposition is seen in several other disease states, as well as in many cognitively normal controls).
There are certainly other agents and mechanisms that are both more important in the development of AD and arise earlier in the time course of the disease.
This particular drug has a modest effect (from this study, it slows disease progression, at least in the first year). The several dozen other anti-amyloid agents that have been trialled, at a cost of billions of dollars, have either had little or no effect, or have shown a modest effect in the first 12-24 months of treatment before becoming ineffective again.
It's also worth noting that the manufacturers have a very limited understanding of the drug's mechanism(s) of action. So, there's no guarantee that any effect is actually attributable to the reduced amyloid deposition, rather than some secondary process.
I'm a neuroradiologist, so I mostly interpret images (MR, CT and PET).
I'm involved in various research studies, contributing to study design, sequence selection, image interpretation, clinical correlation, etc.
Most of the techie stuff gets done by physicists and some of my collaborators in other areas of neuroscience, but I do enjoy doing some of my own (rather rudimentary) pre- and post-processing of MRI data.
What sort of image processing do you program for? VBM?
No, both the concept and the implementation are bad. This thing shovels and computes a LOT of data per call, and you can't optimize that out because it's what the authors of kdbus intended.
It's the typical 'look at all that CPU we have now, let's use it' mentality that keeps Wirth's law true.
The point of kdbus is to do dbus in kernel space. The one spending lots of time in userspace overhead (not doing actually useful work) is regular dbus.
Linus is saying that kdbus is pointless because its performance gains don't come from being in-kernel, they come from the code not being a complete shit-show, and he believes the same performance should be achievable by fixing regular userspace dbus.
I made a silly desktop app some time ago and it used DBUS to get notifications from NetworkMonitor when the system went online/offline. Nothing too fancy, very few lines of code.
When I was implementing that, I managed to get several segfaults from my Python code. Altogether, it seemed a little fragile to me :(
That was 5 years ago, so things are probably better now (to be fair, I don't know what was causing the crashes; NM was at 0.8 back then), but when I read Linus' comments I can't help thinking he's probably right.
This is an anecdote and all, but my point is that until I had to use it... DBUS was pretty good and was working fine :)
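For context, a minimal sketch of that kind of online/offline listener. Assumptions: the service involved is NetworkManager (the comment says "NetworkMonitor"), its `StateChanged` signal carries a numeric state code, and 70 is the "fully connected" state; the commented-out wiring uses dbus-python, which would be needed to actually run it against a bus.

```python
# NetworkManager's "fully online" state code (an assumption about the
# service version in use; older 0.x releases used different codes).
NM_STATE_CONNECTED_GLOBAL = 70

def is_online(state):
    """Map a NetworkManager StateChanged code to a simple online/offline flag."""
    return state >= NM_STATE_CONNECTED_GLOBAL

# Wiring this to the system bus would look roughly like this
# (requires dbus-python and GLib, so it is left as a comment here):
#
#   import dbus, dbus.mainloop.glib
#   from gi.repository import GLib
#   dbus.mainloop.glib.DBusGMainLoop(set_as_default=True)
#   bus = dbus.SystemBus()
#   bus.add_signal_receiver(
#       lambda s: print("online" if is_online(s) else "offline"),
#       dbus_interface="org.freedesktop.NetworkManager",
#       signal_name="StateChanged")
#   GLib.MainLoop().run()

print(is_online(70), is_online(20))  # True False
```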
Given that, I was able to use dbus to come up with a quick solution for a department feature in an afternoon, using Pidgin's libpurple dbus bindings through purple-remote to create a presence tracker and simple announcement bot in bash.
A little digging into Pidgin's DBUS Howto gave me all the documentation I needed. It only took a lunch hour to have something functional up and running, with no real in-depth programming knowledge required. In an afternoon, I'd hacked up an RSS feed showing everyone in the department's current presence, as reported by their IM status, and that feed could then be consumed by the required apps.
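The bash glue for that kind of feed might look roughly like this. The `format_presence` helper is hypothetical (not from the original script); the only real external piece is `purple-remote`, Pidgin's command-line D-Bus frontend, which is left in a comment since it needs a running Pidgin.

```shell
#!/usr/bin/env bash
# Hypothetical helper: turn one user's IM status into an RSS <item> line.
format_presence() {
  local user="$1" status="$2"
  printf '<item><title>%s is %s</title></item>\n' "$user" "$status"
}

# In a real script the status would come from Pidgin over D-Bus, e.g.:
#   status=$(purple-remote "getstatus")
format_presence alice available
format_presence bob away
```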
I don't know about dbus' other merits or flaws, but it did help me solve a specific problem quickly. This was a dead simple use case though.
At the time, I assumed that people more knowledgeable than me could do a hell of a lot more with it.
Two years ago I had to use DBus to connect to a Bluetooth device from Java.
In the end, after a few minutes of working fine, the connection between the adapter and the peripheral would time out (or someone would go out of range), the whole thing would stall, and we'd never get another message from BlueZ or be able to connect to any device again. It was without doubt the worst development experience I've ever had.
Worse, you couldn't reload the dbus library and start over, because then Java would scream and crash. So we had to restart the JVM, BlueZ, etc.
I'm sure I was doing something wrong, but if I can't get it to work properly in a matter of weeks then it's not just my fault.
Were you using threads? Python really shouldn't crash, but combine it with libraries implemented in C and threads, and you get really subtle race conditions that result in segfaults. I have myself been forced to debug why Python segfaulted, and it turned out to be a standard library call that internally used threading, which conflicted with a C library that was not thread-safe.
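The usual defense against that pattern is serializing all calls into the non-thread-safe library behind one lock. A minimal sketch, where `unsafe_increment` is a stand-in for a call into a C library that keeps unsynchronized internal state (the real failure mode would be a segfault, which can't be demonstrated safely in pure Python):

```python
import threading

counter = 0
lock = threading.Lock()

def unsafe_increment():
    # Stand-in for e.g. lib.some_function() via ctypes: the C code is
    # assumed to mutate shared state without its own synchronization.
    global counter
    counter += 1

def worker(n):
    for _ in range(n):
        with lock:  # serialize every entry into the "C library"
            unsafe_increment()

threads = [threading.Thread(target=worker, args=(1000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 4000: the lock guarantees no two calls overlap
```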
I didn't mean to say my Python code was the one crashing with a segfault. Apologies if my comment wasn't clear enough.
I just searched my bugzilla account at Red Hat (Fedora distro), but I couldn't find a report for that specific crash. The closest I can find is a report of a crash in notification-daemon (which could be related, though, as it uses DBUS to advertise a service).
I'm surprised I didn't file a bug report, but there you are.
Oh, yes, we do. We need a language targeted at humans that machines can
process, and we need a language targeted at machine processing that is
inspectable by humans. Those are two contradictory goals: the latter calls
for simplicity, but the former calls for shortcuts[#] to make humans' work
easier, which adds, rather than reduces, complexity.
[#] Shortcuts like not quoting keys and omitting braces and commas in
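A concrete illustration of that trade-off (my own example, not from the comment): the same data with and without the human-friendly shortcuts. Python's stdlib `json` parser, being strict, accepts only the machine-oriented form; the relaxed form needs a more complex parser (JSON5, HJSON, etc.).

```python
import json

strict = '{"name": "demo", "port": 8080}'      # machine-friendly: simple grammar
relaxed = '{name: "demo", port: 8080}'         # human shortcut: unquoted keys

data = json.loads(strict)                      # parses fine

try:
    json.loads(relaxed)
    relaxed_ok = True
except json.JSONDecodeError:
    relaxed_ok = False                         # strict parser rejects the shortcut

print(data["port"], relaxed_ok)  # 8080 False
```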
You are only considering the cost of infrastructure. What about the salaries of the people working on the application and monitoring the infrastructure? Who will pay for that, if not users, not ads, and not government?