Hacker News
Alive v1.0 – Live Programming for C# (comealive.io)
125 points by Permit on Oct 27, 2015 | 47 comments

It's really nice to see a tool like this come out for a statically typed language. When I started learning programming 2 years ago, tools like Light Table and the live editing capabilities of Seaside for Smalltalk were not only a huge help but also something I've come to miss in other languages.

I think that live coding tools for the big 2 languages (C# and Java) could be a great boon to students learning programming in college, where at least in my area statically typed languages seem to be the norm.

Xerox PARC already had it for Mesa/Cedar in the early '80s, which was the inspiration for Niklaus Wirth's Oberon system.

Which was a statically typed systems programming language with RC and local GC.



So much progress was lost with the mainstream ignoring Xerox PARC's research in programming environments.

Those workstations already had something like IPython and Swift Playgrounds available.

PARC/Cedar looks real, real cool. I've heard of them in passing, but the earliest machines I got to use were the mid-90s Sun/SGI era and AS/400s.

Local GC as in, as soon as you go out of scope you free? Or is it doing something more complicated than that? I love all these old/research languages/environments; so many cool ideas. Thanks for the archive.org PDF.

PS: If you're interested in PARC, you might be interested in Brian Beckman et al.'s (of MSR) "TimeWarp": http://www.cs.nyu.edu/srg/talks/timewarp.pdf. It ran into resource scarcity issues at the time, but that's no longer a problem. Old PoCs are really interesting to revisit now that our main issue isn't space but latency.

(Fun fact: a CPU -> Northbridge RAM fetch (~60 ns[1]) is only ~3x as fast as a prosumer eMLC SATA3 fetch (~200 ns). Data segmentation matters, guys, if you are going to have a cache miss. Sequential data matters and predictable prefetching matters, and it isn't "premature optimization" if you're dealing with low-latency stuff!)

(1): https://software.intel.com/sites/products/collateral/hpc/vtu...

The local GC was a cycle collector.

So traditional RC with GC for the cycles, if any.

Thanks for the link.

Isn't this like Playgrounds for Swift, which is statically typed? Playgrounds are still a bit hard to work with, IMO. Can't build interactive apps, for example. And I can't figure out UIView animations.


However, they are a small step toward Bret Victor's vision.

Actually, I completely forgot about Swift, along with the fact that it will be coming to Linux.

I find the `for` example really bad. Mainly because 99% of the apps (I totally made up that number) are NOTHING like that.

You usually are using DI, so you'll have an interface that you need to resolve and do stuff.

How is Alive going to deal with that?
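(Not from the thread, just for context: the DI shape being described is usually plain constructor injection behind an interface. The `IOrderRepository`/`OrderService` names here are made up for illustration.)

```csharp
using System;
using System.Collections.Generic;

// A hypothetical dependency hidden behind an interface.
public interface IOrderRepository
{
    IReadOnlyList<string> GetOrderIds();
}

// The production implementation would talk to a database;
// tests (or a tool like Alive) can pass an in-memory fake instead.
public class InMemoryOrderRepository : IOrderRepository
{
    private readonly List<string> _ids;
    public InMemoryOrderRepository(IEnumerable<string> ids) { _ids = new List<string>(ids); }
    public IReadOnlyList<string> GetOrderIds() => _ids;
}

// The class under inspection never news up its dependency itself;
// whatever resolves the interface decides what it gets.
public class OrderService
{
    private readonly IOrderRepository _repo;
    public OrderService(IOrderRepository repo) { _repo = repo; }
    public int CountOrders() => _repo.GetOrderIds().Count;
}

public static class Program
{
    public static void Main()
    {
        var service = new OrderService(new InMemoryOrderRepository(new[] { "a", "b", "c" }));
        Console.WriteLine(service.CountOrders()); // prints 3
    }
}
```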

The examples shown are deliberately simple and meant to show off the concept.

Alive isn't performing static analysis and guessing what your code is doing, it's compiling and running it while showing you the results. So as long as you have unit tests covering a piece of code, we can run those tests and show you what that piece of code does.

It shows what the piece of code does in the unit tests? Or will it pull in the mock values used in the unit tests and show you their values on the actual code? And if there are multiple unit tests with different mock values, how does Alive pick which value to use?

Here's an example on Newtonsoft.Json: https://www.youtube.com/watch?v=9AR1m3llrvg&feature=youtu.be

And here's another on using tests with Alive: https://www.youtube.com/watch?v=qJgMqTTKPBg&feature=youtu.be

I agree. If it had examples and support for the IEnumerable 'Where', 'Select', etc. and foreach loops, I'd easily throw down the $100 for myself, but I just don't write many 'for' loops anymore.

Thanks! That's a good point. We will use IEnumerable and LINQ in the next example that we'll make
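(For illustration only, not from the thread: the kind of 'Where'/'Select' pipeline being asked for, next to the equivalent 'for' loop, using made-up numbers.)

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class Program
{
    public static void Main()
    {
        var values = new[] { 1, 2, 3, 4, 5, 6 };

        // Classic for loop: square the even values.
        var squaresLoop = new List<int>();
        for (int i = 0; i < values.Length; i++)
        {
            if (values[i] % 2 == 0)
                squaresLoop.Add(values[i] * values[i]);
        }

        // Equivalent LINQ pipeline a live tool would need to visualize.
        var squaresLinq = values.Where(v => v % 2 == 0)
                                .Select(v => v * v)
                                .ToList();

        Console.WriteLine(string.Join(",", squaresLoop)); // 4,16,36
        Console.WriteLine(string.Join(",", squaresLinq)); // 4,16,36
    }
}
```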

Do a lot of people use DI?

I thought that was all a bit of a fad like factories. Lots of extra code for little benefit. Especially in a statically typed language like C# where it gets rid of so many of the benefits of using it as a language.

How does it remove anything beneficial?

I look at DI, or really the dependency inversion pattern, as a guard that encourages better, more maintainable code. Can you write good code without it? Sure, and you can write bad code with it, but it's much more likely that code written with it is easier to trace and maintain. Not to mention much easier to test.

The configuration-based hot-swapping of dependencies... that I've never seen much need for, even in larger projects.


Since this is a C# thread, there are a number of ways to navigate through the interface to the implementation in Visual Studio:

1. Right-click, Go To Implementation (Ctrl-F12)

2. Right-click, Find All References (shows the interface and impl)

3. (ReSharper) Navigate, Derived Symbols (Alt-End)

Even the new method info in Visual Studio 2015 (and 2013 Enterprise) is able to count and show you the usages as a tip through the interface.

I don't experience any issues with this pattern; the tooling is built around it.

4. There's a plugin for ReSharper called Agent Mulder that examines your AutoFac configuration and knows when something is registered and can take you right to it.

DI usually means Dependency Injection, and I don't see how I could do unit testing/mocking without DI (especially constructor-based DI). IoC containers such as AutoFac are, from my perspective, more of a fad.

Why do you consider them a fad?

I've found it immensely useful to not go through the rigamarole of newing up a ton of objects manually in my web controllers and queue/batch processing frameworks. Just choose your composition roots properly and you can easily add new endpoints/jobs/event processors without all the worry of resolving all the dependencies manually.

It also allows you to reduce footprints (setting non-stateful code/configuration objects as singletons), transparently cache and instrument (dynamic proxies, or application-specific caching/logging wrappers), and handle other similar application-level concerns without them infecting consuming code or otherwise independent modules.
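(An illustrative sketch, not from the thread, of the "transparently cache without infecting consumers" point: a caching decorator the container can register in place of the real implementation. The `IPriceLookup` name and pricing logic are invented.)

```csharp
using System;
using System.Collections.Generic;

public interface IPriceLookup
{
    decimal GetPrice(string sku);
}

// The "real" implementation; pretend this call is expensive.
public class CatalogPriceLookup : IPriceLookup
{
    public int Calls { get; private set; }
    public decimal GetPrice(string sku) { Calls++; return sku.Length * 10m; }
}

// The decorator adds caching. Consumers still depend only on
// IPriceLookup, so nothing downstream changes when it's wrapped.
public class CachingPriceLookup : IPriceLookup
{
    private readonly IPriceLookup _inner;
    private readonly Dictionary<string, decimal> _cache = new Dictionary<string, decimal>();
    public CachingPriceLookup(IPriceLookup inner) { _inner = inner; }
    public decimal GetPrice(string sku)
    {
        if (!_cache.TryGetValue(sku, out var price))
        {
            price = _inner.GetPrice(sku);
            _cache[sku] = price;
        }
        return price;
    }
}

public static class Program
{
    public static void Main()
    {
        var inner = new CatalogPriceLookup();
        IPriceLookup lookup = new CachingPriceLookup(inner);
        lookup.GetPrice("ABC");
        lookup.GetPrice("ABC");
        Console.WriteLine(inner.Calls); // prints 1: the second call hit the cache
    }
}
```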

By fad, I meant that the frameworks change in the world of IoC, while DI is pretty much a concept easily applicable in OO languages.

My team and I use AutoFac right now, but who knows what it will be in 1-2 years...

In the .NET/Enterprise world, absolutely.

> Do a lot of people use DI?

I use it mostly for unit testing.

Yep. You can't see the Emperor's clothes, either?

Alive looks super cool, but the $99 price for individual devs feels pretty steep to me. It's basically just a different way to interact with the debugger during testing, right? So I'm not doing so much pause, edit, continue activity? Or am I missing something?

If it saves you even an hour of time over the course of months or years then why wouldn't it be worth it?

With that being said, if it's not yet "stable" then I agree, a hundred bucks is quite pricey for an unfinished dev tool.

You can't spread the cost out over "years" because the individual license is only good for a year. I won't complain about the price, because that is a value judgement everyone will have to make on their own. However, it is always disappointing to see tools like this that an individual can't simply buy. Instead you have to enter into a yearly licensing deal in which you have no idea what the software will cost in 12 months, or even whether the company behind it will still be in business. These types of yearly licensing deals are the norm in enterprise environments, but they are harder to justify as an individual developer.

Agreed. JetBrains is moving to this model and it is disgusting.

They backpedaled mostly and offer perpetual licenses.

I just wish they'd simply added subscription options in the first place vs going through all the hullabaloo. Perhaps it was a smokescreen, but that just feels very conspiratorial.

Mostly, but not completely. After your year you have to downgrade to the version as it was when you originally bought it, losing bug fixes and new features you've gotten used to over your year of use.


That's what my 'mostly' meant. You can simply pay a 12 month fee and buy it 'in advance' and just use what you get, vs thinking about it as "I'm losing bug fixes". I'm still using PHPStorm 7.1 from a couple years ago, and it still works. Same concept would apply going forward, but... I still think they handled this wrong.

Would I spend $100 to save myself an hour over a couple of months? Probably not, no. Not as an individual dev who could spend 100 bucks and get much more value for it. I could buy IntelliJ for the same price, for instance.

Just a side note: a lot of Java developers mention IntelliJ.

IntelliJ is good, but I prefer the free NetBeans even when I have access to the full IntelliJ paid for by the company.

Last time I checked it out it seemed pretty easy to crack, if anyone is that desperate. Didn't do it though.

That looks very cool. Is it going to crush my processor? I'll give it a shot after I'm done with my work.

No, all the processing is asynchronous.

Depending on the size of your project and the complexity of the code, we can update anywhere from under 100 ms to a few seconds after your keystroke.

> Is it going to crush my processor?

> No, all the processing is asynchronous.

That's not an answer to the question. If every keystroke kicks off a compile-execute-report cycle, then the processor (and drive) are definitely going to be taking on a lot of additional load. It doesn't matter that it's asynchronous; all that does is prevent latency between keystrokes so long as your machine can keep up with the additional load.

I use ReSharper, with pretty much default options because it's too much of a pain to figure out the magic combination of hundreds of options that will improve performance without disabling the features I like to use. ReSharper does its work on a background thread, but that doesn't stop it from making VS crawl when I open up a large solution after doing a scripted (external) rebuild. Async != Free Work.

Yup, I was gonna say this. We use ReSharper, but even on our modern dev machines Visual Studio can really crawl after building.

You're right, I didn't give a straight answer. Alive will use the CPU and the disk, but your computer won't be crawling nor crushed. And when you're done, just click the button to stop Alive, and it will stop - for real.

I like it!

There's a ton of stuff you can do in this space: lint-type tips, RT TDD, code "explaining", path identification, etc.

Keep up the good work! Would love to see more of this as it comes along.

This should be combined with Code Digger, a tool for C# that can detect all the possible ranges of input to a function by working backward from all the possible outcomes. If a function might throw an exception somewhere, it will show you the input that leads to that code path.

For just running tests, I found http://www.ncrunch.net/ to be very handy. It puts a green/red/black dot before each line of code to indicate how it fares in your unit tests.

Pretty cool. Would it play nice if my code uses Dapper or Linq2SQL?

It should! Fair warning, if you're not using mocks in your tests, it will be making network calls on every valid compilation.

If anything isn't working you can report it on our public issue tracker: http://github.com/CodeConnect/AliveFeedback/

How is this different from http://www.ncrunch.net/, which runs tests as you type so you know whether tests are passing before you even save the file?

Edit: Well that looks creepy :) I don't know ragebol and apart from being a happy user I don't have a relationship with the nCrunch dude!

I'm not the developer of either product, but it seems like Alive is Roslyn-based, which offers direct access to every step of the compiler. Right now there doesn't look to be much difference (in fact I bet NCrunch is more stable and has better integration with NUnit and the rest of the ecosystem), but I bet (purely speculating) that Alive intends to expand its feature set into something more extensive.

Performance-wise, assuming the Alive developers use Roslyn efficiently (which is admittedly difficult), they could offer a lot more semantic/syntax analysis at a much faster rate (think about solutions with tens of projects and hundreds or thousands of classes, with a dynamic dependency graph available: unit tests, invariants, and even property-based, QuickCheck-esque testing could easily run at type time [as in, right after the keydown event] or a few hundred ms after).

These are all capabilities inherent in being Roslyn-based, though; nothing too special about Alive, just a benefit it has over NCrunch should they choose to go down that road.
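(A minimal sketch, not from the thread, of the kind of direct compiler access being described: parse a snippet with Roslyn, build a compilation, and pull out its diagnostics. This assumes the Microsoft.CodeAnalysis.CSharp NuGet package is referenced; the snippet and names are invented for illustration.)

```csharp
using System;
using System.Linq;
using Microsoft.CodeAnalysis;
using Microsoft.CodeAnalysis.CSharp;

public static class Program
{
    public static void Main()
    {
        // Parse a snippet into a syntax tree, as a Roslyn-based
        // tool would on each edit.
        var tree = CSharpSyntaxTree.ParseText(
            "class C { int M() { return \"oops\"; } }");

        // Build a compilation to get semantic information and diagnostics.
        var compilation = CSharpCompilation.Create(
            "Probe",
            new[] { tree },
            new[] { MetadataReference.CreateFromFile(typeof(object).Assembly.Location) },
            new CSharpCompilationOptions(OutputKind.DynamicallyLinkedLibrary));

        foreach (var d in compilation.GetDiagnostics()
                                     .Where(d => d.Severity == DiagnosticSeverity.Error))
            Console.WriteLine(d.Id); // e.g. CS0029: can't convert string to int
    }
}
```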

This is great. I wish more people were into live programming; it is the future.

Woah! Make it work with Unity3D and in general with graphic libraries.

Are there any libraries in particular that you're interested in?
