
Intro To Pattern Matching – Covers C# 9 - alugili
https://www.c-sharpcorner.com/article/intro-to-pattern-matching-covers-c-sharp-9/
======
moron4hire
I don't agree that growing a language necessarily makes it harder to learn.
Quite the opposite, I think it makes it easier to learn.

Look at how asynchronous code was handled in C# in the past versus today. In
the past, you first had to define a delegate type for the signature of the
procedure you wanted to run. Then you'd assign something to that delegate,
probably a method sitting in a class somewhere. Then you'd call BeginInvoke
on that reference, passing in a callback function. That callback function has
to be another method sitting in a class somewhere. You also get this
IAsyncResult object returned, with no immediate guidance as to what it's
supposed to do. Is it different than the IAsyncResult that gets passed to the
callback? No? There's this "AsyncState" object you're supposed to pass, with
no clue what it's meant for. If you don't need it, you pass null, which is
pretty much every case. Need to return a value? Call EndInvoke with the
IAsyncResult object you received in the callback. But wait, where's my
delegate instance on which I need to call EndInvoke? Guess I need to hoist it
to be a field in my class now. I suppose you could pass it as that AsyncState
parameter, but now you're having to do in your head the type-checking job that
the compiler should be doing for you.

None of this is guessable from just reading the code. You have to read the
documentation in multiple places to figure out what is going on. For added
fun, now try composing multiple async operations in sequence.

Contrast that with today, where we just write "Task.Run(<some lambda
expression>)". You tell me which is easier to learn.

You see "fewer keywords, fewer features" and read "simplicity". I see those
same things and read "more boilerplate in the implementing code", which means
"complexity".
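For concreteness, here's a minimal sketch of the contrast (the names and numbers are made up; note that delegate BeginInvoke/EndInvoke only works on .NET Framework and throws PlatformNotSupportedException on .NET Core / .NET 5+):

```csharp
using System;
using System.Threading.Tasks;

public class AsyncContrast
{
    // The old Asynchronous Programming Model (APM), roughly as described above.
    public delegate int Work(int input);

    public static int DoWork(int input) => input * 2;

    public static void OldStyle()
    {
        Work work = DoWork;
        // Kick off the call; the IAsyncResult and "state" object come along.
        work.BeginInvoke(21, asyncResult =>
        {
            // We need the delegate instance back to call EndInvoke,
            // so we smuggle it through AsyncState and cast it back out.
            var del = (Work)asyncResult.AsyncState;
            int result = del.EndInvoke(asyncResult);
            Console.WriteLine(result);
        }, work);
    }

    // Today: one line, and the result is a typed Task<int>.
    public static async Task<int> NewStyle() =>
        await Task.Run(() => DoWork(21));
}
```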

~~~
thrower123
Yes, I am very happy that I never have to deal with IAsyncResult and
BeginOperation()/EndOperation() method pairs, with callbacks and WaitHandles
to make sure things happen in the correct sequence, ever again.

It was just horrible.

------
dmitriid
Too bad pattern matching implementations in C# (and possibly in JS [1] and
other languages [2]) don't go beyond matching in switches and perhaps somewhat
inside function bodies.

One of the greatest powers of pattern matching comes from being able to match
in function definitions. The simplest example:

    
    
      fac(0) -> 1;
      fac(N) when N > 0 -> N*fac(N-1).
    

This makes a lot of code much more readable and more concise.

[1] [https://github.com/tc39/proposal-pattern-matching](https://github.com/tc39/proposal-pattern-matching)

[2] [http://cr.openjdk.java.net/~briangoetz/amber/pattern-match.html](http://cr.openjdk.java.net/~briangoetz/amber/pattern-match.html)
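For comparison, a sketch of the closest C# 9 gets (Fac is my own name here): a switch expression with relational patterns, still inside the function body rather than in the function head:

```csharp
using System;

public static class Factorial
{
    // C# can't match in the function head like Erlang, but a C# 9
    // switch expression with relational patterns comes reasonably close:
    public static long Fac(int n) => n switch
    {
        0 => 1,
        > 0 => n * Fac(n - 1),
        _ => throw new ArgumentOutOfRangeException(nameof(n)),
    };
}
```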

~~~
platz
C#'s pattern matching does not include exhaustiveness checking, which is
needed to make pattern matching comparable in utility and power to inheritance
(i.e. so you can actually replace inheritance with pattern matching).

Much like how the compiler assists you when writing subclasses, so that you've
implemented all the required methods, real pattern matching (as in F#) gives
you the same guarantee from the compiler that you've handled all the possible
cases appropriately.

Exhaustiveness checking is what elevates pattern matching to the other side of
the expression problem, which is about getting feedback from the compiler to
help you write programs.

A glorified switch statement does not help you do that, so you still have to
write subclasses to get compiler feedback.
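A sketch of the gap being described, with a made-up hierarchy: since C# class hierarchies are open, the compiler can't prove the match below is exhaustive, so a missed case only surfaces at run time:

```csharp
using System;

// Hypothetical hierarchy for illustration (C# 9 records).
public abstract record Shape;
public record Circle(double Radius) : Shape;
public record Square(double Side) : Shape;

public static class Geometry
{
    public static double Area(Shape s) => s switch
    {
        Circle c => Math.PI * c.Radius * c.Radius,
        Square q => q.Side * q.Side,
        // Add a Triangle record tomorrow and this still compiles;
        // without the catch-all you get a warning at best, and a
        // SwitchExpressionException at run time. An F# sum type would
        // make the missing case a compile-time exhaustiveness warning.
        _ => throw new NotSupportedException(s.GetType().Name),
    };
}
```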

~~~
seemslegit
Nor should it. Assuming a fixed set of derived classes is bad OOP/SOLID
practice, as it breaks the open/closed principle ("objects should be open for
extension"), and prior to the introduction of RTTI in C++ it wasn't even a
design possibility: you either specified the concrete type or relied on
abstraction and polymorphism.

~~~
platz
That's not what 'fixed set' means here.

It's 'fixed' in the same sense as the compiler knowing what are the subclasses
of a base class.

Just as you can always 'extend' the OOP base class by adding a subclass, you
can always 'extend' an algebraic data type by adding another case.

Both can be extended, but more importantly they can be _enumerated_.

~~~
seemslegit
That's correct, and exhaustively enumerating derived classes at compile-time
is usually a bad design either via RTTI type-switching or pattern matching
(which is really just syntactic sugar for the former). It prevents for example
having derived types injected at run-time in a plugin setting. Explicitly
matching a subset of well-known derived types otoh is a practical compromise
in many cases.

~~~
platz
> enumerating derived classes at compile-time is usually a bad design

Depends on what kinds of guarantees you want. At the extreme, you can make
everything in the system late-bound, and then you simply have a dynamic
language instead of a static one.

~~~
seemslegit
> enumerating derived classes at compile-time is usually a bad design

Did you deliberately omit the word 'exhaustively' here?

late-bound != dynamically typed; that's exactly the point. You wish to
preserve the ability to have types loaded at run time, without knowing their
exact type at compile time, while still enjoying strong type guarantees. Even
setting run-time loading scenarios aside entirely, we can still see why this
is bad: if one adds a derived class to the code base and then has to address
it in a bunch of places, rather than having all the new functionality
localized to the class, that's a smell.

------
tasogare
I would love it if the not operator were usable in places other than pattern
matching, instead of !. It's something I've wanted for a long time.
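A small sketch of where `not` currently works versus where `!` is still required (C# 9; the method name is made up):

```csharp
using System;

public static class NotDemo
{
    public static bool HasText(string s)
    {
        bool isNull = s is null;

        // C# 9: 'not' (and 'and') work inside patterns...
        if (s is not null and not "")
            return true;

        // ...but plain boolean negation still needs '!':
        // "return not isNull && ..." does not compile.
        return !isNull && s.Length > 0;
    }
}
```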

~~~
LandR
Do you mean the word 'not'?

------
moron4hire
I'm just going to leave this here, for all the "X > Y" talk, inserting either
OO and FP into X and Y in whatever order you want:
[https://en.wikipedia.org/wiki/Expression_problem](https://en.wikipedia.org/wiki/Expression_problem)

~~~
Twisol
Several languages on either side of the OOP vs. FP divide include both
solutions to the problem -- for example, open subtyping vs. sealed case
classes in Scala, and traits vs. sum types in Rust. There's more to both OOP
and FP than the expression problem.

~~~
moron4hire
That's my point. There are people who claim "my-current-shiny is the only
way". Yes, OOP and FP languages have different solutions to the problem,
because it's a real problem. You need to have _both_ OO and Functional
features in your language. "OOP vs FP" is a false dichotomy.

~~~
pjmlp
Indeed; in fact most successful mainstream languages include both paradigms,
they just differ on the default approach (FP first vs OOP first).

------
fiveminds
What about active patterns? Will they come to C# or not?

------
brainzap
oh my god, people still do class Car examples. As if it were so hard to use a
real-world example with all this open source code...

~~~
moksly
They do. I have a side job as an external examiner for CS students and the
first years are still taught the same Car/Animal examples when learning about
polymorphism that I was decades ago.

I’ve never once used that knowledge professionally with any sort of success.
The complexity of polymorphism always ended up biting us (at every company/org
I’ve worked at) in the ass when things eventually grew far beyond what was
originally envisioned back when the polymorphism made sense.

Kinda hilarious that they are still teaching it with those useless examples
though. You’d have thought they could come up with something better by now. I
mean, I get why you'd choose something obvious, but in my experience CS
students seem to understand polymorphism perfectly well without making a duck
quack and a dog go woof.

But what do I know, I can’t even make it work.

~~~
mjburgess
I doubt there's any program you've ever written that makes no use of
polymorphism (the contextual resolution of a term among multiple semantics).

Even 1 + 1 is polymorphic; i.e., "1" + "1" selects a different behaviour.

Method overriding may rely on inheritance, and the problem is usually the
inheritance not the polymorphism.
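A trivial illustration of the 1 + 1 point (nothing here is C#-specific):

```csharp
using System;

public static class PlusDemo
{
    // The same '+' token resolves to different operations depending on
    // the operand types: integer addition vs string concatenation.
    public static int AddInts() => 1 + 1;           // integer addition
    public static string AddStrings() => "1" + "1"; // string concatenation
}
```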

~~~
nightski
They did not specifically state it, but I believe they were referring to
subtype polymorphism. Polymorphism of the kind you describe is far more useful
imho.

~~~
mjburgess
There are many kinds of polymorphism -- the comment is phrased generally and
may lead readers to believe that there is some issue with polymorphism as
such.

I still object on the subtype grounds though -- because the issue is not the
polymorphism (which is useful and usually a good idea). It's the inheritance
relation, which is not the subtype relation -- e.g., you can do subtype
polymorphism just by implementing interfaces, and there shouldn't be any
problems.

This comment is actually expressing issues which have arisen because of
inheritance and confusing it with polymorphism.
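A minimal sketch of that distinction (the names are made up): subtype polymorphism through an interface, with no base-class inheritance anywhere:

```csharp
using System;

// Subtype polymorphism without an inheritance hierarchy:
// each class only implements an interface.
public interface ISpeaker { string Speak(); }

public class Duck : ISpeaker { public string Speak() => "quack"; }
public class Dog  : ISpeaker { public string Speak() => "woof"; }

public static class Zoo
{
    // Dispatches on the runtime type of the argument.
    public static string Hear(ISpeaker s) => s.Speak();
}
```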

------
tartoran
One of the advantages of F# is the simplicity and ‘graspability’ it yields,
but I have nothing against C# adopting more functional features. I just think
this will confuse beginners quite a bit, so the ladder's rungs are getting
sparser, in a way. Maybe it's a good thing, maybe it's not; only time will
tell.

~~~
mysonytv
Exactly, this is the problem. I think a small language with limited features
is better than one language which can do everything and needs a lot of
expertise. Simple is beautiful. If developers need FP, then why not F# rather
than integrating it into C#?

------
melling
We’re all slowly moving to functional programming.

One day will we wake up, decide we basically know Haskell, OCaml, etc., and
take off the training wheels?

~~~
mumblemumble
I don't think so. Pattern matching and the like are just window dressing. The
two true distinguishing features of Haskell are that it's lazy, and the
compiler enforces referential transparency. Neither of those is something you
can get by simply adding features to an existing language.

And if you aren't working within those strictures, you're simply not in the
same universe. No matter how many variables you make immutable, you'll
still be able to log events by just executing a log statement, and you'll
still be able to create a function that talks to a remote service, and call it
from anywhere. So, you might use a few functional-inspired features when
they're convenient, but the actual structure of your program is probably still
imperative to the core.

~~~
sweeneyrod
By this standard OCaml and Lisp aren't functional either.

~~~
mumblemumble
I wasn't intending to imply that they aren't. The standard being proposed when
I originally wrote my comment was just "Haskell"; I think the "ocaml, etc" was
a later edit.

