
Why is .NET reflection slow? - matthewwarren
http://www.mattwarren.org/2016/12/14/Why-is-Reflection-slow/
======
thepumpkin1979
In .NET, you're supposed to use reflection for discovery and then generate the
code, at runtime, in an assembly which is loaded and JIT'ed by the runtime
just like any other library.
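One lightweight way to sketch that pattern is with compiled expression trees: use reflection once for discovery, then compile an ordinary delegate so later calls run at JIT'ed speed. The `Person` class and property name here are purely illustrative:

```csharp
using System;
using System.Linq.Expressions;
using System.Reflection;

public class Person { public string Name { get; set; } }

public static class FastAccess
{
    // Reflection-based read: slow when called repeatedly.
    public static object GetSlow(object target, string property) =>
        target.GetType().GetProperty(property).GetValue(target);

    // Discovery once, then compile an expression tree into a delegate.
    public static Func<object, object> CompileGetter(Type type, string property)
    {
        PropertyInfo prop = type.GetProperty(property);
        ParameterExpression obj = Expression.Parameter(typeof(object), "obj");
        Expression body = Expression.Convert(
            Expression.Property(Expression.Convert(obj, type), prop),
            typeof(object));
        return Expression.Lambda<Func<object, object>>(body, obj).Compile();
    }
}
```

The first call pays the reflection and compilation cost; every call through the returned delegate afterwards runs as ordinary compiled code.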

~~~
velox_io
Reflection is fine; just never use it inside a tight loop / hot code path.

I've written a database engine in C#/.NET, and I quickly learnt what you can
and can't use. Using the dynamic/ExpandoObject parts has insane memory costs.

You have to be very careful which language features you use. If you want
performance then static methods are your friend, as are generics (and
sometimes sealed classes) and arrays & vectors (you start to realise how
important the CPU cache is). Memory streams & pointers are sometimes needed,
but the gains are often less than you'd think.

Visual Studio has some really good debugging tools, especially viewing the
disassembly (hit a breakpoint, then Ctrl+Alt+D) to see what is really
happening.

I love LINQ, but it can be awfully slow.

Lastly, when you run tests, make sure you not only do multiple runs but also
pause in between. I've seen the JIT yield >20% improvements in
micro-benchmarks.
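A minimal harness sketch of that advice, with warm-up passes before the timed runs and a pause between runs (in real work a harness like BenchmarkDotNet handles all of this; the defaults here are illustrative):

```csharp
using System;
using System.Diagnostics;
using System.Threading;

public static class Bench
{
    // Warm the code up so the JIT has compiled it, then take the best of
    // several timed runs, pausing between them as suggested above.
    public static double MeasureMs(Action action, int warmups = 3, int runs = 5, int pauseMs = 100)
    {
        for (int i = 0; i < warmups; i++) action();   // JIT warm-up passes
        double best = double.MaxValue;
        for (int i = 0; i < runs; i++)
        {
            Thread.Sleep(pauseMs);                    // pause between runs
            var sw = Stopwatch.StartNew();
            action();
            sw.Stop();
            best = Math.Min(best, sw.Elapsed.TotalMilliseconds);
        }
        return best;
    }
}
```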

I will blog about this stuff one day!

~~~
eropple
_> I will blog about this stuff one day!_

I'd love to read this blog post.

One thing to keep in mind, for readers: while LINQ _can_ be awfully slow, it
often _isn't_, and it should be your first stop (improve on it when you find
hot paths). I build game engines in C#, and I happily use LINQ to get
something done; then I go back, profile my code (anything you're not getting
out of your profiler is likely mythical and at best lying to you), and replace
as needed.
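As a hypothetical example of that workflow, the LINQ version below is the readable first cut, and the hand-rolled loop is the kind of replacement profiling might later justify in a hot path (both names are made up):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class HotPath
{
    // Readable first cut: LINQ pipeline.
    public static int SumOfSquaresLinq(IEnumerable<int> xs) =>
        xs.Where(x => x > 0).Select(x => x * x).Sum();

    // Hand-rolled replacement: no iterator or delegate allocations.
    public static int SumOfSquaresLoop(int[] xs)
    {
        int sum = 0;
        for (int i = 0; i < xs.Length; i++)
            if (xs[i] > 0) sum += xs[i] * xs[i];
        return sum;
    }
}
```

The point is that both compute the same thing; only profiling tells you whether the second version is worth its loss of clarity.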

~~~
WorldMaker
Optimizing LINQ is a lot like optimizing Haskell (or, to a lesser extent, F#):
it isn't so much a minefield as it is dealing with the abrupt
paradigm-shift/learning-curve wall of a mini embedded functional programming
language.

I don't think a lot of people understand exactly where some of the eagerness
versus laziness boundaries are (which are essentially monad boundaries, from a
functional perspective). For instance, I've suggested for some time [1] that
one of the strongest things a UX (whether an IDE or an extension to an IDE)
could do to help is to better mark those boundaries between
IQueryable/IEnumerable/IList, as I don't think a lot of people see them. (I've
long considered ToList somewhat harmful, especially; it is so painfully rarely
the right tool for any task, and yet it is so easy for people new to LINQ to
use as a crutch.)

Beyond that, there are of course the general rules of optimizing any code: are
you using the right data structure for the job, and have you balanced caching
the expensive operations (queries) against what isn't expensive and doesn't
need to be cached?

Certainly a lot of the optimizations I've seen needed over the years are
simply: this lookup would be faster in a Dictionary or a HashSet or an
ILookup, instead of an O(n) search through an IEnumerable or IList. (It
should be especially obvious, any time you attempt a `join` of any sort with
an IEnumerable or IList on either side of the `join`, that you need a
Dictionary, HashSet, or ILookup on one or both sides.)
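A contrived sketch of that kind of fix, replacing an O(n*m) nested scan with a HashSet probe (the customer/order names are illustrative):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class JoinDemo
{
    // O(n*m): scans all order rows for every customer.
    public static int SlowCount(List<int> customerIds, List<int> orderCustomerIds) =>
        customerIds.Count(c => orderCustomerIds.Any(o => o == c));

    // O(n+m): build a HashSet once, then probe it per customer.
    public static int FastCount(List<int> customerIds, List<int> orderCustomerIds)
    {
        var withOrders = new HashSet<int>(orderCustomerIds);
        return customerIds.Count(withOrders.Contains);
    }
}
```

LINQ's own `join` operator builds a lookup internally for exactly this reason; the trap is writing the nested `Any` scan by hand.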

That sort of data structure optimization is largely the same consideration
when you are writing do/while/for/foreach loops directly as it is in the LINQ
syntax, and it can take a mental leap for developers to realize that these
are mostly the same problems, even if they sometimes "feel" different.

[1]
[https://1drv.ms/p/s!AkQwwfqxgHzNxOsAj1KSVqxGaiydnQ](https://1drv.ms/p/s!AkQwwfqxgHzNxOsAj1KSVqxGaiydnQ)

------
wmccullough
I think it's nice to see this type of analysis. We all know that reflection is
slow for most languages, but understanding the why gives me a great sense of
how interesting the task of reflection is.

~~~
matthewwarren
This was exactly my reason for writing it :-)

------
themihai
I think reflection is "slow" in most languages unless you use some kind of
preprocessor. Is there any exception?

~~~
chrisseaton
> I think reflection is "slow" in most languages unless you use some kind of
> preprocessor. Is there any exception?

Yes! I wrote a PhD on making reflection in Ruby fast. It is possible to make
most reflection operations run with no peak time performance overhead at all
compared to normal operations.

[http://chrisseaton.com/phd/](http://chrisseaton.com/phd/)

The main techniques needed are dynamic code compilation with speculation and
deoptimisation, polymorphic inline caching, splitting, and method inlining.
Unfortunately I think the .NET JIT isn't dynamic (I might be mistaken; I'm not
an expert on .NET), so it can't do speculative optimisations.

Basically you see what method names have been used last time in a reflective
call, and create a little call that's more like a conventional call using
those names. Each time you do the reflective call you check the name against
the list of names you have created conventional calls for, and use them if you
can. You need to make string comparison fast, which can be done using a rope
data structure. Then you need to be able to remove the check entirely if the
string comes from somewhere you can control, like a constant. Then you need to
inline the reflection method so you are just left with the synthetic
conventional call.
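In C# terms, the first step of that scheme looks roughly like a one-entry inline cache. This toy version keys only on the method name (a real cache would also check the receiver's type, and a real VM would compile the fast path down to a direct call):

```csharp
using System;
using System.Reflection;

public class InlineCache
{
    private string _cachedName;
    private Func<object, object> _cachedCall;

    public object Invoke(object target, string methodName)
    {
        if (methodName == _cachedName)      // cache hit: skip reflection lookup
            return _cachedCall(target);

        // Slow path: do the reflective lookup, then remember it for next time.
        MethodInfo m = target.GetType().GetMethod(methodName);
        _cachedName = methodName;
        _cachedCall = o => m.Invoke(o, null);
        return _cachedCall(target);
    }
}
```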

So it isn't easy! Which is why it isn't usually done. But it's possible.

In Ruby you need to make it fast no matter how hard it is, as Ruby libraries
tend to use reflection (they call it metaprogramming) in inner-loop
operations.

~~~
AlphaSite
The JVM does a lot of this, right? I know there is a ton of speculative
compilation in C2 and Graal, but is it done for reflection?

~~~
matthewwarren
The JVM uses something called 'inflation' to make reflection faster.

See
[https://blogs.oracle.com/buck/entry/inflation_system_propert...](https://blogs.oracle.com/buck/entry/inflation_system_properties)
and [http://stackoverflow.com/questions/10082523/java-what-is-
jit...](http://stackoverflow.com/questions/10082523/java-what-is-jitcs-
reflection-inflation) for more info

------
taspeotis
Hi Matt, thanks for your in-depth .NET articles. They do a good job casting
light on CLR internals.

~~~
matthewwarren
Thanks, glad you like them and find them useful!

------
halestock
Does anyone have any comparable analyses for Java? I would assume the
performance penalties are similar. There are many popular libraries which rely
on annotations (JAX-RS, Jackson, Guice/Spring DI), and these presumably use
reflection heavily, so I've often wondered what kind of impact this has vs.
not using annotations.

~~~
winteriscoming
There are libraries out there, like Jandex[1], which use indexes to avoid
repeated reflection for annotations.

[1] [https://github.com/wildfly/jandex](https://github.com/wildfly/jandex)

------
__s
It'd be nice if reflection could use generics. I often end up passing
typeof(T) around into reflection calls.

~~~
n00b101
I'm not sure what you are asking for, but you can find the generic type
arguments of any object at runtime:

Code snippet:

    
    
      public class Foo<T> {}
      Foo<int> foo = new Foo<int>();
      Console.WriteLine(foo.GetType().GetGenericArguments()[0]);

Output:

      System.Int32

Complete code:
[http://rextester.com/RAQO69918](http://rextester.com/RAQO69918)

~~~
eropple
I think he may mean that a lot of reflection methods take Type rather than a
generic parameter? A lot of _those_ have generic overloads now though.

~~~
naasking
What would really make reflection fast is the dual of the typeof(T) -> Type
operation, that is, Type -> T.

Then you can do all kinds of neat but efficient things by exploiting the CLR's
runtime generics [1]. This is something that could easily be done in the CLR
itself: a sort of hidden universal method that could perform a single, maybe
double, dispatch into the correct generic overload. But alas, we are left to
simulate this using poor imitations of inline caches.

[1]
[https://github.com/naasking/Dynamics.NET](https://github.com/naasking/Dynamics.NET)
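Lacking that hidden universal method, one common simulation is to close a generic method over a runtime Type with MakeGenericMethod and cache the resulting delegate, so the reflection cost is paid once per type. A hedged sketch (all names here are made up, not from Dynamics.NET):

```csharp
using System;
using System.Collections.Concurrent;
using System.Reflection;

public static class GenericDispatch
{
    private static readonly ConcurrentDictionary<Type, Func<object, string>> Cache =
        new ConcurrentDictionary<Type, Func<object, string>>();

    // The generic "overload" we want to reach from a runtime Type.
    public static string Describe<T>(object value) => $"{typeof(T).Name}: {value}";

    // Type -> T dispatch: close Describe<T> over the runtime type once,
    // cache the delegate, and reuse it on every later call.
    public static string DescribeDynamic(Type t, object value) =>
        Cache.GetOrAdd(t, type =>
        {
            MethodInfo open = typeof(GenericDispatch).GetMethod(nameof(Describe));
            MethodInfo closed = open.MakeGenericMethod(type);
            return (Func<object, string>)closed.CreateDelegate(typeof(Func<object, string>));
        })(value);
}
```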

~~~
matthewwarren
Dynamics.NET looks like a cool library; I'd not heard of it before. Thanks!

------
DonHopkins
Here's an article about optimizing reflection on Unity AOT platforms like iOS:

Mike Talbots Blog: Faster Invoke for reflected property access and method
invocation with AOT compilation

Article: [https://web-
beta.archive.org/web/20120626190355/http://whydo...](https://web-
beta.archive.org/web/20120626190355/http://whydoidoit.com/2012/04/18/faster-
invoke-for-reflected-property-access-and-method-invocation-with-aot-
compilation)

Source: [https://web-
beta.archive.org/web/20161005070240/http://www.w...](https://web-
beta.archive.org/web/20161005070240/http://www.whydoidoit.net/Downloads/DelegateSupport.cs)

Newer version: [https://gitgud.io/TheSniperFan/unityserializer-
ng/blob/maste...](https://gitgud.io/TheSniperFan/unityserializer-
ng/blob/master/Assets/Plugins/OpenUnityTools/unityserializer-
ng/Radical/System/Delegates/DelegateSupport.cs)

"The bane of the iOS programmer's life, when working with reflection in Mono,
is that you can't go around making up new generic types to ensure that your
reflected properties and methods get called at decent speed. This is because
Mono on iOS is fully Ahead Of Time compiled and simply can't make up new stuff
as you go along. That, coupled with the dire performance of Invoke when using
reflected properties, led me to construct a helper class."

"This works by registering a series of method signatures with the compiler, so
that they are available to code running on the device. In my tests property
access was 4.5x faster and method access with one parameter was 2.4x faster.
Not earth-shattering, but every little helps. If you knew what you wanted
ahead of time, then you could probably do a lot better. See here for info:"

"You have to register signatures inside each class, I'm afraid. Nothing I can
do about that."

[http://stackoverflow.com/questions/1116073/can-delegate-
dyna...](http://stackoverflow.com/questions/1116073/can-delegate-
dynamicinvoke-be-avoided-in-this-generic-code)

If you need to extend this to wrapping member invocations from classes without
using Reflection.Emit you can do so by creating a series of compiler hints
that can map a class and a function parameter list or return type.

Basically you need to create lambdas that take objects as parameters and
return an object. Then use a generic function that the compiler sees AOT to
create a cache of suitable methods to call the member and cast the parameters.
The trick is to create open delegates and pass them through a second lambda to
get to the underlying hint at runtime.

You do have to provide the hints for every class and signature (but not every
method or property).
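A speculative sketch of that hint mechanism (not the actual DelegateSupport code; the names and signatures are my assumptions): mentioning the generic registration method with concrete type arguments is what forces the AOT compiler to emit the delegate types, and invocation then goes through a cached open delegate instead of MethodInfo.Invoke:

```csharp
using System;
using System.Collections.Generic;
using System.Reflection;

public static class AotHints
{
    private static readonly Dictionary<MethodInfo, Func<object, object>> Cache =
        new Dictionary<MethodInfo, Func<object, object>>();

    // Calling this with concrete T/TResult is the "compiler hint": the AOT
    // compiler now generates code for Func<T, TResult> and the wrapper below.
    public static void RegisterGetter<T, TResult>(MethodInfo getter)
    {
        // Open instance delegate: the first parameter is the receiver.
        var open = (Func<T, TResult>)getter.CreateDelegate(typeof(Func<T, TResult>));
        Cache[getter] = o => open((T)o);   // object -> object boxing wrapper
    }

    public static object FastGet(object target, MethodInfo getter) =>
        Cache[getter](target);
}
```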

I've worked up a class here that does this, it's a bit too long to list out in
this post.

In performance testing it's nowhere near as good as the example above, but it
is generic, which means it works in the circumstances I needed. Performance is
around 4.5x on reading a property compared to Invoke.

------
DonHopkins
I've been researching JSON serialization/deserialization libraries for Unity,
and I'm writing a spreadsheet comparing the different alternatives.

Unity JSON Libraries: [https://docs.google.com/spreadsheets/d/1NoIYASJR9uUT-
kZxjb54...](https://docs.google.com/spreadsheets/d/1NoIYASJR9uUT-
kZxjb54zWX2Lr3EXp8F7t7N62yo7J8/edit?usp=sharing)

Unity is limited by the fact that it uses an older version of .NET (2.0, or
even preferably the .NET 2.0 subset). It can also be optimized by disabling
exceptions. Most importantly, on AOT-compiled platforms like iOS and the
consoles it doesn't support dynamic JIT compilation or interfaces like
Reflection.Emit, which some of the high-performance JSON serialization
libraries use for optimization.

So the question of "Why is reflection so slow?" is very interesting to me as a
Unity iOS developer, since I'm forced to use it without all the fancy
optimizations (or figure out how to avoid using it).

Some of the big, fancy, luxurious, all-singing-all-dancing JSON libraries
intended for less restricted CLR platforms like Windows do all kinds of
amazing tricks to reduce and optimize their use of reflection. Some of those
have been stripped down and ported to Unity, but others haven't.

JSON.Net is one of the most advanced, feature-rich C# JSON libraries, and it
is free, but it is not the fastest. The free version doesn't run on Unity out
of the box, but there is a stripped-down, simplified version that runs on
Unity, available on the Unity Asset Store for $25.

JSON.NET home page:
[http://www.newtonsoft.com/json](http://www.newtonsoft.com/json)

JSON.Net source code:
[https://github.com/JamesNK/Newtonsoft.Json](https://github.com/JamesNK/Newtonsoft.Json)

Unity JSON.Net port:
[https://www.assetstore.unity3d.com/#!/content/11347](https://www.assetstore.unity3d.com/#!/content/11347)

FastJSON is, true to its name, quite fast; it's free, has a lot of advanced
features, is deeply customizable, is highly optimized and benchmarked against
other libraries, and there's a great in-depth article comparing it to other
libraries and explaining its optimization techniques. It hasn't been ported to
Unity, but that might be worth doing, since it's such a nice piece of work. I
asked the author about it, and he said that its speed was the result of
dynamic code generation features that Unity doesn't support. A port would
sacrifice some of the optimizations and advanced features, but it still might
be worth doing.

FastJSON article and benchmarks:
[http://www.codeproject.com/Articles/159450/fastJSON](http://www.codeproject.com/Articles/159450/fastJSON)

FastJSON source:
[https://github.com/mgholam/fastJSON](https://github.com/mgholam/fastJSON)

Full Serializer is a well-written free JSON library with lots of useful
features that was designed specifically for Unity from the start, so it avoids
anything that would limit optimization: features beyond the .NET 2.0 subset,
advanced reflection, LINQ, code generation, and even exceptions. So far it
seems like the best-balanced compromise I've found between Unity's limitations
and a full set of useful features, and it looks to me like well-written but
not overly complex code.

FullSerializer source:
[https://github.com/jacobdufault/fullserializer](https://github.com/jacobdufault/fullserializer)

Thread asking about using FastJSON on Unity, where the FullSerializer author
chimes in:
[https://www.reddit.com/r/Unity3D/comments/2a4c7p/using_fastj...](https://www.reddit.com/r/Unity3D/comments/2a4c7p/using_fastjson_in_unity/)

JSONObject is another free, simple JSON library written for Unity, which I've
been using for years; while it works for basic stuff, I'm not very happy with
it (which is why I'm looking around for alternatives). It's not particularly
efficient (in speed or runtime size), has a terrible API, few advanced
features, and some embarrassing bugs. (It doesn't correctly handle string
escapes like \r \n \u1234 etc. -- c'mon, there's a very simple, explicit JSON
spec that defines every bit of the syntax: follow it!)

JSONObject source:
[https://github.com/mtschoen/JSONObject](https://github.com/mtschoen/JSONObject)

JSONObject on the Asset Store:
[https://www.assetstore.unity3d.com/en/#!/content/710](https://www.assetstore.unity3d.com/en/#!/content/710)

Unity now has its own JSONUtility module that at first glance seems like it
might be useful, but once you try to actually use it for anything in the real
or virtual world, you hit a wall, because it's a wolf in sheep's clothing.
Even worse than a wolf: it's layered directly on top of Unity's infamous
serialization system, and inherits all its bizarre quirks and limitations,
making it practically useless (and probably slow).

[https://docs.unity3d.com/ScriptReference/JsonUtility.html](https://docs.unity3d.com/ScriptReference/JsonUtility.html)

[https://docs.unity3d.com/Manual/JSONSerialization.html](https://docs.unity3d.com/Manual/JSONSerialization.html)

First, in order to begin to understand how truly fucked up Unity's JSONUtility
library truly is, you have to understand how truly fucked up Unity's
serialization system is, and this great classic blog posting by Lucas Meijer
bravely and honestly touches the surface, then falls off into the deep end:

Serialization in Unity: [https://blogs.unity3d.com/2014/06/24/serialization-
in-unity/](https://blogs.unity3d.com/2014/06/24/serialization-in-unity/)

While this reddit discussion goes to the heart of the matter:

Unity Serialization is truly FUBAR:
[https://www.reddit.com/r/Unity3D/comments/2e9vlg/unity_seria...](https://www.reddit.com/r/Unity3D/comments/2e9vlg/unity_serialization_is_truly_fu)

For many Unity developers, an important reason for using a JSON serialization
library is to avoid using Unity's serialization system. So the idea of
JSONUtility being layered on top of the system they were trying to get away
from doesn't appeal to them.

"This is why I am moving away from scriptable-object-style services and into
using static CLR objects that load their configuration from csv / json txt
files. I am losing editor integration, but I'm gaining peace of mind that
Unity won't blow up and corrupt my array of 100 items, levels, whatever."
-- nventimiglia:
[https://www.reddit.com/r/Unity3D/comments/2e9vlg/unity_seria...](https://www.reddit.com/r/Unity3D/comments/2e9vlg/unity_serialization_is_truly_fubar/cjy86ho/)

I'd prefer to build on top of Unity's built-in optimized JSON parser if that
were possible, layering a system like Full Serializer on top of it without
tangling with Unity serialization, but the current JSONUtility API makes that
impossible. Unity should publish the source for JSONUtility to the community
(and throw in serialization too, please, so we can at least understand how it
works when we're forced to interact with it), and accept pull requests!

~~~
DonHopkins
The main issue with Unity's JSONUtility is that it doesn't provide a generic
polymorphic runtime representation of JSON objects (like Full Serializer's
fsData), since it only maps between JSON and C# objects.

So in order to read or write a JSON object that has a key "foo" there MUST be
a C# class at compile time that has a serializable field named "foo".

A C# dictionary containing a key "foo" does not have a field named foo.

And there is no way to directly map between JSON and a variant type like
fsData, because of the limitations of Unity's serialization system. I tried
implementing a variant type like fsData (actually copying the source of fsData
and changing the class name), and then implementing the
ISerializationCallbackReceiver interface on it. But I hit a wall, since
On{Before,After}Deserialize has no access to the JSON dictionary: the only
keys of the dictionary that get copied to the C# objects are keys whose names
match statically compiled C# object fields. And On{Before,After}Serialize
can't directly create the JSON dictionary; it can only fill out public fields
whose compile-time names become the names of keys in the JSON object, so the
JSON object will always have the same keys as the C# object fields defined at
compile time.

As Lucas describes in his blog post, what you have to do in the
OnBeforeSerialize callback is prepare the object by copying its private
non-serializable fields into public fields that the serializer can deal with,
and those fields must be given names at compile time. So, for example, you
could make a proxy wrapper adaptor for serializing a dictionary, with a field
"keys" that is an array of strings and a field "values" that is an array of
values; in OnBeforeSerialize you copy all the keys of the dict into "keys"
and all the values into "values". (There's no way this will ever run fast,
but I'm just following the API to its logical conclusion to show how futile
it is to even try.)
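A minimal sketch of that keys/values proxy; in Unity the interface is UnityEngine.ISerializationCallbackReceiver, and a stand-in declaration is included here only so the sketch is self-contained:

```csharp
using System;
using System.Collections.Generic;

// Stand-in for UnityEngine.ISerializationCallbackReceiver.
public interface ISerializationCallbackReceiver
{
    void OnBeforeSerialize();
    void OnAfterDeserialize();
}

[Serializable]
public class SerializableDict : ISerializationCallbackReceiver
{
    // Parallel lists with compile-time names that Unity can serialize.
    public List<string> keys = new List<string>();
    public List<string> values = new List<string>();

    [NonSerialized]
    public Dictionary<string, string> Dict = new Dictionary<string, string>();

    // Copy the dictionary into the parallel lists before serialization.
    public void OnBeforeSerialize()
    {
        keys.Clear(); values.Clear();
        foreach (var kv in Dict) { keys.Add(kv.Key); values.Add(kv.Value); }
    }

    // Rebuild the dictionary from the lists after loading.
    public void OnAfterDeserialize()
    {
        Dict = new Dictionary<string, string>();
        for (int i = 0; i < keys.Count; i++) Dict[keys[i]] = values[i];
    }
}
```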

But then you run into the other problem of Unity serialization: no support for
polymorphic arrays or null values or even dictionaries, and especially not
polymorphic dictionaries.

So you could write out an array of "keys", since they are all the same type,
string. But each item of the "values" array would have to be the same type: a
polymorphic Dictionary<string, object> would have to write out a values array
like [new object(), new object(), new object()], which would not be very
useful; or all its values would have to be the same type, like
Dictionary<string, Vector2>, which would write the values out as [{x: 1, y:
2}, ...], but there would be no way to express a polymorphic array. It all
boils down to having to know all possible keys of the JSON dict at compile
time and defining C# properties with those names, because it is those
properties that the Unity serialization system loops over to get the keys of
the JSON object.

Unity serialization also does not support null values in arrays, so if you
wanted to write out a field of a particular type whose value was null, it
would actually write out a default value. And if the default value was a class
containing possibly-null references to its own class, like class Node { Node
leftChild; Node rightChild; }, it would recursively write out a whole tree of
dummy default objects until it bottomed out at recursion level 7, and you'd
get potentially hundreds or thousands of dummy objects. (Read Lucas's article
if you don't believe me -- Unity's serialization system is really
brain-damaged!)

So trying to piggyback on top of Unity's serialization system isn't such a
good idea if you want to handle reading and writing arbitrary JSON structures,
which I do.

Also Unity serialization does not even directly support writing out
dictionaries, let alone dictionaries with polymorphic values.

I wish I could see the source code, so it was easier to analyze and describe
its limitations. There's a lot more value in using an open-source library like
Full Serializer, where we have access to the complete source code and can
understand and fix its problems, than in flying blind with Unity's proprietary
serialization system and the proprietary JSON serializer that rides on top of
it.

Again, here's Lucas's blog post, which the "FUBAR" reddit posting refers to:
[https://blogs.unity3d.com/2014/06/24/serialization-in-
unity/](https://blogs.unity3d.com/2014/06/24/serialization-in-unity/) --
scroll down to the end where it gets really ugly, and read the footnote at the
bottom and the discussion in the comments, where he explains how his example
would have sent Unity into an endless loop 5 + 2 = 7 years ago, but now it
bottoms out at the arbitrary depth of 7 levels of recursion, resulting in only
hundreds of dummy objects. The cycle that causes trouble isn't just a cycle in
the graph of objects; it's a cycle where a class member refers to any object
of the same class (like a tree of nodes).

Lucas described how hard this problem is and how deeply it is embedded in
Unity's built-in presumptions:

"LUCAS MEIJER, JUNE 25, 2014 AT 2:44 PM:"

"@all: what rene said."

"It's not technically impossible to ever support null; it's a lot of
non-trivial work though. We'd have to somehow serialize 'inline objects' with
a bool for whether or not each one is null. It affects how you interact with
such objects through the SerializedProperty class, as well as the prefab
system (if the 'isnull' bit is marked, but a prefab sets a value anyway, what
do you do?). None of these are theoretically unsolvable. You would, however,
still run into the depth limit problem, because of the way we do
backwards-compatible loading. When we do backwards-compatible loading, we, at
runtime, generate a typetree for a certain object. This concept of a typetree
is actually a pretty core one in Unity, and it already should give a good
feeling for how many of our systems are built around assumptions that data
layout is static. We indeed have the concept of a collection, but other than
that, that's it. So when we generate a typetree, we actually create an object,
then we serialize it in a special typetree-creation mode. If you have class
cycles, the typetree would still grow very big. (We cap it to 7 levels too.)"

"so yeah, a ton of work. up until now we have prioritized other things, and I
don’t see that changing in the near future. (I actually spent a week or two
going down this rabbit hole for both null and polymorphism when I did the
serialization improvements for 4.5, thinking I’d be able to get something in,
but I ended up with the conclusion that it would take a lot more time than
that, and that my time was better spent providing things like serialization
callbacks, and other things in Unity that I feel could really use some
loving)."

----

Where the rubber hits the road, in a typical JSON-message-consuming Unity app
trying to use JSONUtility, you will find that it has to actually parse the
JSON twice: first into a generic Message { string name; } msg to get the
msg.name, then, switching on msg.name, parsing the JSON again into a more
specific WhateverMessage { string name; float foo; bool bar; } or whatever.

[https://docs.unity3d.com/Manual/JSONSerialization.html](https://docs.unity3d.com/Manual/JSONSerialization.html)

"Using FromJson() when the type is not known ahead of time:"

"Deserialize the JSON into a class or struct that contains ‘common’ fields,
and then use the values of those fields to work out what actual type you want.
Then deserialize a second time into that type."

But what if you wanted a message that contained a payload of an arbitrary JSON
object or polymorphic array? Or what if you wanted to send a JSON object whose
keys were content id strings -- you would have to define a C# class with a
property for every possible content id you would ever want to put into the
dictionary! And that would be impossible to know (and impractical to
implement) beforehand.

"Supported Types

The API supports any MonoBehaviour-subclass, ScriptableObject-subclass, or
plain class/struct with the [Serializable] attribute. The object you pass in
is fed to the standard Unity serializer for processing, so the same rules and
limitations apply as they do in the Inspector; only fields are serialized, and
types like Dictionary<> are not supported."

"Passing other types directly to the API, for example primitive types or
arrays, is not currently supported. For now you will need to wrap such types
in a class or struct of some sort."

What you need to represent arbitrary JSON data at runtime is a variant object
like fsData. Here's what fsData looks like -- it only has a private object
_value member, whose type it figures out at runtime. For JSON objects, its
value is a Dictionary<string, fsData>, and for JSON arrays, its value is a
List<fsData>.

[https://github.com/jacobdufault/fullserializer/blob/master/A...](https://github.com/jacobdufault/fullserializer/blob/master/Assets/FullSerializer/Source/fsData.cs)
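The shape of such a variant type can be sketched in a few lines (this is a simplified stand-in, not fsData itself):

```csharp
using System.Collections.Generic;

// One private object field; its runtime type distinguishes the JSON kinds.
public class Variant
{
    private readonly object _value;
    public Variant(object value) { _value = value; }

    public bool IsString => _value is string;
    public bool IsNumber => _value is double;
    public bool IsObject => _value is Dictionary<string, Variant>;
    public bool IsList   => _value is List<Variant>;

    public string AsString => (string)_value;
    public Dictionary<string, Variant> AsObject => (Dictionary<string, Variant>)_value;
    public List<Variant> AsList => (List<Variant>)_value;
}
```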

But there is no way for the Unity serializer to map between JSON objects and a
variant type with a polymorphic object field, or even a dictionary like
Dictionary<string, fsData>. fsData's "private object _value" field is
invisible to the serializer, so any fsData will get serialized as {} (not just
because _value is private, but because its type, object, doesn't qualify for
serialization):

Q: What does a field of my script need to be in order to be serialized?

A:

\+ Be public, or have the [SerializeField] attribute

\+ Not be static

\+ Not be const

\+ Not be readonly

\+ Be of a fieldtype that we can serialize

Q: Which fieldtypes can we serialize?

A:

\+ Custom non-abstract classes with the [Serializable] attribute

\+ Custom structs with the [Serializable] attribute (new in Unity 4.5)

\+ References to objects that derive from UnityEngine.Object

\+ Primitive data types (int, float, double, bool, string, etc.)

\+ Arrays of a fieldtype we can serialize

\+ List<T> of a fieldtype we can serialize

------
foota
It seems a bit silly to me that the author uses the number of method calls
(which is only like 4?) and lines of code as metrics for how slow the function
is.

~~~
recursive
He also uses time.

~~~
taspeotis
And other complementary metrics, like memory allocations.

