Performing Various Iteration Methods in .NET


Environment: .NET


I've been implementing numerical libraries in .NET and have come to some conclusions about iteration performance. My classes have to hold a large amount of data and be able to iterate through that data as quickly as possible. To compare various methods, I created a simple class called Data that encapsulates an array of doubles.

Method #1: Enumeration

Data implements IEnumerable. Its GetEnumerator method returns a DataEnumerator, an inner class.

   public IEnumerator GetEnumerator()
   {
      return new DataEnumerator( this );
   }

    internal class DataEnumerator : IEnumerator
    {
      private Data internal_ = null;
      private int index = -1;

      public DataEnumerator( Data data )
      {
        internal_ = data;
      }

      public object Current
      {
        get { return internal_.Array[index]; }
      }

      public bool MoveNext()
      {
        index++;
        return index < internal_.Array.Length;
      }

      public void Reset()
      {
        index = -1;
      }
    }
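
To see the enumerator in action, here is a self-contained sketch of the class and a foreach loop over it. The constructor and array size are assumptions for illustration; note that every element comes back as an object, so each access pays for a cast.

```csharp
using System;
using System.Collections;

// Minimal, self-contained version of the Data class described above.
// The constructor is an assumption added for this sketch.
public class Data : IEnumerable
{
    private double[] array_;

    public Data( int size ) { array_ = new double[size]; }

    public double[] Array { get { return array_; } }

    public IEnumerator GetEnumerator()
    {
        return new DataEnumerator( this );
    }

    internal class DataEnumerator : IEnumerator
    {
        private Data internal_;
        private int index = -1;

        public DataEnumerator( Data data ) { internal_ = data; }

        public object Current
        {
            get { return internal_.Array[index]; }
        }

        public bool MoveNext()
        {
            index++;
            return index < internal_.Array.Length;
        }

        public void Reset() { index = -1; }
    }
}

public class EnumerationDemo
{
    public static void Main()
    {
        Data data = new Data( 5 );
        double sum = 0.0;
        // foreach yields object references, so each double is cast back
        // on every access -- the overhead measured in Method #1.
        foreach ( object o in data )
            sum += (double)o;
        Console.WriteLine( sum );   // prints "0" for a freshly allocated array
    }
}
```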

Method #2: Indexing

I implemented an index operator on the class, which simply calls the index operator on the array.

    public double this[int position]
    {
        get { return array_[position]; }
    }

Method #3: Indirect Array

I created a property to access the array.

    public double[] Array
    {
        get { return array_; }
    }

When iterating, I called the Array property and then its index operator.

          d = data.Array[j];

Method #4: Direct Array

I created a reference to the array.

        double[] array = data.Array;

Then, I iterate through that reference.

          d = array[j];

Method #5: Pointer Math

Finally, I tried improving performance by iterating through the array in Managed C++, using pointer manipulation.

        static void iterate( Data& data )
        {
          double d;
          double __pin* ptr = &( data.Array[0] );
          for ( int i = 0; i < data.Array.Length; i++ )
          {
            d = *ptr;
            ptr++;
          }
        }

I called it this way:

        Pointer.iterate( data );
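
The same pointer traversal can also be written in C# with unsafe code, using fixed to pin the array much as __pin does in Managed C++. This is a sketch of the equivalent technique, not the author's code; it operates on a raw double[] rather than the Data class, and must be compiled with /unsafe.

```csharp
using System;

public class Pointer
{
    // Unsafe C# equivalent of the Managed C++ routine above.
    // fixed pins the array so the GC cannot move it while the
    // raw pointer is live, then the loop walks memory directly.
    public static unsafe double Iterate( double[] array )
    {
        double d = 0.0;
        fixed ( double* start = array )
        {
            double* ptr = start;
            for ( int i = 0; i < array.Length; i++ )
                d = *ptr++;         // no bounds check on this access
        }
        return d;
    }

    public static void Main()
    {
        double[] data = new double[10];
        data[9] = 42.0;
        Console.WriteLine( Pointer.Iterate( data ) );   // prints "42"
    }
}
```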


To test the different methods, I allocated an array of 1,000,000 doubles and iterated over all of them. I repeated this 1,000 times to smooth out random variation. Here are the results...

[Chart: timing results for the five iteration methods]

Enumeration is always slow. That's not surprising: the enumerator returns each double as an object, so every access involves a cast (and, for a value type, boxing). The three operator/property methods differed only slightly; the JIT probably optimizes them to similar code. Pointer math over the raw data was significantly faster, most likely because it avoids per-access bounds checking.
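
For reference, the timing setup can be sketched like this, using System.Diagnostics.Stopwatch (added in .NET 2.0; the original test predates it). The Data stand-in mirrors the article's class, and only the three array-based methods are shown; the names are assumptions.

```csharp
using System;
using System.Diagnostics;

// Stand-in for the article's Data class, for a self-contained sketch.
public class Data
{
    private double[] array_;
    public Data( int size ) { array_ = new double[size]; }
    public double this[int position] { get { return array_[position]; } }
    public double[] Array { get { return array_; } }
}

public class Benchmark
{
    const int Size = 1000000;
    const int Reps = 1000;

    public static void Main()
    {
        Data data = new Data( Size );
        double d = 0.0;
        Stopwatch sw;

        // Method #2: the class indexer on every element.
        sw = Stopwatch.StartNew();
        for ( int rep = 0; rep < Reps; rep++ )
            for ( int j = 0; j < Size; j++ )
                d = data[j];
        sw.Stop();
        Console.WriteLine( "Indexer:  {0} ms", sw.ElapsedMilliseconds );

        // Method #3: the Array property on every element.
        sw = Stopwatch.StartNew();
        for ( int rep = 0; rep < Reps; rep++ )
            for ( int j = 0; j < Size; j++ )
                d = data.Array[j];
        sw.Stop();
        Console.WriteLine( "Indirect: {0} ms", sw.ElapsedMilliseconds );

        // Method #4: cache a direct reference to the array first.
        double[] array = data.Array;
        sw = Stopwatch.StartNew();
        for ( int rep = 0; rep < Reps; rep++ )
            for ( int j = 0; j < array.Length; j++ )
                d = array[j];
        sw.Stop();
        Console.WriteLine( "Direct:   {0} ms", sw.ElapsedMilliseconds );
    }
}
```

Comparing `j < array.Length` directly against the cached reference also gives the JIT its best chance to hoist the bounds check out of the loop, which is part of why Method #4 performs well.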

In summary, if you have large amounts of data and performance is critical, consider iterating over the raw array directly, or using pointer math in Managed C++.


Thanks to Mark Vulfson of ProWorks for tips on using the Flipper Graph Control. Also, to my colleagues Ken Baldwin and Steve Sneller at CenterSpace Software.

About the Author

Trevor Misfeldt is the co-founder and CEO of CenterSpace Software, which specializes in .NET numerical method libraries. Trevor has worked as a software engineer for eight years. He has held demanding positions for a variety of firms using C++, Java, .NET, and other technologies, including Rogue Wave Software, CleverSet Inc., and ProWorks. He is coauthor of Elements of Java Style, published by Cambridge University Press, and is currently working on a follow-up book for C++. He has also served on a course advisory board of the University of Washington. His teams have won the JavaWorld "GUI Product of the Year" and XML Magazine "Product of the Year" awards. Trevor holds a BSc in Computer Science from the University of British Columbia and a BA in Economics from the University of Western Ontario.


Download source - 6 Kb

