Refactoring for performance - GitHub Copilot Tutorial

From the course: Refactoring with GitHub Copilot

Refactoring for performance

By now you've probably noticed a pattern: identify an issue in code and ask Copilot about it. Here's a check for duplicates in an array. It's a slow implementation. It iterates over the array, and during each step, iterates over all items to see if there's a match. On line 6, the second part of the condition also iterates over the list of duplicates. So this just won't scale. All right, Copilot, let us know what you think. Improve performance.

Of course, Copilot is not only going to give me an improved algorithm, it's also going to explain why. If you aren't familiar, a set stores unique values, so the collection will grow more slowly when the original array has lots of duplicate items. As you can see, this has turned into one for loop rather than two, and it does away with that duplicates.includes call.

So that's all well and good, but this refactoring is very specific to the example implementation. What can we do about application-level performance? Well, one of the first things we do in web development is turn to caching, if this set, or any other data, needs to be accessed multiple times during a request or on subsequent requests. So I'll ask: how could I cache the findDuplicates result set? And Copilot provides an example answer. One thing I want to call out is that if you are implementing a cache in an application, you should do it application-wide and not at a function level like this. There are use cases for caching at the function level; you'll need to decide. Now, as usual, Copilot provided a huge chunk of code for us to work with. Some of it needs to be refactored. The responsibility, as always, falls on you to make sure the implementation is appropriate.
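The exact code from the video isn't reproduced in this transcript, but a quadratic check along the lines described, where an inner loop scans the array and the condition also scans the growing duplicates list, might look something like this sketch:

```javascript
// Sketch of a slow findDuplicates: for every item, scan the whole array
// for a match, and also scan the duplicates list to avoid repeats.
function findDuplicates(items) {
  const duplicates = [];
  for (let i = 0; i < items.length; i++) {
    for (let j = 0; j < items.length; j++) {
      // The second part of this condition iterates the duplicates list,
      // adding yet another scan on top of the nested loops.
      if (i !== j && items[i] === items[j] && !duplicates.includes(items[i])) {
        duplicates.push(items[i]);
      }
    }
  }
  return duplicates;
}
```

Every item triggers a full pass over the array plus a scan of the duplicates list, so the work grows roughly quadratically with the input size, which is why it won't scale.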

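Copilot's suggested rewrite isn't shown verbatim here either, but the Set-based, single-loop refactoring it describes typically looks like this:

```javascript
// Set-based refactoring: one pass, constant-time lookups, and no
// duplicates.includes() scan.
function findDuplicates(items) {
  const seen = new Set();       // unique values encountered so far
  const duplicates = new Set(); // unique values seen more than once
  for (const item of items) {
    if (seen.has(item)) {
      duplicates.add(item);
    } else {
      seen.add(item);
    }
  }
  return [...duplicates];
}
```

Because a Set only stores unique values, seen and duplicates grow more slowly than the input when the array contains many repeated items, and the includes scan disappears entirely.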
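The caching code Copilot generates will vary. As a rough illustration of the function-level approach the video cautions about, a memoized wrapper (with findDuplicatesCached and duplicatesCache as hypothetical names) might look like this:

```javascript
// Hypothetical function-level cache keyed on the serialized input array.
const duplicatesCache = new Map();

function findDuplicatesCached(items) {
  const key = JSON.stringify(items);
  if (duplicatesCache.has(key)) {
    return duplicatesCache.get(key); // reuse the previously computed result
  }
  const result = findDuplicates(items);
  duplicatesCache.set(key, result);
  return result;
}
```

As noted above, caching like this is usually better handled application-wide (for example, in a shared caching layer) rather than bolted onto a single function, and this sketch never expires or evicts entries, which is exactly the kind of thing you'd need to refactor before relying on it.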