From the course: Refactoring with GitHub Copilot
Lower TCO on the project
Let me know if this sounds familiar to you. You have a local instance of a project, and one command or database query is slow on your machine, so you trim down the data you're working with to validate your work more quickly. But stop for a moment: if it's slow on your local machine, how does it behave in production? Sure, production servers are much more powerful than your laptop, but it's still clear there's a problem.

The first thing I want to call out is that servers cost money, and reducing CPU time lowers server utilization. In the video about refactoring for performance, I showed this example. One of the things I suggested was asking Copilot how to refactor for performance. If you have a slow script, command, or task, I'd start there, asking how to make it faster. Reduced CPU time and server time is something tangible to show management after a refactoring. Less tangible is dev time.

Here is a function to sort users by purchases. I guess an immediate refactor would be to call them customers, but I digress. This creates a temporary array. Then it uses a bubble sort, which is fast on smaller datasets. From that sort order, it builds a new array and returns the sorted users. It's decently commented, but it's a lot of code. Now check out this implementation. This uses the core PHP function usort, which implements quicksort. It might be a bit less performant than bubble sort for smaller datasets, but which one do you want the new dev on your team digging through for a bug fix? Which is likelier to have a bug?

When weighing the total cost of ownership to justify refactoring, dev time needs to be the first consideration. A potential tiny increase in CPU time here will prevent hours of debugging in the future.
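The course's on-screen code isn't reproduced in this transcript, so here is a minimal sketch of the two implementations being compared. The function names, the array shape (each user as an associative array with a `purchases` count), and the descending sort direction are all assumptions for illustration.

```php
<?php
// Hypothetical sketch of the video's example; names and data shape are assumed.
// Each user is an associative array with a 'purchases' count.

// Verbose version: a hand-rolled bubble sort, as described in the video.
function sortUsersByPurchasesBubble(array $users): array
{
    // Create a temporary array of purchase counts, keyed like $users.
    $purchases = array_map(fn($u) => $u['purchases'], $users);
    $indexes = array_keys($users);
    $n = count($indexes);

    // Bubble sort the indexes by purchase count, highest first.
    for ($i = 0; $i < $n - 1; $i++) {
        for ($j = 0; $j < $n - $i - 1; $j++) {
            if ($purchases[$indexes[$j]] < $purchases[$indexes[$j + 1]]) {
                [$indexes[$j], $indexes[$j + 1]] = [$indexes[$j + 1], $indexes[$j]];
            }
        }
    }

    // Build a new array based on the sort order and return it.
    $sorted = [];
    foreach ($indexes as $idx) {
        $sorted[] = $users[$idx];
    }
    return $sorted;
}

// Concise version: PHP's built-in usort with a comparison callback.
function sortUsersByPurchases(array $users): array
{
    usort($users, fn($a, $b) => $b['purchases'] <=> $a['purchases']);
    return $users;
}
```

Both return the same ordering, but the `usort` version leaves almost no surface area for bugs: the only logic you own is the one-line comparison using the spaceship operator.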