High CPU usage on Windows 10


I’m using Atom 1.19.4 on Windows 10 x64 and have noticed very high CPU usage that appears to be related to the size of the project that is open.

C:\>atom --version
Atom    : 1.19.4
Electron: 1.6.9
Chrome  : 56.0.2924.87
Node    : 7.4.0

The following observations were made with Atom in safe mode (started with the --safe flag), which to the best of my understanding means no community or other non-core packages are loaded. (I am a technical writer, not a developer, so if I am overlooking obvious things to attempt or processes to observe, that’s probably why.)

I have a rather large project (a web site, actually) with many files and a deep directory structure. When this project is open, Atom’s CPU usage fluctuates between 40% and 50% – with zero files open for editing. From the Task Manager:


When I load a small project (in this case the project associated with my Atom configuration – the one that opens when I click “Open Config Folder” in the Settings panel), Atom’s CPU usage is effectively zero. There are a few small transient CPU percentage values shown in the Task Manager, but mostly the entries show 0%. (Apparently I can include only one image in this post, but the screenshot is similar to the one above, just with all zeros.)

Note that the memory usage is similar with the large and small projects.

Note also that running Atom in normal mode (that is, with the non-core packages loaded) does not change the CPU usage percentages I see in a noticeable way.

I ran a CPU profile and don’t see anything obviously amiss, but I don’t really know what to look for there. I can forward the saved .cpuprofile file if it’s useful, but I don’t see a way to attach it here.

There are a number of “high CPU usage” topics in the support archive, but none seem to be resolved and none appear to correlate CPU usage with project size. If there is other diagnostic information I can provide, I’m happy to try to do so. I would like to be able to switch from using the Eclipse-based Aptana Studio to Atom, but this issue with the big project is a major roadblock for me.

Thanks in advance for any help,


Wonder if this is related to filling and maintaining the tree view. Does it happen if you run atom --safe?


Yes, the observations in the original message were made in safe mode. My first instinct was that the high CPU usage was related to my use of the autocomplete-paths package (because of the depth of the paths in this project, and the large number of files), but the CPU usage doesn’t seem to be affected noticeably by running with or without the --safe flag.


Is the project a git repo or is it inside a git repo? One of the possible culprits is the core package github, which has been implicated in slowdowns and high resource usage in some cases.


No, neither project in the original post is itself a git repo or within one. Although the smaller project (my Atom config folder) does have a .gitignore file for some reason.

Playing around, I just opened a (not particularly large) git repo. Windows Task Manager shows low CPU usage when only the git repo project is open.

I should note that even when I have the large project open and Atom is using 50% of the CPU cycles, the Atom interface itself is still reasonably performant. Operations do sometimes seem slow, but that could be network slowness.

Ahhhhh . . . the large project that has high CPU usage is on an SMB-mounted network share. I’m not sure why/how that would affect the CPU usage, but maybe it’s a clue.


A big clue. Atom does a lot of checking of files in projects. I’m not very educated about networking, but it seems plausible that sending a bunch of requests to a remote computer and waiting for each reply creates more work for the same procedures. That effect would be amplified by the number of files to keep tabs on.


This got me curious, so I tried a small test. I made two new projects with identical content, one residing on the network server and one on my local SSD. Both projects contain 1483 items, which is probably not enough for a good test, but it also didn’t take me long to copy over.

According to the Windows task manager, Atom’s CPU usage when the project with its files on the network server was open topped out at around 1%, but was generally barely noticeable.

Atom’s CPU usage when the project with its files stored locally was slightly lower; the peak usage I saw was 0.6%.

So . . . it’s hard to tell. With the very large project (hundreds of thousands of files in the tree), perhaps a small difference due to network access adds up. Or maybe it’s something else entirely.

If this seems like a profitable avenue for more exploration and someone can suggest a more rigorous test (short of making a local copy of the giant project), I’d be happy to give it a go. I’d really like to find a solution (or at least a workaround) for this.


This may or may not be useful, but the atom-commander package can use SSH or FTP to connect to a remote server and lets you edit files on that server, downloading them temporarily to its cache when you edit so you don’t keep permanent copies on your machine. Since it uses docks, you can put it in the place of the tree view and access files that way. It can also compare folders or files, so if you download a small part of your very large project, you can easily see which components of that project are mirrored locally.

Is this something you’re working on mostly alone, or with other people?


Thanks for the atom-commander suggestion – it looks like it might have some useful features for my workflow.

The different parts of the large project in question are edited regularly by numerous people, a significant number of whom don’t use source control as a part of their everyday workflows. This introduces some obvious issues, which we’ve mostly worked around sufficiently well – but it means that having multiple copies of things floating around the network becomes problematic. At best, my having local copies of files introduces a merge step for me that I would rather not add. Editing directly on the network file server is both the path of least resistance and (as of today) the best alternative we know of. The setup has been functioning smoothly for quite some time; the new factor in the mix is me trying to swap in Atom in place of Eclipse as my main editing environment.


It’s probably safe to assume that the files you’re working on at any given time aren’t being worked on simultaneously by someone else without your knowledge. If that’s not the case, then you probably need source control, regardless of whether other people want to use it. If you do have complete control over the files you’re editing, then you could set up a system where you “check out” a particular file or directory and work with just that on your machine. remote-sync is another FTP/SFTP package, designed for cases where whole folders need to be passively synchronized. It lets you point to specific directories and upload on save, and you can be just as specific about directories as if you were using bare scp (but without the pain of typing out scp commands).

remote-ftp is also pretty popular, with a tree view-like display and strong config settings, but fewer bells and whistles than remote-sync or atom-commander. I feel like remote-edit has the worst form factor of the packages, because I don’t want to navigate remote files in a modal that disappears as soon as I click anywhere else.

Side note: While the proliferation of packages that do very similar things can be a little overwhelming, it’s actually one of the things that I like about Atom. I use atom-commander myself, because it gives me everything I need in the way I want to use it. remote-sync, remote-ftp, and remote-edit are all fine, but they operate on slightly different theories. I like that better than having a single (S)FTP package, because there’s something for everyone.


Your assumption is correct; actual editing collisions (with two people having the same file open at one time) are vanishingly rare, which is one reason we continue without code-style source control for a big swath of files. Not ideal, but working.

I’ve tried remote-ftp for a different use-case and at first blush it seems to meet my needs, but atom-commander might be even better.

And yes, the variety of community-developed packages is one of the things that I find attractive about Atom. The Eclipse world is much more insular and slow-moving, and the particular Eclipse variant I’ve been using (Aptana Studio), while it’s been great for my needs, is no longer actively developed.


Just a comment: it is far, far easier to develop an Atom extension than an Eclipse one. Eclipse takes the tendency of the Java community to overbuild to an absurd conclusion, and just creating an extension is a challenge, even with a lot of editor support. No idea what it takes to get “hello, world” working as I never bothered to get that far…


I’m a hobbyist who will take months or years between projects and then gets lured back in by the fun puzzle of learning another framework (which I inevitably wander away from when a new thing attracts my obsession). It was easy for me to understand how Atom and its packages fit together and how I could change its behavior to fit my needs. Atom’s flaws are mostly things I don’t care about, because my needs don’t intersect with its weaknesses; its strength is that I can make it into as many things as I want to experiment with.