Cache size
The new caching feature is great! With it, I could stop downloading/mounting VHD files and instead keep the software I need for my tests in the cache. Unfortunately, 100 MB is not enough -- I need something on the order of 2-3 GB (LaTeX, toolchain, R in different versions, ...). Also, I doubt that extracting the files from the cache will be faster than my current approach.
I'm sure you have considered a per-project VHD with write access that is mounted into the workers. This might reduce build time. What were the reasons against this strategy?
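A rough way to sanity-check that doubt is to time the pack and unpack steps directly. The Python sketch below is only an illustration of such a measurement, not part of either workflow; `TOOL_DIR` is a placeholder for a directory you would put into the cache, and network transfer time would come on top of what it measures.

```python
import tarfile
import tempfile
import time
from pathlib import Path

# Placeholder: point this at a directory you would put into the build cache,
# e.g. a LaTeX, toolchain, or R installation.
TOOL_DIR = Path(r"C:\R\R-3.1.1")

def timed(label, fn):
    start = time.perf_counter()
    result = fn()
    print(f"{label}: {time.perf_counter() - start:.1f} s")
    return result

with tempfile.TemporaryDirectory() as tmp:
    archive = Path(tmp) / "cache.tar.gz"

    # Pack: roughly what happens when a cache entry is saved.
    def pack():
        with tarfile.open(archive, "w:gz") as tar:
            tar.add(TOOL_DIR, arcname=TOOL_DIR.name)

    # Unpack: roughly what happens when a cache entry is restored.
    def unpack():
        with tarfile.open(archive, "r:gz") as tar:
            tar.extractall(Path(tmp) / "restored")

    timed("pack", pack)
    print(f"archive size: {archive.stat().st_size / 2**20:.0f} MB")
    timed("unpack", unpack)
```

If the unpack time alone already exceeds the time spent downloading and mounting the VHD, the cache is unlikely to be a win for multi-gigabyte tool sets.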
Support Staff 1 Posted by Feodor Fitsner on 23 Oct, 2014 07:20 PM
The problem with a VHD is that it can be mounted to only a single VM at any given point in time, which is an issue for multi-job builds running in parallel.
I agree that putting a lot into the cache could add overhead for transferring and unpacking the archive.
-Feodor
2 Posted by Kirill Müller on 23 Oct, 2014 07:34 PM
Thanks, I see your point. What about differencing VHDs -- first come, first served for writing, and the other VMs' caches get reverted? Don't you use something similar for the system images? (Just thinking out loud here; it's probably not as easy as it looks... ;-) )
Keep up the good work!
Support Staff 3 Posted by Feodor Fitsner on 23 Oct, 2014 10:42 PM
On Azure we just copy the entire VHD to provision a new VM. Apparently, copying 127 GB takes only a few seconds there!
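The speed is plausible because Azure can copy blobs server-side within its storage service, so the VHD data never passes through the client. As an illustration only (not AppVeyor's actual provisioning code), a server-side copy with today's azure-storage-blob Python SDK might look like the sketch below; the connection string, container, and blob names are placeholders.

```python
import time

from azure.storage.blob import BlobClient

# Hypothetical values -- replace with a real connection string and blob paths.
CONN_STR = "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...;EndpointSuffix=core.windows.net"
SOURCE_URL = "https://exampleaccount.blob.core.windows.net/images/base-image.vhd"

# The destination blob the new VM would boot from.
dest = BlobClient.from_connection_string(
    CONN_STR, container_name="vms", blob_name="worker-01.vhd"
)

# start_copy_from_url asks the storage service to copy the blob server-side;
# no data flows through this machine, which is presumably why even a large
# VHD can be duplicated quickly.
dest.start_copy_from_url(SOURCE_URL)

# Poll until the service reports the copy as finished.
while True:
    props = dest.get_blob_properties()
    if props.copy.status != "pending":
        print("copy finished with status:", props.copy.status)
        break
    time.sleep(1)
```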
Ilya Finkelshteyn closed this discussion on 25 Aug, 2018 01:49 AM.