I was wondering if anyone has heard of, or knows whether there are, new Hardware Requirements/Considerations for the host machine when using the Application Packs feature. I have been testing and using the Application Packs beta over the past few weeks, and I've noticed some patterns and a new strain on hardware while using this feature. The strain that prompted this question is very high usage of the host's memory.

My experience with this was earlier today, when we had to push a large piece of software to our Interior Design Computer Lab. This was our first time pushing a package to multiple computers at once. When the computers were selected and we deployed the package, we noticed something strange: it would push the package to only a few computers in the lab, and once the package was received and installation began, it would start transferring the package to another small group of computers. While this was happening, I saw the expected high disk usage and network transfers, but what caught me slightly off guard was that almost all of my memory was being used up. SmartDeploy was using anywhere from 8 to 12 GB of the 16 GB in my system during the deployment.

Everything was deployed, and the software installed with no issues on all the machines. However, it took an hour to almost two hours to completely deploy to the lab. So, here are a few questions I have related to this:
Are there new Hardware Requirements/Recommendations for those who plan to use this feature going forward?
Does the size of an Application Pack affect the amount of memory used during deployment?
When deploying to a group of computers, is there a predetermined number of computers that receive the pack at once, or is that number determined by the size of the pack and/or the amount of memory in the host system?
When deploying an Application Pack to a group of computers, does each client get its own session, like a unicast, or do they all receive the same packets at the same time, like a multicast?