MPI on SMC #24

Open
cswiercz opened this issue May 10, 2016 · 0 comments

...from a course announcement.

Number of Cores on SMC

Currently, each student should have access to four CPUs on SMC. These CPUs are dynamically allocated to a student project when parallel code is executed, so if you're interested in getting reasonably accurate timing results you should run your OpenMP and MPI code multiple times. This is partly because of the overhead of finding available CPUs for your processes on SMC, and partly because many other people are also using SMC and those cores may not always be available.
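For example, a quick way to collect several timings from the terminal is a small shell loop like the one below (a rough sketch only; ./a.out, the process count, and the number of repetitions are placeholders for your own program and settings):

for i in 1 2 3 4 5; do
    time mpiexec -n 4 ./a.out   # use mpiexec.mpich for now; see "Using MPI on SMC" below
done

Comparing the reported times across runs gives you a sense of how much the shared environment is affecting your measurements.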

Note that the more traffic there is on SMC, the more likely your code is to run slowly during timing tests. Also note that traffic tends to increase closer to homework deadlines.

The final disclaimer is that 4 CPUs/student × 100 students = 400 CPUs. The SMC server on which our projects live doesn't actually have 400 CPUs, so we'll have to share. The worst-case scenario (and I hope this doesn't happen) is that we'll have to reduce the number of available CPUs per student to two.

Using MPI on SMC

Some enterprising students are already trying out MPI code on SMC. That's great! They also uncovered some minor, now resolved, bugs in the SMC environment that, I swear, were not there several weeks ago when I tested out MPI on SMC.

For now, instead of running

$ mpiexec -n 4 ./a.out  # or whatever number of processes and executable

run

$ mpiexec.mpich -n 4 ./a.out

To briefly explain: Ubuntu's default OpenMPI version has some bugs. We'll use Argonne National Laboratory's MPICH instead, which is installed on SMC but needs to be called explicitly.
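If you want a quick sanity check that MPICH is really the implementation being picked up, a tiny program like the following works (a sketch only; I'm assuming the MPICH compiler wrapper on SMC is named mpicc.mpich, mirroring the mpiexec.mpich naming, so adjust if your project shows something different):

/* hello_mpi.c -- minimal MPI sanity check: prints the MPI library version and one line per rank */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, size, len;
    char version[MPI_MAX_LIBRARY_VERSION_STRING];

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    MPI_Get_library_version(version, &len);

    if (rank == 0)
        printf("MPI library: %s\n", version);   /* should mention MPICH, not Open MPI */
    printf("Hello from rank %d of %d\n", rank, size);

    MPI_Finalize();
    return 0;
}

Compile and run it with something like

$ mpicc.mpich hello_mpi.c -o hello_mpi
$ mpiexec.mpich -n 4 ./hello_mpi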

I was told by the SMC folks that this will be made the default at some point within our projects. I will let you know when that happens.

Chris
