Development Roadmap

#1

Below is a roadmap of important features that we are planning to implement. Not everything is included here, only the features we are likely to implement in the near future.

Order and priority are subject to change. If you want to see new features not listed here, or would like features further down the pipeline given higher priority, don’t hesitate to tell us. We can also add a poll to help determine which features are most important to users.

Roadmap:

  • Support interactive sessions using multiple hosts (in progress)
  • Array/Group Sections (allow invoking methods on a subset of members of a collection using slicing notation) (in progress). This is implemented on a branch leveraging the ckmulticast functionality of Charm++. We are working on merging it, and other improvements are also planned.
  • Support Jupyter notebooks
  • Expose remaining Charm++ features (quiescence detection, TRAM, etc.)
  • Highly scalable parallel map framework (see experimental charm.pool in recent releases)
  • Global (distributed) NumPy arrays (i.e. support NumPy API, including arrays, access/modify specific elements, array operations, etc. for distributed arrays that don’t fit on a single host). Possibly something similar with other popular data structures like pandas dataframes.
  • Improve use of the RDMA features of Charm++; make use of RDMA transparent (especially for large arrays).
  • Simplified API for alternative abstractions running on top of charm4py, such as MapReduce, the BSP model, and task scheduling
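To illustrate the parallel map item above, here is a minimal sketch of how the experimental `charm.pool` framework is used (this assumes the `charm.pool.map` API available in recent charm4py releases; the import guard is only there so the snippet also runs, via plain `map`, on machines without charm4py installed):

```python
# Sketch of the experimental charm.pool parallel map.
# Guarded import: falls back to the builtin map if charm4py is absent.
try:
    from charm4py import charm
    HAVE_CHARM4PY = True
except ImportError:
    HAVE_CHARM4PY = False

def square(x):
    # task function that the pool distributes across workers
    return x * x

def main(args):
    # charm.pool.map distributes calls across the available cores/hosts
    # and returns the results in order, like the builtin map
    result = charm.pool.map(square, range(8))
    print(result)
    exit()

if __name__ == '__main__':
    if HAVE_CHARM4PY:
        charm.start(main)
    else:
        # sequential fallback showing the equivalent semantics
        print(list(map(square, range(8))))
```

With charm4py installed, running this under `charmrun` spreads the `square` calls over the workers; the result is the same list `[0, 1, 4, 9, 16, 25, 36, 49]` either way.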
#2
#3

Hi Juan!

It seems like you are doing great work with charm4py; it is a very exciting project. Looking at the roadmap, it would be great from our perspective to be able to interoperate between charm4py and mpi4py as soon as possible, in order to make use of libraries such as pyFFTW that would accelerate our development enormously.

Thanks for your efforts

#4

Thanks a lot for the feedback.

Someone else has also expressed interest in charm4py and mpi4py interoperation. We haven’t started work on that yet, but I’ll make it a priority to work on it next.

Charm++ can already interoperate with MPI, so doing the same with charm4py and mpi4py hopefully won’t require much work.

#5

Thanks! That would be most helpful; it would ease our development enormously.

#6

@ccueto I am continuing the discussion of MPI interoperability in a separate topic, where I have posted an example showing a working hybrid mpi4py/charm4py program. Hope this helps.