■ I benchmarked ABCI_MP on the following three PCs with different numbers of
CPU cores by running a sample file called LEP.abc:
Intel Core 2 Duo E6600 @2.4GHz, Windows XP, 2GB RAM: CPU time used = 21.7s
Intel Core 2 Quad Q6600 @2.4GHz, Windows XP, 4GB RAM: CPU time used = 13.3s
Intel Core i7 920 @2.67GHz, 64-bit Windows Vista, 3GB RAM: CPU time used = 5.1s (ABCI_MP 32-bit version)
Intel Core i7 920 @2.67GHz, 64-bit Windows Vista, 3GB RAM: CPU time used = 4.6s (ABCI_MP 64-bit version)
1. The performance of ABCI_MP scales almost linearly with the number of CPU cores.
2. The 64-bit version of ABCI_MP runs about 10% faster than the 32-bit version on 64-bit Windows Vista.
Running the 64-bit version of ABCI_MP on 64-bit Windows Vista with a Core i7 looks very promising.
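For reference, the relative speedups implied by the timings above can be computed directly from the quoted numbers (the row labels are just shorthand for the machines listed):

```python
# Speedups relative to the dual-core E6600 baseline, using only the CPU
# times quoted in the benchmark above.
baseline = 21.7  # Core 2 Duo E6600, seconds

times = {
    "Core 2 Quad Q6600 (4 cores)": 13.3,
    "Core i7 920, ABCI_MP 32-bit": 5.1,
    "Core i7 920, ABCI_MP 64-bit": 4.6,
}

for name, t in times.items():
    print(f"{name}: {baseline / t:.2f}x faster")
```

The Q6600 row (same clock speed, twice the cores) is the cleanest core-scaling comparison; the i7 rows also fold in a higher clock and a newer microarchitecture.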
■ A GUI for ABCI has now been released on PyPI, so ABCI can be easily installed on GNU/Linux and Windows,
thanks to Dr. Sergey V. Matsievskiy. An installation guide is available here.
■ The Linux versions of ABCI_MP are now available, thanks to Drs. Yong-Chul Chae and Xiaowei Dong,
and Dr. Jonathan Smith. Please visit the installation guide page:
■ The Windows XP 64-bit version of ABCI_MP is now also available. It is here:
Dear ABCI users,
ABCI_MP has been updated to version 12.5. This version fixes some small bugs affecting users who
want to use a very large number of meshes. If you do not belong to this category, you may keep using
version 12.3, which is also included in the package.
It is here:
ABCI_MP has been updated to ABCI_MP_12.3 (version 12.3). The new or improved features include
the transverse extension of the Napoly integral (derived by Shobuda), so that ABCI can now
calculate transverse wake potentials in structures having unequal tube radii on the two sides,
while still keeping the integration path confined to a finite length by having the integration contour
begin and end on the beam tubes. More details are described in paper THPAN036 presented at PAC07.
An improved open boundary condition. ABCI used to adopt the conventional open
boundary condition, in which all waves propagating in the beam pipe are assumed to have a phase
velocity equal to the speed of light. In general, however, the propagating fields can be represented
as a linear superposition of waveguide modes, each with its own phase velocity that
varies with frequency. Aharonian et al. introduced a more advanced formulation of the open
boundary condition in the DBCI code, and ABCI now adopts it. In this method, the phase velocities of
all the travelling waveguide modes are represented correctly in the code.
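As a reminder of why a single phase velocity is insufficient: from standard waveguide theory (this is textbook material, not a formula taken from ABCI or DBCI), the phase velocity of a TM$_{0n}$ mode in a circular pipe of radius $a$ is frequency dependent,

$$ v_{p,n}(\omega) = \frac{\omega}{k_{z,n}} = \frac{c}{\sqrt{1 - (\omega_{c,n}/\omega)^2}}, \qquad \omega_{c,n} = \frac{j_{0n}\,c}{a}, $$

where $j_{0n}$ is the $n$-th zero of the Bessel function $J_0$. Each mode approaches $v_p = c$ only far above its cutoff $\omega_{c,n}$, so assuming $v_p = c$ for every mode is only an approximation.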
ABCI_MP supports parallel processing via OpenMP on shared-memory computers,
namely a PC with several CPUs (e.g., 8 AMD Opterons) or a CPU with multiple cores
(e.g., an Intel Core 2 Duo), all of which share the same memory. Tests on a Core 2 Duo PC
(two cores) show that ABCI_MP is about 1.7 times faster than the non-parallelized ABCI.
ABCI_MP also adopts dynamic memory allocation for nearly all arrays used in the field
calculations, so the amount of memory needed for a run is determined during runtime. You can
use any number of meshes as long as the total allocated memory fits within the physical
memory of your PC (if it exceeds it, ABCI_MP starts to access the hard disk and the
computation is slowed down severely, although it still runs). The physical memory needed
grows with the number of meshes used. ABCI_MP runs in a DOS window so that you can
see the progress of a job on screen.
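The quoted 1.7x speedup on two cores can be read through Amdahl's law. A small sketch, using only that measured figure (the inferred parallel fraction and the 4-core prediction below are estimates, not values published for ABCI_MP):

```python
# Back out the parallelizable fraction p of the run time from Amdahl's law,
# given the ~1.7x speedup measured on a two-core Core 2 Duo (figure taken
# from the text above; everything derived from it is an inference).

def amdahl_speedup(p: float, n: int) -> float:
    """Speedup on n cores when a fraction p of the work parallelizes."""
    return 1.0 / ((1.0 - p) + p / n)

def parallel_fraction(speedup: float, n: int) -> float:
    """Invert Amdahl's law to recover the parallel fraction p."""
    return (1.0 - 1.0 / speedup) * n / (n - 1.0)

p = parallel_fraction(1.7, 2)
print(f"inferred parallel fraction p ~ {p:.2f}")        # ~0.82
print(f"predicted 4-core speedup ~ {amdahl_speedup(p, 4):.2f}")
```

A parallel fraction around 0.8 would cap the ideal speedup well below the core count, which is consistent with the "almost linearly" (rather than exactly linearly) scaling noted in the benchmark above.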
An MPI version of ABCI is also under consideration.
Inside, there are four folders and their names tell you what they are.
You can place the application programs anywhere.
You can also place the input file folder anywhere; it does not need to be in the same folder
as the ABCI executable modules. No installation of the program is necessary.
The recommended use may be as follows:
1. Place the ABCI application folder in the Program Files folder.
2. Then, create the short-cut(s) of the ABCI executable module(s) on the desktop.
3. Place the input file folder anywhere.
4. To run ABCI, just drag and drop the input file (such as sample1.abc) onto
the ABCI short-cut; a DOS window then appears, saying that ABCI is now running.
5. When the computation ends, you will find the output files in the same folder as
the input file. Very simple, is it not?
It is better to assign Notepad or WordPad to open these files.
The alternative is to double-click one of the ABCI applications; a DOS window
appears, asking for the name of the input file. If the input file is located in the same folder
as the ABCI executable module, just type its name, such as sample1.abc. If not,
you have to give the full path name (for example, C:\Input_files\sample1.abc).
The recommended way above looks more convenient to me.
I enclose the TopDrawer program for Windows. Just drag and drop a
TopDrawer file onto it, and all the figures pop up. If you want to convert them to
PostScript files, just right-click a figure and choose the PostScript option.
It is better to install the Ghostscript and Ghostview programs to view and print
PostScript files; they are available here:
If you need to compile the ABCI source code with Compaq Visual Fortran,
you need to add the /fpscomp:filesfromcmd compiler option so that the command-line DOS
window appears and asks for the input file name at run time. The Windows versions of the ABCI
codes are stand-alone; there is no need to link them with other subroutines.
The Windows version of ABCI is very fast, and together with TopDrawer for Windows,
you can do all the necessary work entirely on Windows.
This information and the ABCI programs are free to use, but ABCI is not open source. If you find
someone who wants to use it, just give them the above URL to download the package.
An animation of the electric fields in the KEK ARES cavity is now available for download.
Watch out, though: the file is big (96MB):
Dr. Iker Rodriguez has kindly provided two compressed versions of the ARES cavity animation:
Intel Indeo codec 5.10 version (1.7MB):
DiVX version (2.6MB):
Please give me your comments.
Yong Ho Chin
All questions regarding this home page should be addressed to
Yong Ho Chin (firstname.lastname@example.org)
Last updated on May 13, 2009 by Y. H. Chin