[Bioperl-l] CVS tags for Ensembl

hilmar.lapp@pharma.Novartis.com hilmar.lapp@pharma.Novartis.com
Thu, 12 Oct 2000 23:29:08 +0100

Is there a downside for those without an Ensembl hat on?

The only thing I can think of is cluttering CVS tags. If there are going to
be hundreds of Ensembl tags...


Ewan Birney <birney@ebi.ac.uk>@bioperl.org on 12.10.2000 19:03:19

Sent by:  bioperl-l-admin@bioperl.org

To:   bioperl-l@bioperl.org
Subject:  [Bioperl-l] CVS tags for Ensembl

I guess this is a request with my "Ensembl" hat on to the Bioperl
developers, at which point I'd better claim that I am not a bioperl
developer ;)

Is it ok if inside Bioperl we put in cvs tags for the Ensembl releases?

We want to put a cvs tag down across all source code associated
with a release for a dataset from Ensembl. This, of course, includes

The tag will be


for example

What do people think? Is this ok?
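(The actual tag format Ewan proposed is lost from this archive; the hypothetical name below just illustrates what laying such a tag down would look like from a bioperl checkout.)

```shell
# From the top of a bioperl-live working directory.
# "ensembl-rel-1" is an invented placeholder tag name.
cvs tag ensembl-rel-1 .         # lay the tag across the whole tree
cvs status -v Bio/SeqIO.pm      # verify: lists the tags on a file
cvs update -r ensembl-rel-1     # later: retrieve exactly that snapshot
```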

Ewan Birney. Mobile: +44 (0)7970 151230, Work: +44 1223 494420

Bioperl-l mailing list

From: Eugene Leitl <eugene.leitl@lrz.uni-muenchen.de>
Date: Thu, 12 Oct 2000 22:35:37 -0700 (PDT)
To: <vsns-bcd-perl@lists.uni-bielefeld.de>, <biopython@biopython.org>
Subject: [Bioperl-l] OFFTOPIC: jobs in structural bioinformatics?
Sender: bioperl-l-admin@bioperl.org

Right now I'm at a fork in my career, deciding whether to go on
to do a Ph.D. (brute-force MD protein folding with integer lattice
gases on a Beowulf) or to apply to industry (structural
bioinformatics, virtual screening, and the like).

Sorry for contributing noise to a usually refreshingly technical
list, but I'd be thankful for any pointers (whether academic or
industrial). Below is the piece of text I'm floating around on diverse
channels, looking for a Ph.D.

Once again, sorry for the noise.


Eugene Leitl


I'm looking for a Ph.D. position in a brute-force MD approach to the
Protein Folding Problem, especially using massive parallelism (spatial
decomposition of the simulation box over a large number (10^3..10^6)
of computational nodes) and novel algorithms (integer lattice gases
for forcefield rendering, including long-range forces).
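A minimal sketch of the spatial-decomposition bookkeeping, with illustrative names (none of this is the author's code): a cubic box of L^3 cells is split over a p x p x p grid of nodes, each owning an (L/p)^3 sub-box and exchanging only a one-cell-deep halo with its 6 face neighbours.

```python
def owner(cell, L, p):
    """Node (i, j, k) responsible for cell (x, y, z), coordinates in [0, L)."""
    side = L // p                  # cells per node along each axis (assumes p divides L)
    return tuple(c // side for c in cell)

def per_node_cells(L, p):
    """Owned cells per node: constant if L/p is held fixed (weak scaling)."""
    return (L // p) ** 3

def halo_cells(L, p):
    """Cells exchanged with the 6 face neighbours each step."""
    side = L // p
    return 6 * side * side
```

Holding L/p fixed while growing p keeps both the owned volume and the halo surface per node constant, which is what makes per-node cost roughly independent of the total node count.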

Some of the research topics I intend to focus on:

 * Development of algorithms for efficient search of configuration space

 * Simulation of long-time scale events

 * Testing and improvement of molecular models and force fields by
   comparison of simulation results to experimental data

 * Simplification of force fields

 * Application of the above to protein engineering and de novo design

I think that integer lattice gas algorithms fill several of these
slots simultaneously, since they scale as ~O(const) with the number of
locally interconnected (3d-lattice topology) Beowulf-type nodes while
being tolerant of message-passing latency. They can also profit from
SIMD-within-a-register (MMX-type) parallelism of recent CPU
architectures. They also tend to reduce cache misses for both code and
data, and spend most of their time streaming through memory in
sequential order, thus profiting from the burst mode of modern SDRAM.
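To make the "local collision + streaming" structure concrete, here is a toy update step for the classic HPP lattice gas (an assumption for illustration; it is not the author's forcefield scheme). Every site is updated from purely local state, which is why per-site cost is constant and the sweep is just sequential streaming through memory.

```python
import numpy as np

def hpp_step(f):
    """One update of an HPP lattice gas: local collision, then streaming.
    f: boolean array (4, H, W) of occupations for the E, W, N, S channels."""
    E, W, N, S = f
    # Collision (purely local, hence O(1) per site): an exact head-on
    # E/W pair scatters into an N/S pair, and vice versa.
    ew = E & W & ~N & ~S
    ns = N & S & ~E & ~W
    E2, W2 = (E & ~ew) | ns, (W & ~ew) | ns
    N2, S2 = (N & ~ns) | ew, (S & ~ns) | ew
    # Streaming: each channel shifts one cell (periodic boundaries).
    return np.stack([
        np.roll(E2, 1, axis=1), np.roll(W2, -1, axis=1),
        np.roll(N2, -1, axis=0), np.roll(S2, 1, axis=0),
    ])
```

Both sub-steps conserve particle number, so total occupation is an invariant of the update, a cheap sanity check for any such code.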

This performance advantage should make it possible to investigate
mesoscale or long-time-scale phenomena on Beowulfish COTS-type
hardware without the Blue Gene price tag. (Of course, such algorithms
can also profit from massively parallel custom hardware.)

Lattice gas data structures are suitable for realtime volume
visualization via the voxel paradigm [1], allowing fully interactive
simulations (dynamically adding constraints), and are very useful for
identifying and interactively segmenting interesting features in a
potentially very large volume simulation box [2].

Apart from visualization and speedup, integer lattice gases appear
useful for fitting forcefields to empirical data (using evolutionary
algorithms to select, from a population of forcefield individuals, the
forcefield that can restore the native fold of randomly distorted
Brookhaven database structures). Learning from QM results should work
via the same route.

And these algorithms are, well, fundamentally simple, since they shift
forcefield complexity from code to the number arrays in the
(interpolating) lookup tables. This also makes them GA-friendly.
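A sketch of the "complexity in tables, not code" idea, assuming a tabulated Lennard-Jones 12-6 potential as a stand-in (the author's actual forcefields are not given here). The inner loop does only an index plus a linear interpolation, so swapping forcefields, or letting a GA mutate one, means changing the array, not the code.

```python
import math

R_MIN, R_MAX, N = 0.9, 3.0, 256
DR = (R_MAX - R_MIN) / (N - 1)

def lj(r):
    """Reference potential used to fill the table (reduced units)."""
    return 4.0 * ((1.0 / r) ** 12 - (1.0 / r) ** 6)

# The forcefield lives entirely in this number array.
TABLE = [lj(R_MIN + i * DR) for i in range(N)]

def v_interp(r):
    """Potential at r via linear interpolation into TABLE."""
    x = (r - R_MIN) / DR
    i = min(int(x), N - 2)
    t = x - i
    return TABLE[i] * (1.0 - t) + TABLE[i + 1] * t
```

A GA individual is then just a candidate TABLE; evaluating it never requires touching `v_interp` itself.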

Since the early lattice gas CFD results, there has been slow but
steady progress in lattice gas codes [3]. There are also theoretical
reasons to believe that their perceived limitations are not
fundamental. Furthermore, there is a trend in MD to converge toward
the cellular automata way of doing things [4].

I'm looking for a Ph.D. position (preferably in Europe) exploring some
of these ideas, and am currently writing a research proposal
explaining the details of the project further. I would very much
welcome your critical comments on this work in progress (not yet
online), and, of course, would be very interested if there is a Ph.D.
vacancy in your research group. I would love to drop by for an
interview. My CV and resume can be found on my web site:


Finally, I would like to apologize for this lengthy and somewhat
crammed message, which was necessary to explain this somewhat strange
approach to the PFP.


Eugene Leitl

[1] http://www-graphics.stanford.edu/software/volpack/

[2] http://linux-green.lanl.gov/~pxl/papers/sc96/INDEX.HTM

[3] http://physics.bu.edu/~bruceb/MolSim/

[4] http://linux-green.lanl.gov/~pxl/papers/par_md.ps