<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>http://wiki.docking.org/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Enkhjargal</id>
	<title>DISI - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="http://wiki.docking.org/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Enkhjargal"/>
	<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Special:Contributions/Enkhjargal"/>
	<updated>2026-04-05T09:32:59Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.39.1</generator>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Clinical_Trials_Loading&amp;diff=10479</id>
		<title>Clinical Trials Loading</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Clinical_Trials_Loading&amp;diff=10479"/>
		<updated>2017-12-12T22:40:30Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;See /nfs/db/trials/HOWTO.Enkhee&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
1. Relevant Files:&lt;br /&gt;
    /nfs/db/trials/important --&amp;gt; clinical trials raw data &lt;br /&gt;
    /nfs/home/teague/work/Projects/trials/extract.py --&amp;gt; script to clean the raw data&lt;br /&gt;
    http://wiki.docking.org/index.php/Creating_clinical_name_mappings --&amp;gt; how to create name_mappings&lt;br /&gt;
&lt;br /&gt;
 2. The SQL statements that create the new clinical trials schema and tables live in the zinc code:&lt;br /&gt;
&lt;br /&gt;
    zinc/SQL_statement/clinical_trial.sql&lt;br /&gt;
&lt;br /&gt;
    The SQL statements are reproduced below.&lt;br /&gt;
&lt;br /&gt;
 3. Load the raw data into the database tables:&lt;br /&gt;
    Use --wipe to delete all existing data first;&lt;br /&gt;
    otherwise data is loaded incrementally into the existing tables.&lt;br /&gt;
&lt;br /&gt;
    source /nfs/soft/www/apps/zinc15/envs/dev/bin/activate&lt;br /&gt;
    python /nfs/soft/www/apps/zinc15/envs/dev/bin/zinc-manage utils clinical-trials load_ct /nfs/db/trials/important/studies.txt &lt;br /&gt;
    python /nfs/soft/www/apps/zinc15/envs/dev/bin/zinc-manage utils clinical-trials load_condition /nfs/db/trials/important/browse_conditions.txt &lt;br /&gt;
    python /nfs/soft/www/apps/zinc15/envs/dev/bin/zinc-manage utils clinical-trials load_intervention /nfs/db/trials/important/interventions.txt&lt;br /&gt;
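A full reload can be sketched by adding the --wipe flag described above. The flag&#039;s exact position on the zinc-manage command line is an assumption, so verify it against the CLI help before running:&lt;br /&gt;

```shell
# Hypothetical full reload: --wipe is assumed to precede the input file
# (verify with: zinc-manage utils clinical-trials load_ct --help)
source /nfs/soft/www/apps/zinc15/envs/dev/bin/activate
python /nfs/soft/www/apps/zinc15/envs/dev/bin/zinc-manage utils clinical-trials load_ct --wipe /nfs/db/trials/important/studies.txt
```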
&lt;br /&gt;
&lt;br /&gt;
  ------------------------   zinc/SQL_statement/clinical_trial.sql  ----------------------------------------&lt;br /&gt;
&lt;br /&gt;
 1. Create the schema and grant usage&lt;br /&gt;
&lt;br /&gt;
  BEGIN;&lt;br /&gt;
  create schema clinical2;&lt;br /&gt;
  GRANT USAGE on schema clinical2 to root;&lt;br /&gt;
  GRANT USAGE on schema clinical2 to zincread;&lt;br /&gt;
  GRANT USAGE on schema clinical2 to zincfree;&lt;br /&gt;
  GRANT USAGE on schema clinical2 to test;&lt;br /&gt;
  GRANT USAGE on schema clinical2 to admin;&lt;br /&gt;
  GRANT USAGE on schema clinical2 to adminprivate;&lt;br /&gt;
&lt;br /&gt;
 2. Create the tables and grant privileges&lt;br /&gt;
&lt;br /&gt;
  BEGIN;&lt;br /&gt;
  create table clinical2.ctstatus (like clinical1.ctstatus including defaults including constraints including indexes);&lt;br /&gt;
  REVOKE ALL ON TABLE clinical2.ctstatus FROM PUBLIC;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ctstatus TO root;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ctstatus TO zincread;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ctstatus TO zincfree;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ctstatus TO test;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ctstatus TO adminprivate;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ctstatus TO admin;&lt;br /&gt;
&lt;br /&gt;
  create table clinical2.ctphase (like clinical1.ctphase including defaults including constraints including indexes);&lt;br /&gt;
  REVOKE ALL ON TABLE clinical2.ctphase FROM PUBLIC;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ctphase TO root;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ctphase TO zincread;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ctphase TO zincfree;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ctphase TO test;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ctphase TO adminprivate;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ctphase TO admin;&lt;br /&gt;
&lt;br /&gt;
  create table clinical2.ct2 (like clinical1.ct2 including defaults including constraints including indexes);&lt;br /&gt;
  alter table clinical2.ct2 add constraint ct2_ctphase_fk_fkey foreign key (ctphase_fk) references clinical2.ctphase(ctphase_id);&lt;br /&gt;
  alter table clinical2.ct2 add constraint ct2_ctstatus_fk_fkey foreign key (ctstatus_fk) references clinical2.ctstatus(ctstatus_id);&lt;br /&gt;
  alter table clinical2.ct2 add column changed_date date;&lt;br /&gt;
&lt;br /&gt;
  REVOKE ALL ON TABLE clinical2.ct2 FROM PUBLIC;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2 TO root;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2 TO zincread;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2 TO zincfree;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2 TO test;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2 TO adminprivate;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2 TO admin;&lt;br /&gt;
&lt;br /&gt;
  create table clinical2.ct2condclass (like clinical1.ct2condclass including defaults including constraints including indexes);&lt;br /&gt;
  REVOKE ALL ON TABLE clinical2.ct2condclass FROM PUBLIC;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2condclass TO root;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2condclass TO zincread;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2condclass TO zincfree;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2condclass TO test;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2condclass TO adminprivate;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2condclass TO admin;&lt;br /&gt;
&lt;br /&gt;
  create table clinical2.ct2condition (like clinical1.ct2condition including defaults including constraints including indexes);&lt;br /&gt;
  alter table clinical2.ct2condition add constraint ct2condition_condclass_fk_fkey foreign key (condclass_fk) references clinical2.ct2condclass(ct2condclass_id);&lt;br /&gt;
  REVOKE ALL ON TABLE clinical2.ct2condition FROM PUBLIC;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2condition TO root;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2condition TO zincread;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2condition TO zincfree;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2condition TO test;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2condition TO adminprivate;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2condition TO admin;&lt;br /&gt;
&lt;br /&gt;
  create table clinical2.ct2tocond (like clinical1.ct2tocond including defaults including constraints including indexes);&lt;br /&gt;
  alter table clinical2.ct2tocond add constraint ct2tocond_ct2condition_fk_fkey foreign key (ct2condition_fk) references clinical2.ct2condition(ct2condition_id);&lt;br /&gt;
  alter table clinical2.ct2tocond add constraint ct2tocond_ct2_fk_fkey foreign key (ct2_fk) references clinical2.ct2(ct2_id);&lt;br /&gt;
  REVOKE ALL ON TABLE clinical2.ct2tocond FROM PUBLIC;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2tocond TO root;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2tocond TO zincread;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2tocond TO zincfree;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2tocond TO test;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2tocond TO adminprivate;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2tocond TO admin;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
  create table clinical2.ct2int (like clinical1.ct2int including defaults including constraints including indexes);&lt;br /&gt;
  alter table clinical2.ct2int drop column ct2_fk;&lt;br /&gt;
  REVOKE ALL ON TABLE clinical2.ct2int FROM PUBLIC;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2int TO root;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2int TO zincread;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2int TO zincfree;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2int TO test;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2int TO adminprivate;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2int TO admin;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
  create table clinical2.ct2subint (like clinical1.ct2subint including defaults including constraints including indexes);&lt;br /&gt;
  alter table clinical2.ct2subint add constraint ct2subint_ct2int_fk_fkey foreign key (ct2int_fk) references clinical2.ct2int(ct2int_id);&lt;br /&gt;
  alter table clinical2.ct2subint add constraint ct2subint_sub_id_fk_fkey foreign key (sub_id_fk) references substance(sub_id) ON UPDATE CASCADE ON DELETE CASCADE DEFERRABLE INITIALLY DEFERRED;&lt;br /&gt;
  REVOKE ALL ON TABLE clinical2.ct2subint FROM PUBLIC;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2subint TO root;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2subint TO zincread;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2subint TO zincfree;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2subint TO test;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2subint TO adminprivate;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2subint TO admin;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
  create table clinical2.ct2toint (ct2toint_id integer primary key,&lt;br /&gt;
                                 ct2_fk integer references clinical2.ct2,&lt;br /&gt;
                                 ct2int_fk integer references clinical2.ct2int);&lt;br /&gt;
&lt;br /&gt;
  create sequence clinical2.ct2toint_ct2toint_seq;&lt;br /&gt;
  alter table clinical2.ct2toint alter column ct2toint_id set default nextval(&#039;clinical2.ct2toint_ct2toint_seq&#039;::regclass);&lt;br /&gt;
  REVOKE ALL ON TABLE clinical2.ct2toint_ct2toint_seq FROM PUBLIC;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2toint_ct2toint_seq TO root;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2toint_ct2toint_seq TO zincread;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2toint_ct2toint_seq TO zincfree;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2toint_ct2toint_seq TO test;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2toint_ct2toint_seq TO adminprivate;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2toint_ct2toint_seq TO admin;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
  REVOKE ALL ON TABLE clinical2.ct2toint FROM PUBLIC;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2toint TO root;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2toint TO zincread;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2toint TO zincfree;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2toint TO test;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2toint TO adminprivate;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2toint TO admin;&lt;br /&gt;
&lt;br /&gt;
 3. Load data from existing tables&lt;br /&gt;
&lt;br /&gt;
  insert into clinical2.ctstatus select * from clinical1.ctstatus;&lt;br /&gt;
  insert into clinical2.ctphase select * from clinical1.ctphase;&lt;br /&gt;
  insert into clinical2.ct2condclass select * from clinical1.ct2condclass;&lt;br /&gt;
&lt;br /&gt;
 4. Match substance names (tempsubname.name) against intervention names (clinical2.ct2int.name)&lt;br /&gt;
 &lt;br /&gt;
  create temp table tempsubname&lt;br /&gt;
   as select sub_id_fk as sub_id_fk, who_name as name&lt;br /&gt;
      from catalog_item join catalog on (cat_id_fk=cat_id and short_name=&#039;chembl20&#039;)&lt;br /&gt;
                        join chembl20.molecule_dictionary as md on supplier_code=md.chembl_id&lt;br /&gt;
                        join chembl20.molecule_atc_classification as mac on md.molregno=mac.molregno&lt;br /&gt;
                        join chembl20.atc_classification as ac on mac.level5=ac.level5;&lt;br /&gt;
&lt;br /&gt;
 insert into tempsubname&lt;br /&gt;
   select cs.sub_id_fk, sy.synonym&lt;br /&gt;
   from catalog_substance as cs join synonym as sy on cs.cat_content_fk = sy.cat_content_fk&lt;br /&gt;
   where not exists (select 1 from tempsubname as sn where sn.sub_id_fk = cs.sub_id_fk and sn.name = sy.synonym);&lt;br /&gt;
 &lt;br /&gt;
 alter table tempsubname add column q tsquery;&lt;br /&gt;
&lt;br /&gt;
 update tempsubname set q = plainto_tsquery(name);&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
  alter table clinical2.ct2int add column terms tsvector;&lt;br /&gt;
  update clinical2.ct2int set terms = to_tsvector(&#039;english&#039;, name);&lt;br /&gt;
&lt;br /&gt;
 insert into clinical2.ct2subint (sub_id_fk, ct2int_fk)&lt;br /&gt;
   select distinct sub.sub_id_fk, int.ct2int_id&lt;br /&gt;
   from clinical2.ct2int as int join tempsubname as sub on int.terms @@ sub.q&lt;br /&gt;
   where not exists (select 1 from clinical2.ct2subint as si&lt;br /&gt;
                     where si.sub_id_fk = sub.sub_id_fk and si.ct2int_fk = int.ct2int_id);&lt;br /&gt;
&lt;br /&gt;
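The matching in step 4 relies on Postgres full-text search: each intervention&#039;s tsvector (terms) is probed with each substance-name tsquery (q) via the @@ operator. A minimal illustration with hypothetical names, independent of these tables:&lt;br /&gt;

```sql
-- Hypothetical values showing the terms @@ tsquery match used above
select to_tsvector(&#039;english&#039;, &#039;Aspirin 81 mg oral tablet&#039;)
       @@ plainto_tsquery(&#039;english&#039;, &#039;aspirin&#039;) as matches;  -- returns true
```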
&lt;br /&gt;
 5. Warehousing&lt;br /&gt;
&lt;br /&gt;
  update clinical2.ct2int as ct2int set num_trials = (select&lt;br /&gt;
    count(*) from clinical2.ct2toint as ct2toint where&lt;br /&gt;
     ct2int.ct2int_id = ct2toint.ct2int_fk);&lt;br /&gt;
&lt;br /&gt;
  update clinical2.ct2int as ct2int set num_substances = (select&lt;br /&gt;
    count(*) from clinical2.ct2subint as ct2subint where&lt;br /&gt;
     ct2int.ct2int_id = ct2subint.ct2int_fk);&lt;br /&gt;
&lt;br /&gt;
  update clinical2.ct2condition as ct2cond set num_trials = (select&lt;br /&gt;
    count(*) from clinical2.ct2tocond as ct2tocond where&lt;br /&gt;
     ct2cond.ct2condition_id = ct2tocond.ct2condition_fk);&lt;br /&gt;
&lt;br /&gt;
  update clinical2.ct2condition as ct2cond set num_substances = (select&lt;br /&gt;
    count(distinct(ct2subint.sub_id_fk)) from clinical2.ct2tocond as ct2tocond, clinical2.ct2toint as ct2toint, clinical2.ct2subint as ct2subint where&lt;br /&gt;
     ct2cond.ct2condition_id = ct2tocond.ct2condition_fk and ct2tocond.ct2_fk = ct2toint.ct2_fk and ct2toint.ct2int_fk=ct2subint.ct2int_fk);&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Clinical_Trials_Loading&amp;diff=10353</id>
		<title>Clinical Trials Loading</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Clinical_Trials_Loading&amp;diff=10353"/>
		<updated>2017-09-28T17:34:41Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt; 1. Relevant Files:&lt;br /&gt;
    /nfs/db/trials/important --&amp;gt; clinical trials raw data &lt;br /&gt;
    /nfs/home/teague/work/Projects/trials/extract.py --&amp;gt; script to clean the raw data&lt;br /&gt;
    http://wiki.docking.org/index.php/Creating_clinical_name_mappings --&amp;gt; how to create name_mappings&lt;br /&gt;
&lt;br /&gt;
 2. The SQL statements that create the new clinical trials schema and tables live in the zinc code:&lt;br /&gt;
&lt;br /&gt;
    zinc/SQL_statement/clinical_trial.sql&lt;br /&gt;
&lt;br /&gt;
    The SQL statements are reproduced below.&lt;br /&gt;
&lt;br /&gt;
 3. Load the raw data into the database tables:&lt;br /&gt;
    Use --wipe to delete all existing data first;&lt;br /&gt;
    otherwise data is loaded incrementally into the existing tables.&lt;br /&gt;
&lt;br /&gt;
    source /nfs/soft/www/apps/zinc15/envs/dev/bin/activate&lt;br /&gt;
    python /nfs/soft/www/apps/zinc15/envs/dev/bin/zinc-manage utils clinical-trials load_ct /nfs/db/trials/important/studies.txt &lt;br /&gt;
    python /nfs/soft/www/apps/zinc15/envs/dev/bin/zinc-manage utils clinical-trials load_condition /nfs/db/trials/important/browse_conditions.txt &lt;br /&gt;
    python /nfs/soft/www/apps/zinc15/envs/dev/bin/zinc-manage utils clinical-trials load_intervention /nfs/db/trials/important/interventions.txt&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
  ------------------------   zinc/SQL_statement/clinical_trial.sql  ----------------------------------------&lt;br /&gt;
&lt;br /&gt;
 1. Create the schema and grant usage&lt;br /&gt;
&lt;br /&gt;
  BEGIN;&lt;br /&gt;
  create schema clinical2;&lt;br /&gt;
  GRANT USAGE on schema clinical2 to root;&lt;br /&gt;
  GRANT USAGE on schema clinical2 to zincread;&lt;br /&gt;
  GRANT USAGE on schema clinical2 to zincfree;&lt;br /&gt;
  GRANT USAGE on schema clinical2 to test;&lt;br /&gt;
  GRANT USAGE on schema clinical2 to admin;&lt;br /&gt;
  GRANT USAGE on schema clinical2 to adminprivate;&lt;br /&gt;
&lt;br /&gt;
 2. Create the tables and grant privileges&lt;br /&gt;
&lt;br /&gt;
  BEGIN;&lt;br /&gt;
  create table clinical2.ctstatus (like clinical1.ctstatus including defaults including constraints including indexes);&lt;br /&gt;
  REVOKE ALL ON TABLE clinical2.ctstatus FROM PUBLIC;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ctstatus TO root;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ctstatus TO zincread;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ctstatus TO zincfree;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ctstatus TO test;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ctstatus TO adminprivate;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ctstatus TO admin;&lt;br /&gt;
&lt;br /&gt;
  create table clinical2.ctphase (like clinical1.ctphase including defaults including constraints including indexes);&lt;br /&gt;
  REVOKE ALL ON TABLE clinical2.ctphase FROM PUBLIC;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ctphase TO root;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ctphase TO zincread;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ctphase TO zincfree;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ctphase TO test;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ctphase TO adminprivate;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ctphase TO admin;&lt;br /&gt;
&lt;br /&gt;
  create table clinical2.ct2 (like clinical1.ct2 including defaults including constraints including indexes);&lt;br /&gt;
  alter table clinical2.ct2 add constraint ct2_ctphase_fk_fkey foreign key (ctphase_fk) references clinical2.ctphase(ctphase_id);&lt;br /&gt;
  alter table clinical2.ct2 add constraint ct2_ctstatus_fk_fkey foreign key (ctstatus_fk) references clinical2.ctstatus(ctstatus_id);&lt;br /&gt;
  alter table clinical2.ct2 add column changed_date date;&lt;br /&gt;
&lt;br /&gt;
  REVOKE ALL ON TABLE clinical2.ct2 FROM PUBLIC;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2 TO root;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2 TO zincread;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2 TO zincfree;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2 TO test;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2 TO adminprivate;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2 TO admin;&lt;br /&gt;
&lt;br /&gt;
  create table clinical2.ct2condclass (like clinical1.ct2condclass including defaults including constraints including indexes);&lt;br /&gt;
  REVOKE ALL ON TABLE clinical2.ct2condclass FROM PUBLIC;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2condclass TO root;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2condclass TO zincread;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2condclass TO zincfree;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2condclass TO test;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2condclass TO adminprivate;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2condclass TO admin;&lt;br /&gt;
&lt;br /&gt;
  create table clinical2.ct2condition (like clinical1.ct2condition including defaults including constraints including indexes);&lt;br /&gt;
  alter table clinical2.ct2condition add constraint ct2condition_condclass_fk_fkey foreign key (condclass_fk) references clinical2.ct2condclass(ct2condclass_id);&lt;br /&gt;
  REVOKE ALL ON TABLE clinical2.ct2condition FROM PUBLIC;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2condition TO root;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2condition TO zincread;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2condition TO zincfree;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2condition TO test;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2condition TO adminprivate;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2condition TO admin;&lt;br /&gt;
&lt;br /&gt;
  create table clinical2.ct2tocond (like clinical1.ct2tocond including defaults including constraints including indexes);&lt;br /&gt;
  alter table clinical2.ct2tocond add constraint ct2tocond_ct2condition_fk_fkey foreign key (ct2condition_fk) references clinical2.ct2condition(ct2condition_id);&lt;br /&gt;
  alter table clinical2.ct2tocond add constraint ct2tocond_ct2_fk_fkey foreign key (ct2_fk) references clinical2.ct2(ct2_id);&lt;br /&gt;
  REVOKE ALL ON TABLE clinical2.ct2tocond FROM PUBLIC;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2tocond TO root;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2tocond TO zincread;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2tocond TO zincfree;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2tocond TO test;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2tocond TO adminprivate;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2tocond TO admin;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
  create table clinical2.ct2int (like clinical1.ct2int including defaults including constraints including indexes);&lt;br /&gt;
  alter table clinical2.ct2int drop column ct2_fk;&lt;br /&gt;
  REVOKE ALL ON TABLE clinical2.ct2int FROM PUBLIC;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2int TO root;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2int TO zincread;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2int TO zincfree;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2int TO test;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2int TO adminprivate;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2int TO admin;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
  create table clinical2.ct2subint (like clinical1.ct2subint including defaults including constraints including indexes);&lt;br /&gt;
  alter table clinical2.ct2subint add constraint ct2subint_ct2int_fk_fkey foreign key (ct2int_fk) references clinical2.ct2int(ct2int_id);&lt;br /&gt;
  alter table clinical2.ct2subint add constraint ct2subint_sub_id_fk_fkey foreign key (sub_id_fk) references substance(sub_id) ON UPDATE CASCADE ON DELETE CASCADE DEFERRABLE INITIALLY DEFERRED;&lt;br /&gt;
  REVOKE ALL ON TABLE clinical2.ct2subint FROM PUBLIC;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2subint TO root;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2subint TO zincread;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2subint TO zincfree;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2subint TO test;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2subint TO adminprivate;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2subint TO admin;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
  create table clinical2.ct2toint (ct2toint_id integer primary key,&lt;br /&gt;
                                 ct2_fk integer references clinical2.ct2,&lt;br /&gt;
                                 ct2int_fk integer references clinical2.ct2int);&lt;br /&gt;
&lt;br /&gt;
  create sequence clinical2.ct2toint_ct2toint_seq;&lt;br /&gt;
  alter table clinical2.ct2toint alter column ct2toint_id set default nextval(&#039;clinical2.ct2toint_ct2toint_seq&#039;::regclass);&lt;br /&gt;
  REVOKE ALL ON TABLE clinical2.ct2toint_ct2toint_seq FROM PUBLIC;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2toint_ct2toint_seq TO root;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2toint_ct2toint_seq TO zincread;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2toint_ct2toint_seq TO zincfree;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2toint_ct2toint_seq TO test;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2toint_ct2toint_seq TO adminprivate;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2toint_ct2toint_seq TO admin;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
  REVOKE ALL ON TABLE clinical2.ct2toint FROM PUBLIC;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2toint TO root;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2toint TO zincread;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2toint TO zincfree;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2toint TO test;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2toint TO adminprivate;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2toint TO admin;&lt;br /&gt;
&lt;br /&gt;
 3. Load data from existing tables&lt;br /&gt;
&lt;br /&gt;
  insert into clinical2.ctstatus select * from clinical1.ctstatus;&lt;br /&gt;
  insert into clinical2.ctphase select * from clinical1.ctphase;&lt;br /&gt;
  insert into clinical2.ct2condclass select * from clinical1.ct2condclass;&lt;br /&gt;
&lt;br /&gt;
 4. Match substance names (tempsubname.name) against intervention names (clinical2.ct2int.name)&lt;br /&gt;
 &lt;br /&gt;
  create temp table tempsubname&lt;br /&gt;
   as select sub_id_fk as sub_id_fk, who_name as name&lt;br /&gt;
      from catalog_item join catalog on (cat_id_fk=cat_id and short_name=&#039;chembl20&#039;)&lt;br /&gt;
                        join chembl20.molecule_dictionary as md on supplier_code=md.chembl_id&lt;br /&gt;
                        join chembl20.molecule_atc_classification as mac on md.molregno=mac.molregno&lt;br /&gt;
                        join chembl20.atc_classification as ac on mac.level5=ac.level5;&lt;br /&gt;
&lt;br /&gt;
 insert into tempsubname&lt;br /&gt;
   select cs.sub_id_fk, sy.synonym&lt;br /&gt;
   from catalog_substance as cs join synonym as sy on cs.cat_content_fk = sy.cat_content_fk&lt;br /&gt;
   where not exists (select 1 from tempsubname as sn where sn.sub_id_fk = cs.sub_id_fk and sn.name = sy.synonym);&lt;br /&gt;
 &lt;br /&gt;
 alter table tempsubname add column q tsquery;&lt;br /&gt;
&lt;br /&gt;
 update tempsubname set q = plainto_tsquery(name);&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
  alter table clinical2.ct2int add column terms tsvector;&lt;br /&gt;
  update clinical2.ct2int set terms = to_tsvector(&#039;english&#039;, name);&lt;br /&gt;
&lt;br /&gt;
 insert into clinical2.ct2subint (sub_id_fk, ct2int_fk)&lt;br /&gt;
   select distinct sub.sub_id_fk, int.ct2int_id&lt;br /&gt;
   from clinical2.ct2int as int join tempsubname as sub on int.terms @@ sub.q&lt;br /&gt;
   where not exists (select 1 from clinical2.ct2subint as si&lt;br /&gt;
                     where si.sub_id_fk = sub.sub_id_fk and si.ct2int_fk = int.ct2int_id);&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 5. Warehousing&lt;br /&gt;
&lt;br /&gt;
  update clinical2.ct2int as ct2int set num_trials = (select&lt;br /&gt;
    count(*) from clinical2.ct2toint as ct2toint where&lt;br /&gt;
     ct2int.ct2int_id = ct2toint.ct2int_fk);&lt;br /&gt;
&lt;br /&gt;
  update clinical2.ct2int as ct2int set num_substances = (select&lt;br /&gt;
    count(*) from clinical2.ct2subint as ct2subint where&lt;br /&gt;
     ct2int.ct2int_id = ct2subint.ct2int_fk);&lt;br /&gt;
&lt;br /&gt;
  update clinical2.ct2condition as ct2cond set num_trials = (select&lt;br /&gt;
    count(*) from clinical2.ct2tocond as ct2tocond where&lt;br /&gt;
     ct2cond.ct2condition_id = ct2tocond.ct2condition_fk);&lt;br /&gt;
&lt;br /&gt;
  update clinical2.ct2condition as ct2cond set num_substances = (select&lt;br /&gt;
    count(distinct(ct2subint.sub_id_fk)) from clinical2.ct2tocond as ct2tocond, clinical2.ct2toint as ct2toint, clinical2.ct2subint as ct2subint where&lt;br /&gt;
     ct2cond.ct2condition_id = ct2tocond.ct2condition_fk and ct2tocond.ct2_fk = ct2toint.ct2_fk and ct2toint.ct2int_fk=ct2subint.ct2int_fk);&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Clinical_Trials_Loading&amp;diff=10284</id>
		<title>Clinical Trials Loading</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Clinical_Trials_Loading&amp;diff=10284"/>
		<updated>2017-09-07T16:23:36Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt; 1. Relevant Files:&lt;br /&gt;
    /nfs/db/trials/important --&amp;gt; clinical trials raw data &lt;br /&gt;
    /nfs/home/teague/work/Projects/trials/extract.py --&amp;gt; script to clean the raw data&lt;br /&gt;
    http://wiki.docking.org/index.php/Creating_clinical_name_mappings --&amp;gt; how to create name_mappings&lt;br /&gt;
&lt;br /&gt;
 2. The SQL statements that create the new clinical trials schema and tables live in the zinc code:&lt;br /&gt;
&lt;br /&gt;
    zinc/SQL_statement/clinical_trial.sql&lt;br /&gt;
&lt;br /&gt;
    The SQL statements are reproduced below.&lt;br /&gt;
&lt;br /&gt;
 3. Load the raw data into the database tables:&lt;br /&gt;
    Use --wipe to delete all existing data first;&lt;br /&gt;
    otherwise data is loaded incrementally into the existing tables.&lt;br /&gt;
&lt;br /&gt;
    source /nfs/soft/www/apps/zinc15/envs/dev/bin/activate&lt;br /&gt;
    python /nfs/soft/www/apps/zinc15/envs/dev/bin/zinc-manage utils clinical-trials load_ct /nfs/db/trials/important/studies.txt &lt;br /&gt;
    python /nfs/soft/www/apps/zinc15/envs/dev/bin/zinc-manage utils clinical-trials load_condition /nfs/db/trials/important/browse_conditions.txt &lt;br /&gt;
    python /nfs/soft/www/apps/zinc15/envs/dev/bin/zinc-manage utils clinical-trials load_intervention /nfs/db/trials/important/interventions.txt&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
  ------------------------   zinc/SQL_statement/clinical_trial.sql  ----------------------------------------&lt;br /&gt;
&lt;br /&gt;
 1. Create the schema and grant usage&lt;br /&gt;
&lt;br /&gt;
  BEGIN;&lt;br /&gt;
  create schema clinical2;&lt;br /&gt;
  GRANT USAGE on schema clinical2 to root;&lt;br /&gt;
  GRANT USAGE on schema clinical2 to zincread;&lt;br /&gt;
  GRANT USAGE on schema clinical2 to zincfree;&lt;br /&gt;
  GRANT USAGE on schema clinical2 to test;&lt;br /&gt;
  GRANT USAGE on schema clinical2 to admin;&lt;br /&gt;
  GRANT USAGE on schema clinical2 to adminprivate;&lt;br /&gt;
&lt;br /&gt;
 2. Create the tables and grant privileges&lt;br /&gt;
&lt;br /&gt;
  BEGIN;&lt;br /&gt;
  create table clinical2.ctstatus (like clinical1.ctstatus including defaults including constraints including indexes);&lt;br /&gt;
  REVOKE ALL ON TABLE clinical2.ctstatus FROM PUBLIC;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ctstatus TO root;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ctstatus TO zincread;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ctstatus TO zincfree;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ctstatus TO test;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ctstatus TO adminprivate;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ctstatus TO admin;&lt;br /&gt;
&lt;br /&gt;
  create table clinical2.ctphase (like clinical1.ctphase including defaults including constraints including indexes);&lt;br /&gt;
  REVOKE ALL ON TABLE clinical2.ctphase FROM PUBLIC;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ctphase TO root;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ctphase TO zincread;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ctphase TO zincfree;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ctphase TO test;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ctphase TO adminprivate;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ctphase TO admin;&lt;br /&gt;
&lt;br /&gt;
  create table clinical2.ct2 (like clinical1.ct2 including defaults including constraints including indexes);&lt;br /&gt;
  alter table clinical2.ct2 add constraint ct2_ctphase_fk_fkey foreign key (ctphase_fk) references clinical2.ctphase(ctphase_id);&lt;br /&gt;
  alter table clinical2.ct2 add constraint ct2_ctstatus_fk_fkey foreign key (ctstatus_fk) references clinical2.ctstatus(ctstatus_id);&lt;br /&gt;
  alter table clinical2.ct2 add column changed_date date;&lt;br /&gt;
&lt;br /&gt;
  REVOKE ALL ON TABLE clinical2.ct2 FROM PUBLIC;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2 TO root;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2 TO zincread;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2 TO zincfree;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2 TO test;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2 TO adminprivate;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2 TO admin;&lt;br /&gt;
&lt;br /&gt;
  create table clinical2.ct2condclass (like clinical1.ct2condclass including defaults including constraints including indexes);&lt;br /&gt;
  REVOKE ALL ON TABLE clinical2.ct2condclass FROM PUBLIC;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2condclass TO root;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2condclass TO zincread;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2condclass TO zincfree;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2condclass TO test;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2condclass TO adminprivate;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2condclass TO admin;&lt;br /&gt;
&lt;br /&gt;
  create table clinical2.ct2condition (like clinical1.ct2condition including defaults including constraints including indexes);&lt;br /&gt;
  alter table clinical2.ct2condition add constraint ct2condition_condclass_fk_fkey foreign key (condclass_fk) references clinical2.ct2condclass(ct2condclass_id);&lt;br /&gt;
  REVOKE ALL ON TABLE clinical2.ct2condition FROM PUBLIC;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2condition TO root;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2condition TO zincread;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2condition TO zincfree;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2condition TO test;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2condition TO adminprivate;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2condition TO admin;&lt;br /&gt;
&lt;br /&gt;
  create table clinical2.ct2tocond (like clinical1.ct2tocond including defaults including constraints including indexes);&lt;br /&gt;
  alter table clinical2.ct2tocond add constraint ct2tocond_ct2condition_fk_fkey foreign key (ct2condition_fk) references clinical2.ct2condition(ct2condition_id);&lt;br /&gt;
  alter table clinical2.ct2tocond add constraint ct2tocond_ct2_fk_fkey foreign key (ct2_fk) references clinical2.ct2(ct2_id);&lt;br /&gt;
  REVOKE ALL ON TABLE clinical2.ct2tocond FROM PUBLIC;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2tocond TO root;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2tocond TO zincread;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2tocond TO zincfree;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2tocond TO test;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2tocond TO adminprivate;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2tocond TO admin;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
  create table clinical2.ct2int (like clinical1.ct2int including defaults including constraints including indexes);&lt;br /&gt;
  alter table clinical2.ct2int drop column ct2_fk;&lt;br /&gt;
  REVOKE ALL ON TABLE clinical2.ct2int FROM PUBLIC;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2int TO root;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2int TO zincread;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2int TO zincfree;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2int TO test;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2int TO adminprivate;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2int TO admin;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
  create table clinical2.ct2subint (like clinical1.ct2subint including defaults including constraints including indexes);&lt;br /&gt;
  alter table clinical2.ct2subint add constraint ct2subint_ct2int_fk_fkey foreign key (ct2int_fk) references clinical2.ct2int(ct2int_id);&lt;br /&gt;
  alter table clinical2.ct2subint add constraint ct2subint_sub_id_fk_fkey foreign key (sub_id_fk) references substance(sub_id) ON UPDATE CASCADE ON DELETE CASCADE DEFERRABLE INITIALLY DEFERRED;&lt;br /&gt;
  REVOKE ALL ON TABLE clinical2.ct2subint FROM PUBLIC;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2subint TO root;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2subint TO zincread;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2subint TO zincfree;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2subint TO test;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2subint TO adminprivate;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2subint TO admin;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
  create table clinical2.ct2toint (ct2toint_id integer not null primary key,&lt;br /&gt;
                                 ct2_fk integer references clinical2.ct2,&lt;br /&gt;
                                 ct2int_fk integer references clinical2.ct2int);&lt;br /&gt;
&lt;br /&gt;
  create sequence clinical2.ct2toint_ct2toint_seq;&lt;br /&gt;
  alter table clinical2.ct2toint alter column ct2toint_id set default nextval(&#039;clinical2.ct2toint_ct2toint_seq&#039;::regclass);&lt;br /&gt;
  REVOKE ALL ON TABLE clinical2.ct2toint_ct2toint_seq FROM PUBLIC;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2toint_ct2toint_seq TO root;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2toint_ct2toint_seq TO zincread;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2toint_ct2toint_seq TO zincfree;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2toint_ct2toint_seq TO test;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2toint_ct2toint_seq TO adminprivate;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2toint_ct2toint_seq TO admin;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
  REVOKE ALL ON TABLE clinical2.ct2toint FROM PUBLIC;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2toint TO root;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2toint TO zincread;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2toint TO zincfree;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2toint TO test;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2toint TO adminprivate;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2toint TO admin;&lt;br /&gt;
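The same REVOKE/GRANT boilerplate is repeated verbatim for every clinical2 table above. A hypothetical Python helper that generates that block for a given table (role names and privileges copied from the statements above):

```python
# Hypothetical generator for the per-table privilege boilerplate above.
# Roles and privileges are taken from the GRANT statements in this page.
def grant_statements(table):
    roles = [("root", "ALL"), ("zincread", "SELECT"), ("zincfree", "SELECT"),
             ("test", "ALL"), ("adminprivate", "ALL"), ("admin", "ALL")]
    stmts = ["REVOKE ALL ON TABLE %s FROM PUBLIC;" % table]
    stmts += ["GRANT %s ON TABLE %s TO %s;" % (priv, table, role)
              for role, priv in roles]
    return stmts
```

For example, grant_statements("clinical2.ctstatus") reproduces the seven statements shown for that table.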
&lt;br /&gt;
 3. Load data from existing tables&lt;br /&gt;
&lt;br /&gt;
  insert into clinical2.ctstatus select * from clinical1.ctstatus;&lt;br /&gt;
  insert into clinical2.ctphase select * from clinical1.ctphase;&lt;br /&gt;
  insert into clinical2.ct2condclass select * from clinical1.ct2condclass;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 4. Matching names between subname.name and intervention.name&lt;br /&gt;
&lt;br /&gt;
  create temp table subname&lt;br /&gt;
    as select sub_id_fk as sub_id_fk, who_name as name&lt;br /&gt;
       from catalog_item join catalog on (cat_id_fk=cat_id and short_name=&#039;chembl20&#039;)&lt;br /&gt;
                         join chembl20.molecule_dictionary as md on supplier_code=md.chembl_id&lt;br /&gt;
                         join chembl20.molecule_atc_classification as mac on md.molregno=mac.molregno&lt;br /&gt;
                         join chembl20.atc_classification as ac on mac.level5=ac.level5;&lt;br /&gt;
&lt;br /&gt;
  insert into subname&lt;br /&gt;
    select cs.sub_id_fk, sy.synonym&lt;br /&gt;
    from catalog_substance as cs join synonym as sy on cs.cat_content_fk = sy.cat_content_fk&lt;br /&gt;
    where not exists (select 1 from subname as sn where sn.sub_id_fk = cs.sub_id_fk and sn.name = sy.synonym);&lt;br /&gt;
&lt;br /&gt;
  alter table subname add column q tsquery;&lt;br /&gt;
&lt;br /&gt;
  update subname as s&lt;br /&gt;
   set q=plainto_tsquery(t.name)&lt;br /&gt;
   from subname as t&lt;br /&gt;
   where s.sub_id_fk=t.sub_id_fk and s.name=t.name;&lt;br /&gt;
&lt;br /&gt;
  alter table clinical2.ct2int add column terms tsvector;&lt;br /&gt;
  update clinical2.ct2int set terms=to_tsvector(&#039;english&#039;, name);&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
  insert into clinical2.ct2subint (sub_id_fk, ct2int_fk)&lt;br /&gt;
    select distinct sub.sub_id_fk, int.ct2int_id&lt;br /&gt;
    from clinical2.ct2int as int join subname as sub on&lt;br /&gt;
                         int.terms @@ sub.q;&lt;br /&gt;
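The terms @@ q match above can be sketched in plain Python (hypothetical and simplified: Postgres full-text search also stems and normalizes words, while this version just checks that every word of the substance name occurs in the intervention text):

```python
# Simplified stand-in for: intervention.terms @@ plainto_tsquery(substance name)
def tokens(text):
    return set(text.lower().split())

def match_substances(interventions, subnames):
    """interventions: {ct2int_id: name}; subnames: {sub_id_fk: name}.
    Returns distinct (sub_id_fk, ct2int_id) pairs, like the INSERT above."""
    pairs = set()
    for int_id, int_text in interventions.items():
        terms = tokens(int_text)
        for sub_id, sub_name in subnames.items():
            words = tokens(sub_name)
            # all words of the name must appear among the intervention's terms
            if words and words.issubset(terms):
                pairs.add((sub_id, int_id))
    return pairs
```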
&lt;br /&gt;
 5. Warehousing&lt;br /&gt;
&lt;br /&gt;
  update clinical2.ct2int as ct2int set num_trials = (select&lt;br /&gt;
    count(*) from clinical2.ct2toint as ct2toint where&lt;br /&gt;
     ct2int.ct2int_id = ct2toint.ct2int_fk);&lt;br /&gt;
&lt;br /&gt;
  update clinical2.ct2int as ct2int set num_substances = (select&lt;br /&gt;
    count(*) from clinical2.ct2subint as ct2subint where&lt;br /&gt;
     ct2int.ct2int_id = ct2subint.ct2int_fk);&lt;br /&gt;
&lt;br /&gt;
  update clinical2.ct2condition as ct2cond set num_trials = (select&lt;br /&gt;
    count(*) from clinical2.ct2tocond as ct2tocond where&lt;br /&gt;
     ct2cond.ct2condition_id = ct2tocond.ct2condition_fk);&lt;br /&gt;
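The num_trials warehousing updates above are plain grouped counts over the link tables; a minimal Python sketch of the same computation (per-intervention case):

```python
# Equivalent of: count(*) from ct2toint grouped by ct2int_fk,
# written against in-memory rows instead of the database.
from collections import Counter

def num_trials_per_intervention(ct2toint_rows):
    """ct2toint_rows: iterable of (ct2_fk, ct2int_fk) pairs."""
    return Counter(ct2int_fk for _, ct2int_fk in ct2toint_rows)
```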
&lt;br /&gt;
  update clinical2.ct2condition as ct2cond set num_substances = (select&lt;br /&gt;
    count(distinct(ct2subint.sub_id_fk)) from clinical2.ct2tocond as ct2tocond, clinical2.ct2toint as ct2toint, clinical2.ct2subint as ct2subint where&lt;br /&gt;
     ct2cond.ct2condition_id = ct2tocond.ct2condition_fk and ct2tocond.ct2_fk = ct2toint.ct2_fk and ct2toint.ct2int_fk=ct2subint.ct2int_fk);&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Clinical_Trials_Loading&amp;diff=10274</id>
		<title>Clinical Trials Loading</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Clinical_Trials_Loading&amp;diff=10274"/>
		<updated>2017-08-24T20:18:50Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt; 1. Relevant Files:&lt;br /&gt;
    /nfs/db/trials/important --&amp;gt; clinical trials raw data &lt;br /&gt;
    /nfs/home/teague/work/Projects/trials/extract.py --&amp;gt; script to clean the raw data&lt;br /&gt;
    http://wiki.docking.org/index.php/Creating_clinical_name_mappings --&amp;gt; how to create name_mappings&lt;br /&gt;
&lt;br /&gt;
 2. SQL queries to create a new clinical trial schema and table are located in zinc code:&lt;br /&gt;
&lt;br /&gt;
    zinc/SQL_statement/clinical_trial.sql&lt;br /&gt;
&lt;br /&gt;
    The SQL statements are reproduced below.&lt;br /&gt;
&lt;br /&gt;
 3. Load raw data to the database tables:&lt;br /&gt;
    If you want to delete all the existing data first, use --wipe;&lt;br /&gt;
    otherwise data is loaded incrementally on top of the existing data.&lt;br /&gt;
&lt;br /&gt;
    source /nfs/soft/www/apps/zinc15/envs/dev/bin/activate&lt;br /&gt;
    python /nfs/soft/www/apps/zinc15/envs/dev/bin/zinc-manage utils clinical-trials load_ct /nfs/db/trials/important/studies.txt &lt;br /&gt;
    python /nfs/soft/www/apps/zinc15/envs/dev/bin/zinc-manage utils clinical-trials load_condition /nfs/db/trials/important/browse_conditions.txt &lt;br /&gt;
    python /nfs/soft/www/apps/zinc15/envs/dev/bin/zinc-manage utils clinical-trials load_intervention /nfs/db/trials/important/interventions.txt&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
  ------------------------   zinc/SQL_statement/clinical_trial.sql  ----------------------------------------&lt;br /&gt;
&lt;br /&gt;
 1. Creating the schema and giving grants&lt;br /&gt;
&lt;br /&gt;
  BEGIN;&lt;br /&gt;
  create schema clinical2;&lt;br /&gt;
  GRANT USAGE on schema clinical2 to root;&lt;br /&gt;
  GRANT USAGE on schema clinical2 to zincread;&lt;br /&gt;
  GRANT USAGE on schema clinical2 to zincfree;&lt;br /&gt;
  GRANT USAGE on schema clinical2 to test;&lt;br /&gt;
  GRANT USAGE on schema clinical2 to admin;&lt;br /&gt;
  GRANT USAGE on schema clinical2 to adminprivate;&lt;br /&gt;
&lt;br /&gt;
 2. Creating the tables and giving grants&lt;br /&gt;
&lt;br /&gt;
  BEGIN;&lt;br /&gt;
  create table clinical2.ctstatus (like clinical1.ctstatus including defaults including constraints including indexes);&lt;br /&gt;
  REVOKE ALL ON TABLE clinical2.ctstatus FROM PUBLIC;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ctstatus TO root;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ctstatus TO zincread;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ctstatus TO zincfree;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ctstatus TO test;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ctstatus TO adminprivate;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ctstatus TO admin;&lt;br /&gt;
&lt;br /&gt;
  create table clinical2.ctphase (like clinical1.ctphase including defaults including constraints including indexes);&lt;br /&gt;
  REVOKE ALL ON TABLE clinical2.ctphase FROM PUBLIC;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ctphase TO root;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ctphase TO zincread;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ctphase TO zincfree;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ctphase TO test;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ctphase TO adminprivate;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ctphase TO admin;&lt;br /&gt;
&lt;br /&gt;
  create table clinical2.ct2 (like clinical1.ct2 including defaults including constraints including indexes);&lt;br /&gt;
  alter table clinical2.ct2 add constraint ct2_ctphase_fk_fkey foreign key (ctphase_fk) references clinical2.ctphase(ctphase_id);&lt;br /&gt;
  alter table clinical2.ct2 add constraint ct2_ctstatus_fk_fkey foreign key (ctstatus_fk) references clinical2.ctstatus(ctstatus_id);&lt;br /&gt;
  alter table clinical2.ct2 add column changed_date date;&lt;br /&gt;
&lt;br /&gt;
  REVOKE ALL ON TABLE clinical2.ct2 FROM PUBLIC;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2 TO root;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2 TO zincread;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2 TO zincfree;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2 TO test;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2 TO adminprivate;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2 TO admin;&lt;br /&gt;
&lt;br /&gt;
  create table clinical2.ct2condclass (like clinical1.ct2condclass including defaults including constraints including indexes);&lt;br /&gt;
  REVOKE ALL ON TABLE clinical2.ct2condclass FROM PUBLIC;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2condclass TO root;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2condclass TO zincread;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2condclass TO zincfree;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2condclass TO test;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2condclass TO adminprivate;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2condclass TO admin;&lt;br /&gt;
&lt;br /&gt;
  create table clinical2.ct2condition (like clinical1.ct2condition including defaults including constraints including indexes);&lt;br /&gt;
  alter table clinical2.ct2condition add constraint ct2condition_condclass_fk_fkey foreign key (condclass_fk) references clinical2.ct2condclass(ct2condclass_id);&lt;br /&gt;
  REVOKE ALL ON TABLE clinical2.ct2condition FROM PUBLIC;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2condition TO root;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2condition TO zincread;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2condition TO zincfree;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2condition TO test;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2condition TO adminprivate;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2condition TO admin;&lt;br /&gt;
&lt;br /&gt;
  create table clinical2.ct2tocond (like clinical1.ct2tocond including defaults including constraints including indexes);&lt;br /&gt;
  alter table clinical2.ct2tocond add constraint ct2tocond_ct2condition_fk_fkey foreign key (ct2condition_fk) references clinical2.ct2condition(ct2condition_id);&lt;br /&gt;
  alter table clinical2.ct2tocond add constraint ct2tocond_ct2_fk_fkey foreign key (ct2_fk) references clinical2.ct2(ct2_id);&lt;br /&gt;
  REVOKE ALL ON TABLE clinical2.ct2tocond FROM PUBLIC;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2tocond TO root;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2tocond TO zincread;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2tocond TO zincfree;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2tocond TO test;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2tocond TO adminprivate;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2tocond TO admin;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
  create table clinical2.ct2int (like clinical1.ct2int including defaults including constraints including indexes);&lt;br /&gt;
  alter table clinical2.ct2int drop column ct2_fk;&lt;br /&gt;
  REVOKE ALL ON TABLE clinical2.ct2int FROM PUBLIC;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2int TO root;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2int TO zincread;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2int TO zincfree;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2int TO test;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2int TO adminprivate;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2int TO admin;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
  create table clinical2.ct2subint (like clinical1.ct2subint including defaults including constraints including indexes);&lt;br /&gt;
  alter table clinical2.ct2subint add constraint ct2subint_ct2int_fk_fkey foreign key (ct2int_fk) references clinical2.ct2int(ct2int_id);&lt;br /&gt;
  alter table clinical2.ct2subint add constraint ct2subint_sub_id_fk_fkey foreign key (sub_id_fk) references substance(sub_id) ON UPDATE CASCADE ON DELETE CASCADE DEFERRABLE INITIALLY DEFERRED;&lt;br /&gt;
  REVOKE ALL ON TABLE clinical2.ct2subint FROM PUBLIC;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2subint TO root;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2subint TO zincread;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2subint TO zincfree;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2subint TO test;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2subint TO adminprivate;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2subint TO admin;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
  create table clinical2.ct2toint (ct2toint_id integer not null primary key,&lt;br /&gt;
                                 ct2_fk integer references clinical2.ct2,&lt;br /&gt;
                                 ct2int_fk integer references clinical2.ct2int);&lt;br /&gt;
&lt;br /&gt;
  create sequence clinical2.ct2toint_ct2toint_seq;&lt;br /&gt;
  alter table clinical2.ct2toint alter column ct2toint_id set default nextval(&#039;clinical2.ct2toint_ct2toint_seq&#039;::regclass);&lt;br /&gt;
  REVOKE ALL ON TABLE clinical2.ct2toint_ct2toint_seq FROM PUBLIC;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2toint_ct2toint_seq TO root;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2toint_ct2toint_seq TO zincread;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2toint_ct2toint_seq TO zincfree;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2toint_ct2toint_seq TO test;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2toint_ct2toint_seq TO adminprivate;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2toint_ct2toint_seq TO admin;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
  REVOKE ALL ON TABLE clinical2.ct2toint FROM PUBLIC;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2toint TO root;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2toint TO zincread;&lt;br /&gt;
  GRANT SELECT ON TABLE clinical2.ct2toint TO zincfree;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2toint TO test;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2toint TO adminprivate;&lt;br /&gt;
  GRANT ALL ON TABLE clinical2.ct2toint TO admin;&lt;br /&gt;
&lt;br /&gt;
 3. Load data from existing tables&lt;br /&gt;
&lt;br /&gt;
  insert into clinical2.ctstatus select * from clinical1.ctstatus;&lt;br /&gt;
  insert into clinical2.ctphase select * from clinical1.ctphase;&lt;br /&gt;
  insert into clinical2.ct2condclass select * from clinical1.ct2condclass;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 4. Matching names between subname.name and intervention.name&lt;br /&gt;
&lt;br /&gt;
  create temp table subname&lt;br /&gt;
    as select sub_id_fk as sub_id_fk, who_name as name&lt;br /&gt;
       from catalog_item join catalog on (cat_id_fk=cat_id and short_name=&#039;chembl20&#039;)&lt;br /&gt;
                         join chembl20.molecule_dictionary as md on supplier_code=md.chembl_id&lt;br /&gt;
                         join chembl20.molecule_atc_classification as mac on md.molregno=mac.molregno&lt;br /&gt;
                         join chembl20.atc_classification as ac on mac.level5=ac.level5;&lt;br /&gt;
&lt;br /&gt;
  insert into subname&lt;br /&gt;
    select cs.sub_id_fk, sy.synonym&lt;br /&gt;
    from catalog_substance as cs join synonym as sy on cs.cat_content_fk = sy.cat_content_fk&lt;br /&gt;
    where not exists (select 1 from subname as sn where sn.sub_id_fk = cs.sub_id_fk and sn.name = sy.synonym);&lt;br /&gt;
&lt;br /&gt;
  alter table subname add column q tsquery;&lt;br /&gt;
&lt;br /&gt;
  update subname as s&lt;br /&gt;
   set q=plainto_tsquery(t.name)&lt;br /&gt;
   from subname as t&lt;br /&gt;
   where s.sub_id_fk=t.sub_id_fk and s.name=t.name;&lt;br /&gt;
&lt;br /&gt;
  alter table clinical2.ct2int add column terms tsvector;&lt;br /&gt;
  update clinical2.ct2int set terms=to_tsvector(&#039;english&#039;, name);&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
  insert into clinical2.ct2subint (sub_id_fk, ct2int_fk)&lt;br /&gt;
    select distinct sub.sub_id_fk, int.ct2int_id&lt;br /&gt;
    from clinical2.ct2int as int join subname as sub on&lt;br /&gt;
                         int.terms @@ sub.q;&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Clinical_Trials_Loading&amp;diff=10273</id>
		<title>Clinical Trials Loading</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Clinical_Trials_Loading&amp;diff=10273"/>
		<updated>2017-08-24T19:58:29Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt; 1. Relevant Files:&lt;br /&gt;
    /nfs/db/trials/important --&amp;gt; clinical trials raw data &lt;br /&gt;
    /nfs/home/teague/work/Projects/trials/extract.py --&amp;gt; script to clean the raw data&lt;br /&gt;
    http://wiki.docking.org/index.php/Creating_clinical_name_mappings --&amp;gt; how to create name_mappings&lt;br /&gt;
&lt;br /&gt;
 2. SQL queries to create a new clinical trial schema and table are located in zinc code:&lt;br /&gt;
&lt;br /&gt;
    zinc/SQL_statement/clinical_trial.sql&lt;br /&gt;
&lt;br /&gt;
 3. Load raw data to the database tables:&lt;br /&gt;
    If you want to delete all the existing data first, use --wipe;&lt;br /&gt;
    otherwise data is loaded incrementally on top of the existing data.&lt;br /&gt;
&lt;br /&gt;
    source /nfs/soft/www/apps/zinc15/envs/dev/bin/activate&lt;br /&gt;
    python /nfs/soft/www/apps/zinc15/envs/dev/bin/zinc-manage utils clinical-trials load_ct /nfs/db/trials/important/studies.txt &lt;br /&gt;
    python /nfs/soft/www/apps/zinc15/envs/dev/bin/zinc-manage utils clinical-trials load_condition /nfs/db/trials/important/browse_conditions.txt &lt;br /&gt;
    python /nfs/soft/www/apps/zinc15/envs/dev/bin/zinc-manage utils clinical-trials load_intervention /nfs/db/trials/important/interventions.txt&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Clinical_Trials_Loading&amp;diff=10272</id>
		<title>Clinical Trials Loading</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Clinical_Trials_Loading&amp;diff=10272"/>
		<updated>2017-08-24T19:03:23Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt; 1. Relevant Files:&lt;br /&gt;
    /nfs/db/trials/important --&amp;gt; clinical trials raw data &lt;br /&gt;
    /nfs/home/teague/work/Projects/trials/extract.py --&amp;gt; script to clean the raw data&lt;br /&gt;
    http://wiki.docking.org/index.php/Creating_clinical_name_mappings --&amp;gt; how to create name_mappings&lt;br /&gt;
&lt;br /&gt;
 2. SQL queries to create a new clinical trial schema and table are located in zinc code:&lt;br /&gt;
&lt;br /&gt;
    SQL_statement/clinical_trial&lt;br /&gt;
&lt;br /&gt;
 3. To load the data from raw data:&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Clinical_Trials_Loading&amp;diff=10266</id>
		<title>Clinical Trials Loading</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Clinical_Trials_Loading&amp;diff=10266"/>
		<updated>2017-08-17T19:04:07Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: Created page with &amp;quot; 1. Relevant Files:     /nfs/db/trials/important --&amp;gt; clinical trials raw data      /nfs/home/teague/work/Projects/trials/extract.py --&amp;gt; script to clean the raw data     http:/...&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt; 1. Relevant Files:&lt;br /&gt;
    /nfs/db/trials/important --&amp;gt; clinical trials raw data &lt;br /&gt;
    /nfs/home/teague/work/Projects/trials/extract.py --&amp;gt; script to clean the raw data&lt;br /&gt;
    http://wiki.docking.org/index.php/Creating_clinical_name_mappings --&amp;gt; how to create name_mappings&lt;br /&gt;
&lt;br /&gt;
 2.&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Sea16_restart&amp;diff=10243</id>
		<title>Sea16 restart</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Sea16_restart&amp;diff=10243"/>
		<updated>2017-08-09T21:07:14Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
# restart conda_sea16 server&lt;br /&gt;
&lt;br /&gt;
  screen -r sea16        # on gimel&lt;br /&gt;
  cd /nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/src/seaware-academic/&lt;br /&gt;
  export PATH=/nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/bin:$PATH&lt;br /&gt;
  export PATH=/nfs/home/momeara/opt/bin:$PATH&lt;br /&gt;
  source activate sea16 &lt;br /&gt;
  sh SEAserver/scripts/stop-sea-server.sh&lt;br /&gt;
  sh SEAserver/scripts/run-sea-server.sh&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
# update conda_sea16&lt;br /&gt;
&lt;br /&gt;
  ssh s_enkhee@gimel&lt;br /&gt;
  su - www&lt;br /&gt;
  cd /nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/src/seaware-academic/&lt;br /&gt;
  export PATH=/nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/bin:$PATH&lt;br /&gt;
  export PATH=/nfs/home/momeara/opt/bin:$PATH&lt;br /&gt;
  source activate sea16&lt;br /&gt;
  git pull&lt;br /&gt;
  git submodule update --init --recursive&lt;br /&gt;
  # kill all sea-server related processes (e.g. from htop)&lt;br /&gt;
  make clean&lt;br /&gt;
  # delete sea-related libraries (fitcore, fpcore, libcore, seacore, seashell, seaserver) from the site-packages folder under the conda env, e.g.:&lt;br /&gt;
  rm -rf /nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/lib/python2.7/site-packages/seaserver &lt;br /&gt;
  make all&lt;br /&gt;
  make SEAserver-start-production  &lt;br /&gt;
&lt;br /&gt;
# run the test:&lt;br /&gt;
     -&amp;gt; export SEA_APP_ROOT=$CONDA_PREFIX/var/seaserver&lt;br /&gt;
     -&amp;gt; export SEA_RUN_FOLDER=$SEA_APP_ROOT/run&lt;br /&gt;
     -&amp;gt; export SEA_DATA_FOLDER=$SEA_APP_ROOT/data&lt;br /&gt;
     -&amp;gt;  python -m unittest test.test_illustrate&lt;br /&gt;
&lt;br /&gt;
 &lt;br /&gt;
# Redis config&lt;br /&gt;
&lt;br /&gt;
    As for the Redis warnings, these are not new, but probably something worth taking care of (https://redis.io/topics/admin):&lt;br /&gt;
&lt;br /&gt;
    add &#039;vm.overcommit_memory = 1&#039; to /etc/sysctl.conf&lt;br /&gt;
 &lt;br /&gt;
    From here: The Linux kernel will always overcommit memory, and never check if enough memory is available. This increases the risk of out-of-memory situations, but also improves memory-intensive workloads.&lt;br /&gt;
&lt;br /&gt;
    run echo never &amp;gt; /sys/kernel/mm/transparent_hugepage/enabled&lt;br /&gt;
    add &#039;echo never &amp;gt; /sys/kernel/mm/transparent_hugepage/enabled&#039; to /etc/rc.local&lt;br /&gt;
  &lt;br /&gt;
    From here: Latency induced by transparent huge pages&lt;br /&gt;
&lt;br /&gt;
    Unfortunately, when a Linux kernel has transparent huge pages enabled, Redis incurs a big latency penalty after the fork call is used in order to persist on disk. Huge pages are the cause of the following issue:&lt;br /&gt;
&lt;br /&gt;
    Fork is called, two processes with shared huge pages are created.&lt;br /&gt;
    In a busy instance, a few event loop runs will cause commands to target a few thousand pages, causing copy-on-write of almost the whole process memory.&lt;br /&gt;
    This will result in big latency and big memory usage.&lt;br /&gt;
&lt;br /&gt;
    Make sure to disable transparent huge pages using the following command:&lt;br /&gt;
&lt;br /&gt;
    echo never &amp;gt; /sys/kernel/mm/transparent_hugepage/enabled&lt;br /&gt;
&lt;br /&gt;
  &lt;br /&gt;
  &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
# update sea16&lt;br /&gt;
ssh xyz@gimel&lt;br /&gt;
cd /nfs/soft/www/apps/sea/sea16/src/seaware-academic&lt;br /&gt;
source ../../env.csh&lt;br /&gt;
git pull&lt;br /&gt;
rm -rf /nfs/soft/www/apps/sea/sea16/lib/python2.7/site-packages/seaserver&lt;br /&gt;
cd /nfs/soft/www/apps/sea/sea16/src/seaware-academic/SEAserver&lt;br /&gt;
python setup.py install&lt;br /&gt;
&lt;br /&gt;
# restart server&lt;br /&gt;
&lt;br /&gt;
ssh www@n-1-110&lt;br /&gt;
cd /nfs/soft/www/apps/sea/sea16/src/seaware-academic&lt;br /&gt;
source ../../env.csh&lt;br /&gt;
cd SEAserver&lt;br /&gt;
sh scripts/run-sea-server.sh&lt;br /&gt;
&lt;br /&gt;
# restart server&lt;br /&gt;
&lt;br /&gt;
ssh &amp;lt;superuser&amp;gt;@n-1-110&lt;br /&gt;
sudo -i&lt;br /&gt;
screen -r&lt;br /&gt;
screen -dR Sea (switch to sea screen)&lt;br /&gt;
sh scripts/run-sea-server.sh&lt;br /&gt;
&lt;br /&gt;
# how to save the old queue data&lt;br /&gt;
 &lt;br /&gt;
cd /nfs/soft/www/apps/sea/sea16/var/seaserver/queue &lt;br /&gt;
mv jobs jobs.save&lt;br /&gt;
mv tasks.sqlite tasks.sqlite.save &lt;br /&gt;
# then restart the sea server on n-1-110&lt;br /&gt;
&lt;br /&gt;
(basically, it had too much history and that was what was slowing it down)&lt;br /&gt;
(do on the first day of the month and rename the old one to a month version)&lt;br /&gt;
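A hypothetical sketch of that monthly rotation as a small Python helper, demonstrated against a temporary directory (on gimel the real directory is /nfs/soft/www/apps/sea/sea16/var/seaserver/queue):

```python
# Hypothetical monthly queue rotation: instead of ".save", rename the old
# queue files with a YYYY-MM suffix so each month keeps its own snapshot.
import os
import datetime

def rotate_queue(queue_dir):
    suffix = datetime.date.today().strftime("%Y-%m")
    for name in ("jobs", "tasks.sqlite"):
        src = os.path.join(queue_dir, name)
        if os.path.exists(src):
            # works for both the jobs directory and the sqlite file
            os.rename(src, "%s.%s" % (src, suffix))
    return suffix
```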
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Category:Curator]]&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Excipient_server_restart&amp;diff=10219</id>
		<title>Excipient server restart</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Excipient_server_restart&amp;diff=10219"/>
		<updated>2017-07-24T18:47:55Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Excipient Server Restart:]]&lt;br /&gt;
&lt;br /&gt;
  1. ssh gimel&lt;br /&gt;
  2. become www&lt;br /&gt;
  3. activate the production server -&amp;gt;&lt;br /&gt;
     cd /nfs/soft/www/apps/excipients/envs/production/&lt;br /&gt;
     source bin/activate&lt;br /&gt;
  4. gunicorn --access-logfile excipients.ucsf.bkslab.org.acc --max-requests 1000 --bind 10.20.0.31:8093 excipients:app --workers 5 --timeout 1000&lt;br /&gt;
&lt;br /&gt;
[[Excipient Installment:]]&lt;br /&gt;
&lt;br /&gt;
  1. Go to the dev version of Excipients and set the version number in the __init__.py file&lt;br /&gt;
  2. Create the distribution file&lt;br /&gt;
     python setup.py sdist&lt;br /&gt;
  3. ssh gimel&lt;br /&gt;
  4. su - www&lt;br /&gt;
  5. activate the production server -&amp;gt;&lt;br /&gt;
     cd /nfs/soft/www/apps/excipients/envs/production/&lt;br /&gt;
     source bin/activate&lt;br /&gt;
  6. Copy the created dist folder to /nfs/soft/www/apps/excipients/envs/production/ &lt;br /&gt;
  7. Run -&amp;gt; &lt;br /&gt;
     pip install dist/CERSI-Excipients-X.Y.Z.tar.gz&lt;br /&gt;
  8. Restart the server -&amp;gt;&lt;br /&gt;
     gunicorn --access-logfile excipients.ucsf.bkslab.org.acc --max-requests 1000 --bind 10.20.0.31:8093 excipients:app --workers 5 --timeout 1000&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
[[HowTos for the new version of Excipients]]&lt;br /&gt;
&lt;br /&gt;
 1. Create a copy of existing production database&lt;br /&gt;
&lt;br /&gt;
 2. In the flask code, change the database url to the newly created db &lt;br /&gt;
&lt;br /&gt;
 3. Run db upgrade + db migrate to create new tables and new columns for existing tables (done automatically by flask db-migrate, which maps the model objects to the postgres db): &lt;br /&gt;
    python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py manage-db upgrade&lt;br /&gt;
    python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py manage-db migrate&lt;br /&gt;
&lt;br /&gt;
 4. Load the openfda regulatory_status data (populates &#039;Status&#039; table in db, last updated April/2017): &lt;br /&gt;
    -&amp;gt;  create a file named regulatory_status.csv&lt;br /&gt;
        touch regulatory_status.csv&lt;br /&gt;
    -&amp;gt; copy over the regulatory_status definitions from &lt;br /&gt;
    python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py parse_status_definition /home/enkhjargal/PycharmProjects/Excipients/data/regulatory_status.csv&lt;br /&gt;
 &lt;br /&gt;
 5. Load the openfda excipients function data (populates &#039;Function&#039; table in db, last updated April/2017):&lt;br /&gt;
    python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py parse_function_definition /home/enkhjargal/PycharmProjects/Excipients/data/function_definition.csv&lt;br /&gt;
&lt;br /&gt;
 6. Load the dye relation data. (creates the relation for excipients to functions which are dye):&lt;br /&gt;
    -&amp;gt; https://www.fda.gov/ForIndustry/ColorAdditives/ColorAdditiveInventories/ucm106626.htm)&lt;br /&gt;
    -&amp;gt; python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py load_color_additives_relationship /home/enkhjargal/PycharmProjects/Excipients/data/FDA_color_additives_all&lt;br /&gt;
&lt;br /&gt;
 7. Load the fda general additives data (creates all other function and status relations to excipients):  &lt;br /&gt;
    -&amp;gt; create a file named FDA_additives&lt;br /&gt;
       touch FDA_additives&lt;br /&gt;
    -&amp;gt; copy over the additive list from this page (https://www.fda.gov/Food/IngredientsPackagingLabeling/FoodAdditivesIngredients/ucm091048.htm)&lt;br /&gt;
    -&amp;gt; parse the file to load it to db table&lt;br /&gt;
       python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py load_function_and_status_relationship /home/enkhjargal/PycharmProjects/Excipients/data/FDA_additives&lt;br /&gt;
&lt;br /&gt;
 8. Load the openfda drug label. (populate brand and substance tables and their relations to existing excipients):&lt;br /&gt;
      -&amp;gt; https://open.fda.gov/downloads/ (6 json files to download)&lt;br /&gt;
      -&amp;gt; parse each of them:&lt;br /&gt;
         python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py parse_fda_label_data /home/enkhjargal/PycharmProjects/Excipients/data/drug-label-0005-of-0006.json&lt;br /&gt;
 &lt;br /&gt;
 9. Pull Zincids for populated Substance data. (populate the zincid and smiles column in substance table)&lt;br /&gt;
      -&amp;gt; python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py parse_fda_label_data&lt;br /&gt;
&lt;br /&gt;
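Step 8 above parses each of the six downloaded drug-label files in turn; a loop avoids typing the command six times. This sketch assumes the files follow the drug-label-000N-of-0006.json naming seen in the example, and it only echoes the commands (a dry run); MANAGE and DATA are shorthand variables, not part of the real tooling.

```shell
# Dry-run sketch: print the parse command for each of the six
# drug-label-000N-of-0006.json files. Remove the echo to actually run.
MANAGE=/home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py
DATA=/home/enkhjargal/PycharmProjects/Excipients/data

label_cmds() {
    for i in 1 2 3 4 5 6; do
        n=$(printf '%04d' "$i")
        echo python "$MANAGE" parse_fda_label_data "$DATA/drug-label-$n-of-0006.json"
    done
}

label_cmds
```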
 The DB should now be fully populated.&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Sea16_restart&amp;diff=10198</id>
		<title>Sea16 restart</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Sea16_restart&amp;diff=10198"/>
		<updated>2017-07-06T17:45:58Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
# restart conda_sea16 server&lt;br /&gt;
&lt;br /&gt;
  screen -r sea16 on gimel&lt;br /&gt;
  cd /nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/src/seaware-academic/&lt;br /&gt;
  export PATH=/nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/bin:$PATH&lt;br /&gt;
  export PATH=/nfs/home/momeara/opt/bin:$PATH&lt;br /&gt;
  source activate sea16 &lt;br /&gt;
  sh SEAserver/scripts/stop-sea-server.sh&lt;br /&gt;
  sh SEAserver/scripts/run-sea-server.sh&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
# update conda_sea16&lt;br /&gt;
&lt;br /&gt;
  ssh s_enkhee@gimel&lt;br /&gt;
  cd /nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/src/seaware-academic/&lt;br /&gt;
  export PATH=/nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/bin:$PATH&lt;br /&gt;
  export PATH=/nfs/home/momeara/opt/bin:$PATH&lt;br /&gt;
  source activate sea16&lt;br /&gt;
  git pull&lt;br /&gt;
  git submodule update --init --recursive&lt;br /&gt;
  make clean&lt;br /&gt;
  delete the SEA-related libraries (fitcore, fpcore, libcore, seacore, seashell, seaserver) from the site-packages folder under the conda env, e.g.&lt;br /&gt;
  rm -rf /nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/lib/python2.7/site-packages/seaserver &lt;br /&gt;
  make all&lt;br /&gt;
  kill all sea-server related processes e.g. from htop&lt;br /&gt;
  make SEAserver-start-production&lt;br /&gt;
&lt;br /&gt;
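The delete step above names six SEA libraries to purge from site-packages before `make all` reinstalls them. A loop keeps the list in one place; this sketch only echoes the rm commands (dry run), and SITE_PACKAGES is shorthand for the conda env path shown above.

```shell
# Dry-run sketch: print an rm -rf for each stale SEA library in the
# conda env's site-packages. Remove the echo to actually delete them.
SITE_PACKAGES=/nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/lib/python2.7/site-packages

purge_sea_libs() {
    for lib in fitcore fpcore libcore seacore seashell seaserver; do
        echo rm -rf "$SITE_PACKAGES/$lib"
    done
}

purge_sea_libs
```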
# run the test:&lt;br /&gt;
     -&amp;gt; export SEA_APP_ROOT=$CONDA_PREFIX/var/seaserver&lt;br /&gt;
     -&amp;gt; export SEA_RUN_FOLDER=$SEA_APP_ROOT/run&lt;br /&gt;
     -&amp;gt; export SEA_DATA_FOLDER=$SEA_APP_ROOT/data&lt;br /&gt;
     -&amp;gt;  python -m unittest test.test_illustrate&lt;br /&gt;
&lt;br /&gt;
 &lt;br /&gt;
# Redis config&lt;br /&gt;
&lt;br /&gt;
    As for the Redis warnings, these are not new, but probably something worth taking care of (https://redis.io/topics/admin):&lt;br /&gt;
&lt;br /&gt;
    add &#039;vm.overcommit_memory = 1&#039; to /etc/sysctl.conf&lt;br /&gt;
 &lt;br /&gt;
    From here: The Linux kernel will always overcommit memory, and never check if enough memory is available. This increases the risk of out-of-memory situations, but also improves memory-intensive workloads.&lt;br /&gt;
&lt;br /&gt;
    run echo never &amp;gt; /sys/kernel/mm/transparent_hugepage/enabled&lt;br /&gt;
    add &#039;echo never &amp;gt; /sys/kernel/mm/transparent_hugepage/enabled&#039; to /etc/rc.local&lt;br /&gt;
  &lt;br /&gt;
    From here: Latency induced by transparent huge pages&lt;br /&gt;
&lt;br /&gt;
    Unfortunately, when a Linux kernel has transparent huge pages enabled, Redis incurs a big latency penalty after the fork call is used in order to persist on disk. Huge pages are the cause of the following issue:&lt;br /&gt;
&lt;br /&gt;
    Fork is called, two processes with shared huge pages are created.&lt;br /&gt;
    In a busy instance, a few event loop runs will cause commands to target a few thousand pages, causing the copy-on-write of almost the whole process memory.&lt;br /&gt;
    This will result in big latency and big memory usage.&lt;br /&gt;
&lt;br /&gt;
    Make sure to disable transparent huge pages using the following command:&lt;br /&gt;
&lt;br /&gt;
    echo never &amp;gt; /sys/kernel/mm/transparent_hugepage/enabled&lt;br /&gt;
&lt;br /&gt;
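The two kernel settings above can be verified without changing anything. A read-only sh sketch, assuming a Linux host with the standard sysfs/procfs paths; the check_redis_kernel function name is illustrative:

```shell
# Read-only check of the Redis-relevant kernel settings: transparent
# huge pages should be 'never' and vm.overcommit_memory should be 1.
# Prints ok/warn lines; never modifies anything.
check_redis_kernel() {
    thp=/sys/kernel/mm/transparent_hugepage/enabled
    oc=/proc/sys/vm/overcommit_memory
    if grep -q '\[never\]' "$thp" 2>/dev/null; then
        echo "THP: ok (never)"
    else
        echo "THP: warn (should be 'never' for Redis)"
    fi
    if [ "$(cat "$oc" 2>/dev/null)" = "1" ]; then
        echo "overcommit: ok (vm.overcommit_memory = 1)"
    else
        echo "overcommit: warn (add vm.overcommit_memory = 1 to /etc/sysctl.conf)"
    fi
}

check_redis_kernel
```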
  &lt;br /&gt;
  &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
# update sea16&lt;br /&gt;
ssh xyz@gimel&lt;br /&gt;
cd /nfs/soft/www/apps/sea/sea16/src/seaware-academic&lt;br /&gt;
source ../../env.csh&lt;br /&gt;
git pull&lt;br /&gt;
rm -rf /nfs/soft/www/apps/sea/sea16/lib/python2.7/site-packages/seaserver&lt;br /&gt;
cd /nfs/soft/www/apps/sea/sea16/src/seaware-academic/SEAserver&lt;br /&gt;
python setup.py install&lt;br /&gt;
&lt;br /&gt;
# restart server&lt;br /&gt;
&lt;br /&gt;
ssh www@n-1-110&lt;br /&gt;
cd /nfs/soft/www/apps/sea/sea16/src/seaware-academic&lt;br /&gt;
source ../../env.csh&lt;br /&gt;
cd SEAserver&lt;br /&gt;
sh scripts/run-sea-server.sh&lt;br /&gt;
&lt;br /&gt;
# restart server&lt;br /&gt;
&lt;br /&gt;
ssh &amp;lt;superuser&amp;gt;@n-1-110&lt;br /&gt;
sudo -i&lt;br /&gt;
screen -r&lt;br /&gt;
screen -dR Sea (switch to sea screen)&lt;br /&gt;
sh scripts/run-sea-server.sh&lt;br /&gt;
&lt;br /&gt;
# how to save the old queue data&lt;br /&gt;
 &lt;br /&gt;
cd /nfs/soft/www/apps/sea/sea16/var/seaserver/queue &lt;br /&gt;
mv jobs jobs.save&lt;br /&gt;
mv tasks.sqlite tasks.sqlite.save &lt;br /&gt;
restart sea server on n-1-110.  &lt;br /&gt;
&lt;br /&gt;
(basically, it had too much history and that was what was slowing it down)&lt;br /&gt;
(do on the first day of the month and rename the old one to a month version)&lt;br /&gt;
&lt;br /&gt;
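The monthly rotation suggested above (rename the old queue to a month version on the first of the month) can be sketched as follows. A dry-run sh sketch: it echoes the mv commands instead of running them, and the rotate_queue name and YYYY-MM suffix are assumptions for illustration.

```shell
# Dry-run sketch of the monthly queue rotation: rename the current jobs
# folder and tasks.sqlite with a YYYY-MM suffix so each month's history
# is kept. Remove the echo to actually rotate, then restart the server.
QUEUE=/nfs/soft/www/apps/sea/sea16/var/seaserver/queue

rotate_queue() {
    month=$(date +%Y-%m)
    echo mv "$QUEUE/jobs" "$QUEUE/jobs.$month"
    echo mv "$QUEUE/tasks.sqlite" "$QUEUE/tasks.sqlite.$month"
}

rotate_queue
```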
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Category:Curator]]&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Sea16_restart&amp;diff=10120</id>
		<title>Sea16 restart</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Sea16_restart&amp;diff=10120"/>
		<updated>2017-05-30T18:51:28Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
# restart conda_sea16 server&lt;br /&gt;
&lt;br /&gt;
  screen -r sea16 on gimel&lt;br /&gt;
  cd /nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/src/seaware-academic/&lt;br /&gt;
  export PATH=/nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/bin:$PATH&lt;br /&gt;
  export PATH=/nfs/home/momeara/opt/bin:$PATH&lt;br /&gt;
  source activate sea16 &lt;br /&gt;
  cd SEAserver&lt;br /&gt;
  sh scripts/stop-sea-server.sh&lt;br /&gt;
  sh scripts/run-sea-server.sh&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
# update conda_sea16&lt;br /&gt;
&lt;br /&gt;
  ssh s_enkhee@gimel&lt;br /&gt;
  cd /nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/src/seaware-academic/&lt;br /&gt;
  export PATH=/nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/bin:$PATH&lt;br /&gt;
  export PATH=/nfs/home/momeara/opt/bin:$PATH&lt;br /&gt;
  source activate sea16&lt;br /&gt;
  git pull&lt;br /&gt;
  git submodule update --init --recursive&lt;br /&gt;
  make clean&lt;br /&gt;
  delete sea related libraries (fitcore, fpcore, libcore, seacore, seashell, seaserver) from your site-packages folder under the conda env (rm -rf /nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/lib/python2.7/site-packages/seaserver)&lt;br /&gt;
  make all&lt;br /&gt;
  kill all sea-server related processes e.g. from htop&lt;br /&gt;
  make SEAserver-start&lt;br /&gt;
 &lt;br /&gt;
&lt;br /&gt;
# run the test:&lt;br /&gt;
     -&amp;gt; export SEA_APP_ROOT=$CONDA_PREFIX/var/seaserver&lt;br /&gt;
     -&amp;gt; export SEA_RUN_FOLDER=$SEA_APP_ROOT/run&lt;br /&gt;
     -&amp;gt; export SEA_DATA_FOLDER=$SEA_APP_ROOT/data&lt;br /&gt;
     -&amp;gt;  python -m unittest test.test_illustrate&lt;br /&gt;
&lt;br /&gt;
 &lt;br /&gt;
# Redis config&lt;br /&gt;
&lt;br /&gt;
    As for the Redis warnings, these are not new, but probably something worth taking care of (https://redis.io/topics/admin):&lt;br /&gt;
&lt;br /&gt;
    add &#039;vm.overcommit_memory = 1&#039; to /etc/sysctl.conf&lt;br /&gt;
 &lt;br /&gt;
    From here: The Linux kernel will always overcommit memory, and never check if enough memory is available. This increases the risk of out-of-memory situations, but also improves memory-intensive workloads.&lt;br /&gt;
&lt;br /&gt;
    run echo never &amp;gt; /sys/kernel/mm/transparent_hugepage/enabled&lt;br /&gt;
    add &#039;echo never &amp;gt; /sys/kernel/mm/transparent_hugepage/enabled&#039; to /etc/rc.local&lt;br /&gt;
  &lt;br /&gt;
    From here: Latency induced by transparent huge pages&lt;br /&gt;
&lt;br /&gt;
    Unfortunately, when a Linux kernel has transparent huge pages enabled, Redis incurs a big latency penalty after the fork call is used in order to persist on disk. Huge pages are the cause of the following issue:&lt;br /&gt;
&lt;br /&gt;
    Fork is called, two processes with shared huge pages are created.&lt;br /&gt;
    In a busy instance, a few event loop runs will cause commands to target a few thousand pages, causing the copy-on-write of almost the whole process memory.&lt;br /&gt;
    This will result in big latency and big memory usage.&lt;br /&gt;
&lt;br /&gt;
    Make sure to disable transparent huge pages using the following command:&lt;br /&gt;
&lt;br /&gt;
    echo never &amp;gt; /sys/kernel/mm/transparent_hugepage/enabled&lt;br /&gt;
&lt;br /&gt;
  &lt;br /&gt;
  &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
# update sea16&lt;br /&gt;
ssh xyz@gimel&lt;br /&gt;
cd /nfs/soft/www/apps/sea/sea16/src/seaware-academic&lt;br /&gt;
source ../../env.csh&lt;br /&gt;
git pull&lt;br /&gt;
rm -rf /nfs/soft/www/apps/sea/sea16/lib/python2.7/site-packages/seaserver&lt;br /&gt;
cd /nfs/soft/www/apps/sea/sea16/src/seaware-academic/SEAserver&lt;br /&gt;
python setup.py install&lt;br /&gt;
&lt;br /&gt;
# restart server&lt;br /&gt;
&lt;br /&gt;
ssh www@n-1-110&lt;br /&gt;
cd /nfs/soft/www/apps/sea/sea16/src/seaware-academic&lt;br /&gt;
source ../../env.csh&lt;br /&gt;
cd SEAserver&lt;br /&gt;
sh scripts/run-sea-server.sh&lt;br /&gt;
&lt;br /&gt;
# restart server&lt;br /&gt;
&lt;br /&gt;
ssh &amp;lt;superuser&amp;gt;@n-1-110&lt;br /&gt;
sudo -i&lt;br /&gt;
screen -r&lt;br /&gt;
screen -dR Sea (switch to sea screen)&lt;br /&gt;
sh scripts/run-sea-server.sh&lt;br /&gt;
&lt;br /&gt;
# how to save the old queue data&lt;br /&gt;
 &lt;br /&gt;
cd /nfs/soft/www/apps/sea/sea16/var/seaserver/queue &lt;br /&gt;
mv jobs jobs.save&lt;br /&gt;
mv tasks.sqlite tasks.sqlite.save &lt;br /&gt;
restart sea server on n-1-110.  &lt;br /&gt;
&lt;br /&gt;
(basically, it had too much history and that was what was slowing it down)&lt;br /&gt;
(do on the first day of the month and rename the old one to a month version)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Category:Curator]]&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Sea16_restart&amp;diff=10119</id>
		<title>Sea16 restart</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Sea16_restart&amp;diff=10119"/>
		<updated>2017-05-30T18:51:08Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
# restart conda_sea16 server&lt;br /&gt;
&lt;br /&gt;
  screen -r sea16 on gimel&lt;br /&gt;
  cd /nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/src/seaware-academic/&lt;br /&gt;
  export PATH=/nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/bin:$PATH&lt;br /&gt;
  export PATH=/nfs/home/momeara/opt/bin:$PATH&lt;br /&gt;
  source activate sea16 &lt;br /&gt;
  cd SEAserver&lt;br /&gt;
  sh scripts/stop-sea-server.sh&lt;br /&gt;
  sh scripts/run-sea-server.sh&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
# update conda_sea16&lt;br /&gt;
&lt;br /&gt;
  ssh s_enkhee@gimel&lt;br /&gt;
  cd /nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/src/seaware-academic/&lt;br /&gt;
  export PATH=/nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/bin:$PATH&lt;br /&gt;
  export PATH=/nfs/home/momeara/opt/bin:$PATH&lt;br /&gt;
  source activate sea16&lt;br /&gt;
  git pull&lt;br /&gt;
  git submodule update --init --recursive&lt;br /&gt;
  make clean&lt;br /&gt;
  delete sea related libraries (fitcore, fpcore, libcore, seacore, seashell, seaserver) from your site-packages folder under the conda env (rm -rf /nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/lib/python2.7/site-packages/seaserver)&lt;br /&gt;
  make all&lt;br /&gt;
  kill all sea-server related processes e.g. from htop&lt;br /&gt;
  make SEAserver-start&lt;br /&gt;
 &lt;br /&gt;
&lt;br /&gt;
# run the test:&lt;br /&gt;
     -&amp;gt; export SEA_APP_ROOT=$CONDA_PREFIX/var/seaserver&lt;br /&gt;
     -&amp;gt; export SEA_RUN_FOLDER=$SEA_APP_ROOT/run&lt;br /&gt;
     -&amp;gt; export SEA_DATA_FOLDER=$SEA_APP_ROOT/data&lt;br /&gt;
     -&amp;gt;  python -m unittest test.test_illustrate&lt;br /&gt;
&lt;br /&gt;
 &lt;br /&gt;
# &lt;br /&gt;
    As for the Redis warnings, these are not new, but probably something worth taking care of (https://redis.io/topics/admin):&lt;br /&gt;
&lt;br /&gt;
    add &#039;vm.overcommit_memory = 1&#039; to /etc/sysctl.conf&lt;br /&gt;
 &lt;br /&gt;
    From here: The Linux kernel will always overcommit memory, and never check if enough memory is available. This increases the risk of out-of-memory situations, but also improves memory-intensive workloads.&lt;br /&gt;
&lt;br /&gt;
    run echo never &amp;gt; /sys/kernel/mm/transparent_hugepage/enabled&lt;br /&gt;
    add &#039;echo never &amp;gt; /sys/kernel/mm/transparent_hugepage/enabled&#039; to /etc/rc.local&lt;br /&gt;
  &lt;br /&gt;
    From here: Latency induced by transparent huge pages&lt;br /&gt;
&lt;br /&gt;
    Unfortunately, when a Linux kernel has transparent huge pages enabled, Redis incurs a big latency penalty after the fork call is used in order to persist on disk. Huge pages are the cause of the following issue:&lt;br /&gt;
&lt;br /&gt;
    Fork is called, two processes with shared huge pages are created.&lt;br /&gt;
    In a busy instance, a few event loop runs will cause commands to target a few thousand pages, causing the copy-on-write of almost the whole process memory.&lt;br /&gt;
    This will result in big latency and big memory usage.&lt;br /&gt;
&lt;br /&gt;
    Make sure to disable transparent huge pages using the following command:&lt;br /&gt;
&lt;br /&gt;
    echo never &amp;gt; /sys/kernel/mm/transparent_hugepage/enabled&lt;br /&gt;
&lt;br /&gt;
  &lt;br /&gt;
  &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
# update sea16&lt;br /&gt;
ssh xyz@gimel&lt;br /&gt;
cd /nfs/soft/www/apps/sea/sea16/src/seaware-academic&lt;br /&gt;
source ../../env.csh&lt;br /&gt;
git pull&lt;br /&gt;
rm -rf /nfs/soft/www/apps/sea/sea16/lib/python2.7/site-packages/seaserver&lt;br /&gt;
cd /nfs/soft/www/apps/sea/sea16/src/seaware-academic/SEAserver&lt;br /&gt;
python setup.py install&lt;br /&gt;
&lt;br /&gt;
# restart server&lt;br /&gt;
&lt;br /&gt;
ssh www@n-1-110&lt;br /&gt;
cd /nfs/soft/www/apps/sea/sea16/src/seaware-academic&lt;br /&gt;
source ../../env.csh&lt;br /&gt;
cd SEAserver&lt;br /&gt;
sh scripts/run-sea-server.sh&lt;br /&gt;
&lt;br /&gt;
# restart server&lt;br /&gt;
&lt;br /&gt;
ssh &amp;lt;superuser&amp;gt;@n-1-110&lt;br /&gt;
sudo -i&lt;br /&gt;
screen -r&lt;br /&gt;
screen -dR Sea (switch to sea screen)&lt;br /&gt;
sh scripts/run-sea-server.sh&lt;br /&gt;
&lt;br /&gt;
# how to save the old queue data&lt;br /&gt;
 &lt;br /&gt;
cd /nfs/soft/www/apps/sea/sea16/var/seaserver/queue &lt;br /&gt;
mv jobs jobs.save&lt;br /&gt;
mv tasks.sqlite tasks.sqlite.save &lt;br /&gt;
restart sea server on n-1-110.  &lt;br /&gt;
&lt;br /&gt;
(basically, it had too much history and that was what was slowing it down)&lt;br /&gt;
(do on the first day of the month and rename the old one to a month version)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Category:Curator]]&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Sea16_restart&amp;diff=10118</id>
		<title>Sea16 restart</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Sea16_restart&amp;diff=10118"/>
		<updated>2017-05-30T18:43:56Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
# restart conda_sea16 server&lt;br /&gt;
&lt;br /&gt;
  screen -r sea16 on gimel&lt;br /&gt;
  cd /nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/src/seaware-academic/&lt;br /&gt;
  export PATH=/nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/bin:$PATH&lt;br /&gt;
  export PATH=/nfs/home/momeara/opt/bin:$PATH&lt;br /&gt;
  source activate sea16 &lt;br /&gt;
  cd SEAserver&lt;br /&gt;
  sh scripts/stop-sea-server.sh&lt;br /&gt;
  sh scripts/run-sea-server.sh&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
# update conda_sea16&lt;br /&gt;
&lt;br /&gt;
  ssh s_enkhee@gimel&lt;br /&gt;
  cd /nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/src/seaware-academic/&lt;br /&gt;
  export PATH=/nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/bin:$PATH&lt;br /&gt;
  export PATH=/nfs/home/momeara/opt/bin:$PATH&lt;br /&gt;
  source activate sea16&lt;br /&gt;
  git pull&lt;br /&gt;
  git submodule update --init --recursive&lt;br /&gt;
  make clean&lt;br /&gt;
  delete sea related libraries (fitcore, fpcore, libcore, seacore, seashell, seaserver) from your site-packages folder under the conda env (rm -rf /nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/lib/python2.7/site-packages/seaserver)&lt;br /&gt;
  make all&lt;br /&gt;
  kill all sea-server related processes e.g. from htop&lt;br /&gt;
  make SEAserver-start&lt;br /&gt;
 &lt;br /&gt;
&lt;br /&gt;
# run the test:&lt;br /&gt;
     -&amp;gt; export SEA_APP_ROOT=$CONDA_PREFIX/var/seaserver&lt;br /&gt;
     -&amp;gt; export SEA_RUN_FOLDER=$SEA_APP_ROOT/run&lt;br /&gt;
     -&amp;gt; export SEA_DATA_FOLDER=$SEA_APP_ROOT/data&lt;br /&gt;
     -&amp;gt;  python -m unittest test.test_illustrate&lt;br /&gt;
&lt;br /&gt;
 &lt;br /&gt;
  &lt;br /&gt;
&lt;br /&gt;
  &lt;br /&gt;
  &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
# update sea16&lt;br /&gt;
ssh xyz@gimel&lt;br /&gt;
cd /nfs/soft/www/apps/sea/sea16/src/seaware-academic&lt;br /&gt;
source ../../env.csh&lt;br /&gt;
git pull&lt;br /&gt;
rm -rf /nfs/soft/www/apps/sea/sea16/lib/python2.7/site-packages/seaserver&lt;br /&gt;
cd /nfs/soft/www/apps/sea/sea16/src/seaware-academic/SEAserver&lt;br /&gt;
python setup.py install&lt;br /&gt;
&lt;br /&gt;
# restart server&lt;br /&gt;
&lt;br /&gt;
ssh www@n-1-110&lt;br /&gt;
cd /nfs/soft/www/apps/sea/sea16/src/seaware-academic&lt;br /&gt;
source ../../env.csh&lt;br /&gt;
cd SEAserver&lt;br /&gt;
sh scripts/run-sea-server.sh&lt;br /&gt;
&lt;br /&gt;
# restart server&lt;br /&gt;
&lt;br /&gt;
ssh &amp;lt;superuser&amp;gt;@n-1-110&lt;br /&gt;
sudo -i&lt;br /&gt;
screen -r&lt;br /&gt;
screen -dR Sea (switch to sea screen)&lt;br /&gt;
sh scripts/run-sea-server.sh&lt;br /&gt;
&lt;br /&gt;
# how to save the old queue data&lt;br /&gt;
 &lt;br /&gt;
cd /nfs/soft/www/apps/sea/sea16/var/seaserver/queue &lt;br /&gt;
mv jobs jobs.save&lt;br /&gt;
mv tasks.sqlite tasks.sqlite.save &lt;br /&gt;
restart sea server on n-1-110.  &lt;br /&gt;
&lt;br /&gt;
(basically, it had too much history and that was what was slowing it down)&lt;br /&gt;
(do on the first day of the month and rename the old one to a month version)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Category:Curator]]&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Sea16_restart&amp;diff=10111</id>
		<title>Sea16 restart</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Sea16_restart&amp;diff=10111"/>
		<updated>2017-05-18T21:18:38Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
# restart conda_sea16 server&lt;br /&gt;
&lt;br /&gt;
  ssh s_enkhee@n-1-110&lt;br /&gt;
  cd /nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/src/seaware-academic/&lt;br /&gt;
  export PATH=/nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/bin:$PATH&lt;br /&gt;
  export PATH=/nfs/home/momeara/opt/bin:$PATH&lt;br /&gt;
  source activate sea16 &lt;br /&gt;
  cd SEAserver&lt;br /&gt;
  sh scripts/run-sea-server.sh&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
# update conda_sea16&lt;br /&gt;
&lt;br /&gt;
  ssh s_enkhee@gimel&lt;br /&gt;
  cd /nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/src/seaware-academic/&lt;br /&gt;
  export PATH=/nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/bin:$PATH&lt;br /&gt;
  export PATH=/nfs/home/momeara/opt/bin:$PATH&lt;br /&gt;
  source activate sea16&lt;br /&gt;
  git pull&lt;br /&gt;
  git submodule update --init --recursive&lt;br /&gt;
  make clean&lt;br /&gt;
  delete sea related libraries from your site-packages folder under the conda env (rm -rf /nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/lib/python2.7/site-packages/seaserver)&lt;br /&gt;
  make all&lt;br /&gt;
  kill all sea-server related processes e.g. from htop&lt;br /&gt;
  make SEAserver-start&lt;br /&gt;
 &lt;br /&gt;
&lt;br /&gt;
# run the test to check:&lt;br /&gt;
     -&amp;gt; export SEA_APP_ROOT=$CONDA_PREFIX/var/seaserver&lt;br /&gt;
     -&amp;gt; export SEA_RUN_FOLDER=$SEA_APP_ROOT/run&lt;br /&gt;
     -&amp;gt; export SEA_DATA_FOLDER=$SEA_APP_ROOT/data&lt;br /&gt;
     -&amp;gt;  python -m unittest test.test_illustrate&lt;br /&gt;
&lt;br /&gt;
 &lt;br /&gt;
  &lt;br /&gt;
&lt;br /&gt;
  &lt;br /&gt;
  &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
# update sea16&lt;br /&gt;
ssh xyz@gimel&lt;br /&gt;
cd /nfs/soft/www/apps/sea/sea16/src/seaware-academic&lt;br /&gt;
source ../../env.csh&lt;br /&gt;
git pull&lt;br /&gt;
rm -rf /nfs/soft/www/apps/sea/sea16/lib/python2.7/site-packages/seaserver&lt;br /&gt;
cd /nfs/soft/www/apps/sea/sea16/src/seaware-academic/SEAserver&lt;br /&gt;
python setup.py install&lt;br /&gt;
&lt;br /&gt;
# restart server&lt;br /&gt;
&lt;br /&gt;
ssh www@n-1-110&lt;br /&gt;
cd /nfs/soft/www/apps/sea/sea16/src/seaware-academic&lt;br /&gt;
source ../../env.csh&lt;br /&gt;
cd SEAserver&lt;br /&gt;
sh scripts/run-sea-server.sh&lt;br /&gt;
&lt;br /&gt;
# restart server&lt;br /&gt;
&lt;br /&gt;
ssh &amp;lt;superuser&amp;gt;@n-1-110&lt;br /&gt;
sudo -i&lt;br /&gt;
screen -r&lt;br /&gt;
screen -dR Sea (switch to sea screen)&lt;br /&gt;
sh scripts/run-sea-server.sh&lt;br /&gt;
&lt;br /&gt;
# how to save the old queue data&lt;br /&gt;
 &lt;br /&gt;
cd /nfs/soft/www/apps/sea/sea16/var/seaserver/queue &lt;br /&gt;
mv jobs jobs.save&lt;br /&gt;
mv tasks.sqlite tasks.sqlite.save &lt;br /&gt;
restart sea server on n-1-110.  &lt;br /&gt;
&lt;br /&gt;
(basically, it had too much history and that was what was slowing it down)&lt;br /&gt;
(do on the first day of the month and rename the old one to a month version)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Category:Curator]]&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Ligand_preparation_-_20170424&amp;diff=10071</id>
		<title>Ligand preparation - 20170424</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Ligand_preparation_-_20170424&amp;diff=10071"/>
		<updated>2017-04-24T23:39:28Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== 3 ways to build ligands: ==&lt;br /&gt;
&lt;br /&gt;
1. Source the dock environment and run the stand-alone script to build ligands for a given smiles file:&lt;br /&gt;
   &lt;br /&gt;
   setenv DOCKBASE /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk     &lt;br /&gt;
&lt;br /&gt;
   source /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.csh&lt;br /&gt;
&lt;br /&gt;
   sh $DOCKBASE/ligand/generate/build_database_ligand.sh mysmiles.smi&lt;br /&gt;
&lt;br /&gt;
2. Source the dock environment and submit jobs to build ligands for a given smiles file:&lt;br /&gt;
&lt;br /&gt;
:a. create a file with the following commands (in this example, the file&#039;s name is cmd). See below for an explanation of the arguments and change them accordingly:&lt;br /&gt;
      &lt;br /&gt;
      setenv DOCKBASE /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk &lt;br /&gt;
&lt;br /&gt;
      setenv BUILD_ENVIRONMENT /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.sh&lt;br /&gt;
&lt;br /&gt;
      /nfs/soft/tools/utils/qsub-slice/qsub-mr-meta -tc 10 --map-instance-script \&lt;br /&gt;
      &amp;quot;/nfs/soft/tools/utils/qsub-slice/qsub-mr-map.sh&amp;quot; -s $BUILD_ENVIRONMENT \&lt;br /&gt;
      -l 100 $1 $DOCKBASE/ligand/generate/build_database_ligand.sh&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
:b. run the script with a smiles file:&lt;br /&gt;
&lt;br /&gt;
      csh cmd mysmiles.smi&lt;br /&gt;
&lt;br /&gt;
What exactly does the qsub submit script do?&lt;br /&gt;
      -l 100 --&amp;gt; submit slices of 100 lines per task to run the script build_database_ligand.sh.   &lt;br /&gt;
      -tc 10 --&amp;gt; run only 10 tasks at any given time. &lt;br /&gt;
      -s $BUILD_ENVIRONMENT --&amp;gt; source the dock environment before each task is run.&lt;br /&gt;
&lt;br /&gt;
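The interaction between -l and -tc above can be illustrated with a little slice arithmetic: a SMILES file of N lines is split into ceil(N/100) tasks, of which at most 10 run at once. A sketch (the slice_count function and the 2350-line example are hypothetical):

```shell
# ceil(n_lines / lines_per_task) tasks are submitted; -tc caps how many
# run concurrently, not how many exist.
slice_count() {
    n_lines=$1
    lines_per_task=$2
    echo $(( (n_lines + lines_per_task - 1) / lines_per_task ))
}

echo "a 2350-line smiles file with -l 100 makes $(slice_count 2350 100) tasks"
```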
&lt;br /&gt;
3. The following wrapper script (wrapper_queue_build_smiles_ligand_corina.csh) breaks up the smiles file and submits the pieces to the queue. It gives more control than the &amp;quot;qsub-mr-meta&amp;quot; route, although it is less polished. &lt;br /&gt;
 #! /bin/csh&lt;br /&gt;
 &lt;br /&gt;
 # source environment. &lt;br /&gt;
 source ~tbalius/.cshrc_dbgen_corina&lt;br /&gt;
 &lt;br /&gt;
 #set number_per_db2 = 1000&lt;br /&gt;
 #set number_per_db2 = 5&lt;br /&gt;
 set number_per_db2 = 10&lt;br /&gt;
 set ph = 7.4&lt;br /&gt;
 &lt;br /&gt;
 set file = $1&lt;br /&gt;
 set fileprefix = ${file:r}&lt;br /&gt;
 &lt;br /&gt;
 &lt;br /&gt;
 set pwd = `pwd`&lt;br /&gt;
 set pathdir = $pwd&lt;br /&gt;
 set workdir = $pwd/${fileprefix}&lt;br /&gt;
 &lt;br /&gt;
 echo ${workdir}&lt;br /&gt;
 echo ${file} ${fileprefix}&lt;br /&gt;
 &lt;br /&gt;
 if (-e ${workdir}) then&lt;br /&gt;
  echo &amp;quot;${workdir} exists&amp;quot;&lt;br /&gt;
  exit&lt;br /&gt;
 endif&lt;br /&gt;
 &lt;br /&gt;
 #exit&lt;br /&gt;
 #rm -rf ${workdir}&lt;br /&gt;
 mkdir ${workdir}&lt;br /&gt;
 cd ${workdir}&lt;br /&gt;
 &lt;br /&gt;
 ln -s ../${file} .&lt;br /&gt;
 &lt;br /&gt;
 echo &amp;quot;split --lines=$number_per_db2 --suffix-length=6  ${file} ${fileprefix}_split&amp;quot;&lt;br /&gt;
 &lt;br /&gt;
 split --lines=$number_per_db2 --suffix-length=6  ${file} ${fileprefix}_split&lt;br /&gt;
 &lt;br /&gt;
 &lt;br /&gt;
 foreach splitfile ( ` ls ${fileprefix}_split* | grep -v db2.gz ` ) &lt;br /&gt;
 echo ${splitfile}&lt;br /&gt;
 &lt;br /&gt;
 # make sure that the link is pointing to something.  &lt;br /&gt;
 #set lsoutput = `ls -l sgejob_*/${splitfile}.db2.gz`&lt;br /&gt;
 #echo &amp;quot;WHAT:: $lsoutput&amp;quot;&lt;br /&gt;
 #if (&amp;quot;$lsoutput&amp;quot; == &amp;quot;&amp;quot;) then&lt;br /&gt;
 #    rm ${splitfile}.db2.gz&lt;br /&gt;
 #endif&lt;br /&gt;
 &lt;br /&gt;
 if (-e ${splitfile}.db2.gz) then&lt;br /&gt;
    echo &amp;quot;${splitfile} has been submitted for generation.&amp;quot; &lt;br /&gt;
    continue&lt;br /&gt;
 endif &lt;br /&gt;
 &lt;br /&gt;
 cat &amp;lt;&amp;lt; EOF &amp;gt;! script_qsub_${splitfile}.csh&lt;br /&gt;
 #\$ -S /bin/csh&lt;br /&gt;
 #\$ -cwd&lt;br /&gt;
 #\$ -q all.q&lt;br /&gt;
 #\$ -o stdout_${splitfile}&lt;br /&gt;
 #\$ -e stderr_${splitfile}&lt;br /&gt;
 &lt;br /&gt;
 # source environment. &lt;br /&gt;
 source ~tbalius/.cshrc_dbgen_corina&lt;br /&gt;
 &lt;br /&gt;
 hostname&lt;br /&gt;
 date&lt;br /&gt;
 &lt;br /&gt;
 set SCRATCH_DIR = /scratch&lt;br /&gt;
 if (! -d \$SCRATCH_DIR) then&lt;br /&gt;
     set SCRATCH_DIR = /tmp&lt;br /&gt;
 endif&lt;br /&gt;
 set username = `whoami`&lt;br /&gt;
 &lt;br /&gt;
 set TASK_DIR = &amp;quot;\$SCRATCH_DIR/\${username}/\$JOB_ID&amp;quot;&lt;br /&gt;
 echo \$TASK_DIR&lt;br /&gt;
 &lt;br /&gt;
 mkdir -p \${TASK_DIR}&lt;br /&gt;
 cd \${TASK_DIR}&lt;br /&gt;
 pwd&lt;br /&gt;
 &lt;br /&gt;
 cp ${workdir}/${splitfile} .&lt;br /&gt;
 &lt;br /&gt;
 # note that the pinning script&#039;s inputs should not be in quotes (&#039;&#039; or &amp;quot;&amp;quot;).&lt;br /&gt;
 /nfs/home/tbalius/zzz.github/DOCK/common/on-one-core - ${DOCKBASE}/ligand/generate/build_database_ligand.sh -H $ph ${splitfile}&lt;br /&gt;
 &lt;br /&gt;
 cd ${workdir}&lt;br /&gt;
 mkdir sgejob_\${JOB_ID}_${splitfile}&lt;br /&gt;
 &lt;br /&gt;
 echo copying&lt;br /&gt;
 ls -l \${TASK_DIR}/finished/*/*.db2.gz&lt;br /&gt;
 ls -l \${TASK_DIR}/finished/*/*.db.gz &lt;br /&gt;
 &lt;br /&gt;
 # copy the finished directory&lt;br /&gt;
 mv \${TASK_DIR}/finished/ sgejob_\${JOB_ID}_${splitfile}&lt;br /&gt;
 rm -r \${TASK_DIR}&lt;br /&gt;
 &lt;br /&gt;
 EOF&lt;br /&gt;
 &lt;br /&gt;
 while ( `qstat -u tbalius | wc -l ` &amp;gt; 10 )&lt;br /&gt;
   sleep 10&lt;br /&gt;
 end&lt;br /&gt;
 &lt;br /&gt;
 &lt;br /&gt;
 qsub script_qsub_${splitfile}.csh&lt;br /&gt;
 &lt;br /&gt;
 #exit&lt;br /&gt;
 &lt;br /&gt;
 end&lt;br /&gt;
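The `if (-e ${splitfile}.db2.gz) ... continue` check above makes resubmission idempotent: on a re-run, slices whose output archive already exists are skipped and only the missing ones are queued. A minimal POSIX-sh sketch of that resume pattern (the `part_*` file names are hypothetical stand-ins, and `qsub` is replaced by bookkeeping so the sketch runs anywhere):

```shell
#!/bin/sh
# Sketch of the resume-by-output-file pattern used in the wrapper above.
# Slices whose .db2.gz output already exists are skipped on re-run.
touch part_aa part_ab part_ac        # stand-in slice files
touch part_ab.db2.gz                 # pretend this slice already finished
submitted=""
for f in part_aa part_ab part_ac; do
    if [ -e "$f.db2.gz" ]; then
        echo "$f already done, skipping"
        continue
    fi
    submitted="$submitted $f"        # the real script would qsub here
done
echo "submitted:$submitted"
```

Only the two unfinished slices end up in the submission list, which is why the wrapper can be safely re-run after a partial failure.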
&lt;br /&gt;
== set up environment: jchem mitools, DOCK3.7, ZINC, corina == &lt;br /&gt;
&lt;br /&gt;
 source /nfs/soft/jchem/current/env.csh&lt;br /&gt;
 source /nfs/soft/mitools/env.csh&lt;br /&gt;
 source /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.csh&lt;br /&gt;
 deactivate&lt;br /&gt;
 source /nfs/soft/www/apps/zinc15/envs/edge/env.csh&lt;br /&gt;
 source /nfs/soft/corina/current/env.csh&lt;br /&gt;
 setenv EMBED_PROTOMERS_3D_EXE $DOCKBASE/ligand/3D/embed3d_corina.sh &lt;br /&gt;
 setenv ZINC_CONFIG_ENV admin&lt;br /&gt;
 setenv ZINC_CONFIG_SETUP_SKIP blueprints &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== step 2. csh cmd mysmiles.ism == &lt;br /&gt;
&lt;br /&gt;
 # put any change in the BUILD_ENVIRONMENT&lt;br /&gt;
 source /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.csh&lt;br /&gt;
 source /nfs/soft/corina/current/env.csh&lt;br /&gt;
 setenv EMBED_PROTOMERS_3D_EXE $DOCKBASE/ligand/3D/embed3d_corina.sh&lt;br /&gt;
 setenv BUILD_ENVIRONMENT /nfs/home/xyz/bin/dockenvNS.sh&lt;br /&gt;
 #&lt;br /&gt;
 /nfs/soft/tools/utils/qsub-slice/qsub-mr-meta -tc 10 -L 100000 --map-instance-script \&lt;br /&gt;
 &amp;quot;/nfs/scratch/A/xyz/protomer/qsub-mr-map.sh&amp;quot; -s $BUILD_ENVIRONMENT \&lt;br /&gt;
 -l 300 $1 $DOCKBASE/ligand/generate/build_database_ligand.sh&lt;br /&gt;
&lt;br /&gt;
If you have any trouble using this procedure (which was used to build ZINC between July 14, 2016 and Nov 14, 2016, and is ongoing),&lt;br /&gt;
please contact me. I would be happy to help debug, as it will help us make this procedure more robust.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Curator]]&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Ligand_preparation_-_2017-04&amp;diff=10070</id>
		<title>Ligand preparation - 2017-04</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Ligand_preparation_-_2017-04&amp;diff=10070"/>
		<updated>2017-04-24T23:14:35Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: Enkhjargal moved page Ligand preparation - 2017-04 to Ligand preparation - 20170424&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;#REDIRECT [[Ligand preparation - 20170424]]&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Ligand_preparation_-_20170424&amp;diff=10069</id>
		<title>Ligand preparation - 20170424</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Ligand_preparation_-_20170424&amp;diff=10069"/>
		<updated>2017-04-24T23:14:35Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: Enkhjargal moved page Ligand preparation - 2017-04 to Ligand preparation - 20170424&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== 3 ways to build ligands: ==&lt;br /&gt;
&lt;br /&gt;
1. Source the dock environment and run the stand-alone script to build ligands for a given smiles file:&lt;br /&gt;
   &lt;br /&gt;
   setenv DOCKBASE /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk     &lt;br /&gt;
&lt;br /&gt;
   source /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.csh&lt;br /&gt;
&lt;br /&gt;
   sh $DOCKBASE/ligand/generate/build_database_ligand.sh mysmiles.smi&lt;br /&gt;
&lt;br /&gt;
2. Source the dock environment and submit jobs to build ligands for a given smiles file:&lt;br /&gt;
&lt;br /&gt;
:a. create a file with following commands (in this example, the file&#039;s name is cmd ):&lt;br /&gt;
      &lt;br /&gt;
      setenv DOCKBASE /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk &lt;br /&gt;
&lt;br /&gt;
      setenv BUILD_ENVIRONMENT /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.sh&lt;br /&gt;
&lt;br /&gt;
      /nfs/soft/tools/utils/qsub-slice/qsub-mr-meta -tc 10 --map-instance-script \&lt;br /&gt;
      &amp;quot;/nfs/soft/tools/utils/qsub-slice/qsub-mr-map.sh&amp;quot; -s $BUILD_ENVIRONMENT \&lt;br /&gt;
      -l 100 $1 $DOCKBASE/ligand/generate/build_database_ligand.sh&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
:b. run the script with a smiles file:&lt;br /&gt;
&lt;br /&gt;
      csh cmd mysmiles.smi&lt;br /&gt;
&lt;br /&gt;
What exactly does the qsub submit script do?&lt;br /&gt;
      -l 100 --&amp;gt; submit slices of 100 lines per task to run the script build_database_ligand.sh.   &lt;br /&gt;
      -tc 10 --&amp;gt; run only 10 tasks at any given time. &lt;br /&gt;
      -s $BUILD_ENVIRONMENT --&amp;gt; source the dock environment before each task is run.&lt;br /&gt;
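The slicing that `-l 100` requests can be sketched with plain coreutils. This is only an illustration of the idea, not what qsub-mr-meta itself does; the `seq` stand-in input and the slice names are hypothetical:

```shell
#!/bin/sh
# Sketch of the "-l 100" slicing idea using plain coreutils.
# The input file and slice names are hypothetical stand-ins.
seq 250 > mysmiles.smi                        # stand-in for a 250-line smiles file
split --lines=100 --suffix-length=6 mysmiles.smi slice_
for s in slice_*; do
    # in the real pipeline, each slice would become one queued task
    # running build_database_ligand.sh
    echo "task: $s ($(wc -l < "$s") lines)"
done
```

With 250 input lines this yields three slices (100, 100, and 50 lines), mirroring how each grid task receives at most `-l` lines of the input.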
&lt;br /&gt;
&lt;br /&gt;
3. The following is a wrapper script (wrapper_queue_build_smiles_ligand_corina.csh) that breaks the smiles file into pieces and submits them to the queue. Users may find this script gives more control than the &amp;quot;qsub-mr-meta&amp;quot; approach, although it is less polished.&lt;br /&gt;
 #! /bin/csh&lt;br /&gt;
 &lt;br /&gt;
 # source environment. &lt;br /&gt;
 source ~tbalius/.cshrc_dbgen_corina&lt;br /&gt;
 &lt;br /&gt;
 #set number_per_db2 = 1000&lt;br /&gt;
 #set number_per_db2 = 5&lt;br /&gt;
 set number_per_db2 = 10&lt;br /&gt;
 set ph = 7.4&lt;br /&gt;
 &lt;br /&gt;
 set file = $1&lt;br /&gt;
 set fileprefix = ${file:r}&lt;br /&gt;
 &lt;br /&gt;
 &lt;br /&gt;
 set pwd = `pwd`&lt;br /&gt;
 set pathdir = $pwd&lt;br /&gt;
 set workdir = $pwd/${fileprefix}&lt;br /&gt;
 &lt;br /&gt;
 echo ${workdir}&lt;br /&gt;
 echo ${file} ${fileprefix}&lt;br /&gt;
 &lt;br /&gt;
 if (-e ${workdir}) then&lt;br /&gt;
  echo &amp;quot;${workdir} exists&amp;quot;&lt;br /&gt;
  exit&lt;br /&gt;
 endif&lt;br /&gt;
 &lt;br /&gt;
 #exit&lt;br /&gt;
 #rm -rf ${workdir}&lt;br /&gt;
 mkdir ${workdir}&lt;br /&gt;
 cd ${workdir}&lt;br /&gt;
 &lt;br /&gt;
 ln -s ../${file} .&lt;br /&gt;
 &lt;br /&gt;
 echo &amp;quot;split --lines=$number_per_db2 --suffix-length=6  ${file} ${fileprefix}_split&amp;quot;&lt;br /&gt;
 &lt;br /&gt;
 split --lines=$number_per_db2 --suffix-length=6  ${file} ${fileprefix}_split&lt;br /&gt;
 &lt;br /&gt;
 &lt;br /&gt;
 foreach splitfile ( ` ls ${fileprefix}_split* | grep -v db2.gz ` ) &lt;br /&gt;
 echo ${splitfile}&lt;br /&gt;
 &lt;br /&gt;
 # make sure that the link is pointing to something.  &lt;br /&gt;
 #set lsoutput = `ls -l sgejob_*/${splitfile}.db2.gz`&lt;br /&gt;
 #echo &amp;quot;WHAT:: $lsoutput&amp;quot;&lt;br /&gt;
 #if (&amp;quot;$lsoutput&amp;quot; == &amp;quot;&amp;quot;) then&lt;br /&gt;
 #    rm ${splitfile}.db2.gz&lt;br /&gt;
 #endif&lt;br /&gt;
 &lt;br /&gt;
 if (-e ${splitfile}.db2.gz) then&lt;br /&gt;
    echo &amp;quot;${splitfile} has already been submitted for generation.&amp;quot; &lt;br /&gt;
    continue&lt;br /&gt;
 endif &lt;br /&gt;
 &lt;br /&gt;
 cat &amp;lt;&amp;lt; EOF &amp;gt;! script_qsub_${splitfile}.csh&lt;br /&gt;
 #\$ -S /bin/csh&lt;br /&gt;
 #\$ -cwd&lt;br /&gt;
 #\$ -q all.q&lt;br /&gt;
 #\$ -o stdout_${splitfile}&lt;br /&gt;
 #\$ -e stderr_${splitfile}&lt;br /&gt;
 &lt;br /&gt;
 # source environment. &lt;br /&gt;
 source ~tbalius/.cshrc_dbgen_corina&lt;br /&gt;
 &lt;br /&gt;
 hostname&lt;br /&gt;
 date&lt;br /&gt;
 &lt;br /&gt;
 set SCRATCH_DIR = /scratch&lt;br /&gt;
 if ( ! -d \$SCRATCH_DIR ) then&lt;br /&gt;
     set SCRATCH_DIR = /tmp&lt;br /&gt;
 endif&lt;br /&gt;
 set username = `whoami`&lt;br /&gt;
 &lt;br /&gt;
 set TASK_DIR = &amp;quot;\$SCRATCH_DIR/\${username}/\$JOB_ID&amp;quot;&lt;br /&gt;
 echo \$TASK_DIR&lt;br /&gt;
 &lt;br /&gt;
 mkdir -p \${TASK_DIR}&lt;br /&gt;
 cd \${TASK_DIR}&lt;br /&gt;
 pwd&lt;br /&gt;
 &lt;br /&gt;
 cp ${workdir}/${splitfile} .&lt;br /&gt;
 &lt;br /&gt;
 # note that the pinning script&#039;s inputs should not be in quotes (&#039;&#039; or &amp;quot;&amp;quot;).&lt;br /&gt;
 /nfs/home/tbalius/zzz.github/DOCK/common/on-one-core - ${DOCKBASE}/ligand/generate/build_database_ligand.sh -H $ph ${splitfile}&lt;br /&gt;
 &lt;br /&gt;
 cd ${workdir}&lt;br /&gt;
 mkdir sgejob_\${JOB_ID}_${splitfile}&lt;br /&gt;
 &lt;br /&gt;
 echo copying&lt;br /&gt;
 ls -l \${TASK_DIR}/finished/*/*.db2.gz&lt;br /&gt;
 ls -l \${TASK_DIR}/finished/*/*.db.gz &lt;br /&gt;
 &lt;br /&gt;
 # copy the finished directory&lt;br /&gt;
 mv \${TASK_DIR}/finished/ sgejob_\${JOB_ID}_${splitfile}&lt;br /&gt;
 rm -r \${TASK_DIR}&lt;br /&gt;
 &lt;br /&gt;
 EOF&lt;br /&gt;
 &lt;br /&gt;
 while ( `qstat -u tbalius | wc -l ` &amp;gt; 10 )&lt;br /&gt;
   sleep 10&lt;br /&gt;
 end&lt;br /&gt;
 &lt;br /&gt;
 &lt;br /&gt;
 qsub script_qsub_${splitfile}.csh&lt;br /&gt;
 &lt;br /&gt;
 #exit&lt;br /&gt;
 &lt;br /&gt;
 end&lt;br /&gt;
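The wrapper's throttle loop (`while` on `qstat -u tbalius | wc -l`, then `sleep 10`) simply blocks until the user's pending-job count drops below a threshold before the next `qsub`. A minimal POSIX-sh sketch of the same idea; `count_jobs` is a fake stand-in for the `qstat ... | wc -l` pipeline so the sketch runs without a grid engine:

```shell
#!/bin/sh
# Hedged sketch of the submission throttle above, in POSIX sh.
# count_jobs is a fake stand-in for `qstat -u $USER | wc -l`.
jobs_pending=13
count_jobs() { echo "$jobs_pending"; }

while [ "$(count_jobs)" -gt 10 ]; do
    # the real script sleeps 10 s and polls qstat again;
    # here we just pretend jobs drain one per poll
    jobs_pending=$((jobs_pending - 1))
done
echo "queue below threshold: $jobs_pending jobs pending"
```

The loop exits as soon as the count is no longer above 10, which is what keeps at most ~10 of these build jobs queued at once.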
&lt;br /&gt;
== set up environment: jchem mitools, DOCK3.7, ZINC, corina == &lt;br /&gt;
&lt;br /&gt;
 source /nfs/soft/jchem/current/env.csh&lt;br /&gt;
 source /nfs/soft/mitools/env.csh&lt;br /&gt;
 source /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.csh&lt;br /&gt;
 deactivate&lt;br /&gt;
 source /nfs/soft/www/apps/zinc15/envs/edge/env.csh&lt;br /&gt;
 source /nfs/soft/corina/current/env.csh&lt;br /&gt;
 setenv EMBED_PROTOMERS_3D_EXE $DOCKBASE/ligand/3D/embed3d_corina.sh &lt;br /&gt;
 setenv ZINC_CONFIG_ENV admin&lt;br /&gt;
 setenv ZINC_CONFIG_SETUP_SKIP blueprints &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== step 2. csh cmd mysmiles.ism == &lt;br /&gt;
&lt;br /&gt;
 # put any change in the BUILD_ENVIRONMENT&lt;br /&gt;
 source /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.csh&lt;br /&gt;
 source /nfs/soft/corina/current/env.csh&lt;br /&gt;
 setenv EMBED_PROTOMERS_3D_EXE $DOCKBASE/ligand/3D/embed3d_corina.sh&lt;br /&gt;
 setenv BUILD_ENVIRONMENT /nfs/home/xyz/bin/dockenvNS.sh&lt;br /&gt;
 #&lt;br /&gt;
 /nfs/soft/tools/utils/qsub-slice/qsub-mr-meta -tc 10 -L 100000 --map-instance-script \&lt;br /&gt;
 &amp;quot;/nfs/scratch/A/xyz/protomer/qsub-mr-map.sh&amp;quot; -s $BUILD_ENVIRONMENT \&lt;br /&gt;
 -l 300 $1 $DOCKBASE/ligand/generate/build_database_ligand.sh&lt;br /&gt;
&lt;br /&gt;
If you have any trouble using this procedure (which was used to build ZINC between July 14, 2016 and Nov 14, 2016, and is ongoing),&lt;br /&gt;
please contact me. I would be happy to help debug, as it will help us make this procedure more robust.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Curator]]&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Ligand_preparation_-_20170424&amp;diff=10068</id>
		<title>Ligand preparation - 20170424</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Ligand_preparation_-_20170424&amp;diff=10068"/>
		<updated>2017-04-24T23:13:16Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== 3 ways to build ligands: ==&lt;br /&gt;
&lt;br /&gt;
1. Source the dock environment and run the stand-alone script to build ligands for a given smiles file:&lt;br /&gt;
   &lt;br /&gt;
   setenv DOCKBASE /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk     &lt;br /&gt;
&lt;br /&gt;
   source /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.csh&lt;br /&gt;
&lt;br /&gt;
   sh $DOCKBASE/ligand/generate/build_database_ligand.sh mysmiles.smi&lt;br /&gt;
&lt;br /&gt;
2. Source the dock environment and submit jobs to build ligands for a given smiles file:&lt;br /&gt;
&lt;br /&gt;
:a. create a file with following commands (in this example, the file&#039;s name is cmd ):&lt;br /&gt;
      &lt;br /&gt;
      setenv DOCKBASE /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk &lt;br /&gt;
&lt;br /&gt;
      setenv BUILD_ENVIRONMENT /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.sh&lt;br /&gt;
&lt;br /&gt;
      /nfs/soft/tools/utils/qsub-slice/qsub-mr-meta -tc 10 --map-instance-script \&lt;br /&gt;
      &amp;quot;/nfs/soft/tools/utils/qsub-slice/qsub-mr-map.sh&amp;quot; -s $BUILD_ENVIRONMENT \&lt;br /&gt;
      -l 100 $1 $DOCKBASE/ligand/generate/build_database_ligand.sh&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
:b. run the script with a smiles file:&lt;br /&gt;
&lt;br /&gt;
      csh cmd mysmiles.smi&lt;br /&gt;
&lt;br /&gt;
What exactly does the qsub submit script do?&lt;br /&gt;
      -l 100 --&amp;gt; submit slices of 100 lines per task to run the script build_database_ligand.sh.   &lt;br /&gt;
      -tc 10 --&amp;gt; run only 10 tasks at any given time. &lt;br /&gt;
      -s $BUILD_ENVIRONMENT --&amp;gt; source the dock environment before each task is run.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
3. The following is a wrapper script (wrapper_queue_build_smiles_ligand_corina.csh) that breaks the smiles file into pieces and submits them to the queue. Users may find this script gives more control than the &amp;quot;qsub-mr-meta&amp;quot; approach, although it is less polished.&lt;br /&gt;
 #! /bin/csh&lt;br /&gt;
 &lt;br /&gt;
 # source environment. &lt;br /&gt;
 source ~tbalius/.cshrc_dbgen_corina&lt;br /&gt;
 &lt;br /&gt;
 #set number_per_db2 = 1000&lt;br /&gt;
 #set number_per_db2 = 5&lt;br /&gt;
 set number_per_db2 = 10&lt;br /&gt;
 set ph = 7.4&lt;br /&gt;
 &lt;br /&gt;
 set file = $1&lt;br /&gt;
 set fileprefix = ${file:r}&lt;br /&gt;
 &lt;br /&gt;
 &lt;br /&gt;
 set pwd = `pwd`&lt;br /&gt;
 set pathdir = $pwd&lt;br /&gt;
 set workdir = $pwd/${fileprefix}&lt;br /&gt;
 &lt;br /&gt;
 echo ${workdir}&lt;br /&gt;
 echo ${file} ${fileprefix}&lt;br /&gt;
 &lt;br /&gt;
 if (-e ${workdir}) then&lt;br /&gt;
  echo &amp;quot;${workdir} exists&amp;quot;&lt;br /&gt;
  exit&lt;br /&gt;
 endif&lt;br /&gt;
 &lt;br /&gt;
 #exit&lt;br /&gt;
 #rm -rf ${workdir}&lt;br /&gt;
 mkdir ${workdir}&lt;br /&gt;
 cd ${workdir}&lt;br /&gt;
 &lt;br /&gt;
 ln -s ../${file} .&lt;br /&gt;
 &lt;br /&gt;
 echo &amp;quot;split --lines=$number_per_db2 --suffix-length=6  ${file} ${fileprefix}_split&amp;quot;&lt;br /&gt;
 &lt;br /&gt;
 split --lines=$number_per_db2 --suffix-length=6  ${file} ${fileprefix}_split&lt;br /&gt;
 &lt;br /&gt;
 &lt;br /&gt;
 foreach splitfile ( ` ls ${fileprefix}_split* | grep -v db2.gz ` ) &lt;br /&gt;
 echo ${splitfile}&lt;br /&gt;
 &lt;br /&gt;
 # make sure that the link is pointing to something.  &lt;br /&gt;
 #set lsoutput = `ls -l sgejob_*/${splitfile}.db2.gz`&lt;br /&gt;
 #echo &amp;quot;WHAT:: $lsoutput&amp;quot;&lt;br /&gt;
 #if (&amp;quot;$lsoutput&amp;quot; == &amp;quot;&amp;quot;) then&lt;br /&gt;
 #    rm ${splitfile}.db2.gz&lt;br /&gt;
 #endif&lt;br /&gt;
 &lt;br /&gt;
 if (-e ${splitfile}.db2.gz) then&lt;br /&gt;
    echo &amp;quot;${splitfile} has already been submitted for generation.&amp;quot; &lt;br /&gt;
    continue&lt;br /&gt;
 endif &lt;br /&gt;
 &lt;br /&gt;
 cat &amp;lt;&amp;lt; EOF &amp;gt;! script_qsub_${splitfile}.csh&lt;br /&gt;
 #\$ -S /bin/csh&lt;br /&gt;
 #\$ -cwd&lt;br /&gt;
 #\$ -q all.q&lt;br /&gt;
 #\$ -o stdout_${splitfile}&lt;br /&gt;
 #\$ -e stderr_${splitfile}&lt;br /&gt;
 &lt;br /&gt;
 # source environment. &lt;br /&gt;
 source ~tbalius/.cshrc_dbgen_corina&lt;br /&gt;
 &lt;br /&gt;
 hostname&lt;br /&gt;
 date&lt;br /&gt;
 &lt;br /&gt;
 set SCRATCH_DIR = /scratch&lt;br /&gt;
 if ( ! -d \$SCRATCH_DIR ) then&lt;br /&gt;
     set SCRATCH_DIR = /tmp&lt;br /&gt;
 endif&lt;br /&gt;
 set username = `whoami`&lt;br /&gt;
 &lt;br /&gt;
 set TASK_DIR = &amp;quot;\$SCRATCH_DIR/\${username}/\$JOB_ID&amp;quot;&lt;br /&gt;
 echo \$TASK_DIR&lt;br /&gt;
 &lt;br /&gt;
 mkdir -p \${TASK_DIR}&lt;br /&gt;
 cd \${TASK_DIR}&lt;br /&gt;
 pwd&lt;br /&gt;
 &lt;br /&gt;
 cp ${workdir}/${splitfile} .&lt;br /&gt;
 &lt;br /&gt;
 # note that the pinning script&#039;s inputs should not be in quotes (&#039;&#039; or &amp;quot;&amp;quot;).&lt;br /&gt;
 /nfs/home/tbalius/zzz.github/DOCK/common/on-one-core - ${DOCKBASE}/ligand/generate/build_database_ligand.sh -H $ph ${splitfile}&lt;br /&gt;
 &lt;br /&gt;
 cd ${workdir}&lt;br /&gt;
 mkdir sgejob_\${JOB_ID}_${splitfile}&lt;br /&gt;
 &lt;br /&gt;
 echo copying&lt;br /&gt;
 ls -l \${TASK_DIR}/finished/*/*.db2.gz&lt;br /&gt;
 ls -l \${TASK_DIR}/finished/*/*.db.gz &lt;br /&gt;
 &lt;br /&gt;
 # copy the finished directory&lt;br /&gt;
 mv \${TASK_DIR}/finished/ sgejob_\${JOB_ID}_${splitfile}&lt;br /&gt;
 rm -r \${TASK_DIR}&lt;br /&gt;
 &lt;br /&gt;
 EOF&lt;br /&gt;
 &lt;br /&gt;
 while ( `qstat -u tbalius | wc -l ` &amp;gt; 10 )&lt;br /&gt;
   sleep 10&lt;br /&gt;
 end&lt;br /&gt;
 &lt;br /&gt;
 &lt;br /&gt;
 qsub script_qsub_${splitfile}.csh&lt;br /&gt;
 &lt;br /&gt;
 #exit&lt;br /&gt;
 &lt;br /&gt;
 end&lt;br /&gt;
&lt;br /&gt;
== set up environment: jchem mitools, DOCK3.7, ZINC, corina == &lt;br /&gt;
&lt;br /&gt;
 source /nfs/soft/jchem/current/env.csh&lt;br /&gt;
 source /nfs/soft/mitools/env.csh&lt;br /&gt;
 source /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.csh&lt;br /&gt;
 deactivate&lt;br /&gt;
 source /nfs/soft/www/apps/zinc15/envs/edge/env.csh&lt;br /&gt;
 source /nfs/soft/corina/current/env.csh&lt;br /&gt;
 setenv EMBED_PROTOMERS_3D_EXE $DOCKBASE/ligand/3D/embed3d_corina.sh &lt;br /&gt;
 setenv ZINC_CONFIG_ENV admin&lt;br /&gt;
 setenv ZINC_CONFIG_SETUP_SKIP blueprints &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== step 2. csh cmd mysmiles.ism == &lt;br /&gt;
&lt;br /&gt;
 # put any change in the BUILD_ENVIRONMENT&lt;br /&gt;
 source /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.csh&lt;br /&gt;
 source /nfs/soft/corina/current/env.csh&lt;br /&gt;
 setenv EMBED_PROTOMERS_3D_EXE $DOCKBASE/ligand/3D/embed3d_corina.sh&lt;br /&gt;
 setenv BUILD_ENVIRONMENT /nfs/home/xyz/bin/dockenvNS.sh&lt;br /&gt;
 #&lt;br /&gt;
 /nfs/soft/tools/utils/qsub-slice/qsub-mr-meta -tc 10 -L 100000 --map-instance-script \&lt;br /&gt;
 &amp;quot;/nfs/scratch/A/xyz/protomer/qsub-mr-map.sh&amp;quot; -s $BUILD_ENVIRONMENT \&lt;br /&gt;
 -l 300 $1 $DOCKBASE/ligand/generate/build_database_ligand.sh&lt;br /&gt;
&lt;br /&gt;
If you have any trouble using this procedure (which was used to build ZINC between July 14, 2016 and Nov 14, 2016, and is ongoing),&lt;br /&gt;
please contact me. I would be happy to help debug, as it will help us make this procedure more robust.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Curator]]&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Ligand_preparation_-_20170424&amp;diff=10067</id>
		<title>Ligand preparation - 20170424</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Ligand_preparation_-_20170424&amp;diff=10067"/>
		<updated>2017-04-24T23:00:08Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== 3 ways to build ligands: ==&lt;br /&gt;
&lt;br /&gt;
1. Source the dock environment and run the stand-alone script to build ligands for a given smiles file:&lt;br /&gt;
   &lt;br /&gt;
   setenv DOCKBASE /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk     &lt;br /&gt;
&lt;br /&gt;
   source /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.csh&lt;br /&gt;
&lt;br /&gt;
   sh $DOCKBASE/ligand/generate/build_database_ligand.sh mysmiles.smi&lt;br /&gt;
&lt;br /&gt;
2. Source the dock environment and submit jobs:&lt;br /&gt;
&lt;br /&gt;
:a. create a file with following commands (in this example, the file&#039;s name is cmd ):&lt;br /&gt;
      &lt;br /&gt;
      setenv DOCKBASE /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk &lt;br /&gt;
&lt;br /&gt;
      setenv BUILD_ENVIRONMENT /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.sh&lt;br /&gt;
&lt;br /&gt;
      /nfs/soft/tools/utils/qsub-slice/qsub-mr-meta -tc 10 --map-instance-script \&lt;br /&gt;
      &amp;quot;/nfs/soft/tools/utils/qsub-slice/qsub-mr-map.sh&amp;quot; -s $BUILD_ENVIRONMENT \&lt;br /&gt;
      -l 100 $1 $DOCKBASE/ligand/generate/build_database_ligand.sh&lt;br /&gt;
&lt;br /&gt;
:b. run the script with a smiles file:&lt;br /&gt;
&lt;br /&gt;
      csh cmd mysmiles.smi&lt;br /&gt;
&lt;br /&gt;
What exactly does the qsub submit script do?&lt;br /&gt;
      -l 100 --&amp;gt; submit slices of 100 lines per task to run the script build_database_ligand.sh.   &lt;br /&gt;
      -tc 10 --&amp;gt; run only 10 tasks at any given time. &lt;br /&gt;
      -s $BUILD_ENVIRONMENT --&amp;gt; source the dock environment before each task is run.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
3. The following is a wrapper script (wrapper_queue_build_smiles_ligand_corina.csh) that breaks the smiles file into pieces and submits them to the queue. Users may find this script gives more control than the &amp;quot;qsub-mr-meta&amp;quot; approach, although it is less polished.&lt;br /&gt;
 #! /bin/csh&lt;br /&gt;
 &lt;br /&gt;
 # source environment. &lt;br /&gt;
 source ~tbalius/.cshrc_dbgen_corina&lt;br /&gt;
 &lt;br /&gt;
 #set number_per_db2 = 1000&lt;br /&gt;
 #set number_per_db2 = 5&lt;br /&gt;
 set number_per_db2 = 10&lt;br /&gt;
 set ph = 7.4&lt;br /&gt;
 &lt;br /&gt;
 set file = $1&lt;br /&gt;
 set fileprefix = ${file:r}&lt;br /&gt;
 &lt;br /&gt;
 &lt;br /&gt;
 set pwd = `pwd`&lt;br /&gt;
 set pathdir = $pwd&lt;br /&gt;
 set workdir = $pwd/${fileprefix}&lt;br /&gt;
 &lt;br /&gt;
 echo ${workdir}&lt;br /&gt;
 echo ${file} ${fileprefix}&lt;br /&gt;
 &lt;br /&gt;
 if (-e ${workdir}) then&lt;br /&gt;
  echo &amp;quot;${workdir} exists&amp;quot;&lt;br /&gt;
  exit&lt;br /&gt;
 endif&lt;br /&gt;
 &lt;br /&gt;
 #exit&lt;br /&gt;
 #rm -rf ${workdir}&lt;br /&gt;
 mkdir ${workdir}&lt;br /&gt;
 cd ${workdir}&lt;br /&gt;
 &lt;br /&gt;
 ln -s ../${file} .&lt;br /&gt;
 &lt;br /&gt;
 echo &amp;quot;split --lines=$number_per_db2 --suffix-length=6  ${file} ${fileprefix}_split&amp;quot;&lt;br /&gt;
 &lt;br /&gt;
 split --lines=$number_per_db2 --suffix-length=6  ${file} ${fileprefix}_split&lt;br /&gt;
 &lt;br /&gt;
 &lt;br /&gt;
 foreach splitfile ( ` ls ${fileprefix}_split* | grep -v db2.gz ` ) &lt;br /&gt;
 echo ${splitfile}&lt;br /&gt;
 &lt;br /&gt;
 # make sure that the link is pointing to something.  &lt;br /&gt;
 #set lsoutput = `ls -l sgejob_*/${splitfile}.db2.gz`&lt;br /&gt;
 #echo &amp;quot;WHAT:: $lsoutput&amp;quot;&lt;br /&gt;
 #if (&amp;quot;$lsoutput&amp;quot; == &amp;quot;&amp;quot;) then&lt;br /&gt;
 #    rm ${splitfile}.db2.gz&lt;br /&gt;
 #endif&lt;br /&gt;
 &lt;br /&gt;
 if (-e ${splitfile}.db2.gz) then&lt;br /&gt;
    echo &amp;quot;${splitfile} has already been submitted for generation.&amp;quot; &lt;br /&gt;
    continue&lt;br /&gt;
 endif &lt;br /&gt;
 &lt;br /&gt;
 cat &amp;lt;&amp;lt; EOF &amp;gt;! script_qsub_${splitfile}.csh&lt;br /&gt;
 #\$ -S /bin/csh&lt;br /&gt;
 #\$ -cwd&lt;br /&gt;
 #\$ -q all.q&lt;br /&gt;
 #\$ -o stdout_${splitfile}&lt;br /&gt;
 #\$ -e stderr_${splitfile}&lt;br /&gt;
 &lt;br /&gt;
 # source environment. &lt;br /&gt;
 source ~tbalius/.cshrc_dbgen_corina&lt;br /&gt;
 &lt;br /&gt;
 hostname&lt;br /&gt;
 date&lt;br /&gt;
 &lt;br /&gt;
 set SCRATCH_DIR = /scratch&lt;br /&gt;
 if ( ! -d \$SCRATCH_DIR ) then&lt;br /&gt;
     set SCRATCH_DIR = /tmp&lt;br /&gt;
 endif&lt;br /&gt;
 set username = `whoami`&lt;br /&gt;
 &lt;br /&gt;
 set TASK_DIR = &amp;quot;\$SCRATCH_DIR/\${username}/\$JOB_ID&amp;quot;&lt;br /&gt;
 echo \$TASK_DIR&lt;br /&gt;
 &lt;br /&gt;
 mkdir -p \${TASK_DIR}&lt;br /&gt;
 cd \${TASK_DIR}&lt;br /&gt;
 pwd&lt;br /&gt;
 &lt;br /&gt;
 cp ${workdir}/${splitfile} .&lt;br /&gt;
 &lt;br /&gt;
 # note that the pinning script&#039;s inputs should not be in quotes (&#039;&#039; or &amp;quot;&amp;quot;).&lt;br /&gt;
 /nfs/home/tbalius/zzz.github/DOCK/common/on-one-core - ${DOCKBASE}/ligand/generate/build_database_ligand.sh -H $ph ${splitfile}&lt;br /&gt;
 &lt;br /&gt;
 cd ${workdir}&lt;br /&gt;
 mkdir sgejob_\${JOB_ID}_${splitfile}&lt;br /&gt;
 &lt;br /&gt;
 echo copying&lt;br /&gt;
 ls -l \${TASK_DIR}/finished/*/*.db2.gz&lt;br /&gt;
 ls -l \${TASK_DIR}/finished/*/*.db.gz &lt;br /&gt;
 &lt;br /&gt;
 # copy the finished directory&lt;br /&gt;
 mv \${TASK_DIR}/finished/ sgejob_\${JOB_ID}_${splitfile}&lt;br /&gt;
 rm -r \${TASK_DIR}&lt;br /&gt;
 &lt;br /&gt;
 EOF&lt;br /&gt;
 &lt;br /&gt;
 while ( `qstat -u tbalius | wc -l ` &amp;gt; 10 )&lt;br /&gt;
   sleep 10&lt;br /&gt;
 end&lt;br /&gt;
 &lt;br /&gt;
 &lt;br /&gt;
 qsub script_qsub_${splitfile}.csh&lt;br /&gt;
 &lt;br /&gt;
 #exit&lt;br /&gt;
 &lt;br /&gt;
 end&lt;br /&gt;
&lt;br /&gt;
== set up environment: jchem mitools, DOCK3.7, ZINC, corina == &lt;br /&gt;
&lt;br /&gt;
 source /nfs/soft/jchem/current/env.csh&lt;br /&gt;
 source /nfs/soft/mitools/env.csh&lt;br /&gt;
 source /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.csh&lt;br /&gt;
 deactivate&lt;br /&gt;
 source /nfs/soft/www/apps/zinc15/envs/edge/env.csh&lt;br /&gt;
 source /nfs/soft/corina/current/env.csh&lt;br /&gt;
 setenv EMBED_PROTOMERS_3D_EXE $DOCKBASE/ligand/3D/embed3d_corina.sh &lt;br /&gt;
 setenv ZINC_CONFIG_ENV admin&lt;br /&gt;
 setenv ZINC_CONFIG_SETUP_SKIP blueprints &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== step 2. csh cmd mysmiles.ism == &lt;br /&gt;
&lt;br /&gt;
 # put any change in the BUILD_ENVIRONMENT&lt;br /&gt;
 source /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.csh&lt;br /&gt;
 source /nfs/soft/corina/current/env.csh&lt;br /&gt;
 setenv EMBED_PROTOMERS_3D_EXE $DOCKBASE/ligand/3D/embed3d_corina.sh&lt;br /&gt;
 setenv BUILD_ENVIRONMENT /nfs/home/xyz/bin/dockenvNS.sh&lt;br /&gt;
 #&lt;br /&gt;
 /nfs/soft/tools/utils/qsub-slice/qsub-mr-meta -tc 10 -L 100000 --map-instance-script \&lt;br /&gt;
 &amp;quot;/nfs/scratch/A/xyz/protomer/qsub-mr-map.sh&amp;quot; -s $BUILD_ENVIRONMENT \&lt;br /&gt;
 -l 300 $1 $DOCKBASE/ligand/generate/build_database_ligand.sh&lt;br /&gt;
&lt;br /&gt;
If you have any trouble using this procedure (which was used to build ZINC between July 14, 2016 and Nov 14, 2016, and is ongoing),&lt;br /&gt;
please contact me. I would be happy to help debug, as it will help us make this procedure more robust.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Curator]]&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Ligand_preparation_-_20170424&amp;diff=10066</id>
		<title>Ligand preparation - 20170424</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Ligand_preparation_-_20170424&amp;diff=10066"/>
		<updated>2017-04-24T22:59:31Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== 3 ways to build ligands: ==&lt;br /&gt;
&lt;br /&gt;
1. Source the dock environment and run the stand-alone script to build ligands for a given smiles file:&lt;br /&gt;
   &lt;br /&gt;
   setenv DOCKBASE /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk     &lt;br /&gt;
&lt;br /&gt;
   source /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.csh&lt;br /&gt;
&lt;br /&gt;
   sh $DOCKBASE/ligand/generate/build_database_ligand.sh mysmiles.smi&lt;br /&gt;
&lt;br /&gt;
2. Source the dock environment and submit jobs:&lt;br /&gt;
&lt;br /&gt;
:a. create a file with following commands (in this example, the file&#039;s name is cmd ):&lt;br /&gt;
      &lt;br /&gt;
      setenv DOCKBASE /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk &lt;br /&gt;
&lt;br /&gt;
      setenv BUILD_ENVIRONMENT /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.sh&lt;br /&gt;
&lt;br /&gt;
      /nfs/soft/tools/utils/qsub-slice/qsub-mr-meta -tc 10 --map-instance-script \&lt;br /&gt;
      &amp;quot;/nfs/soft/tools/utils/qsub-slice/qsub-mr-map.sh&amp;quot; -s $BUILD_ENVIRONMENT \&lt;br /&gt;
      -l 100 $1 $DOCKBASE/ligand/generate/build_database_ligand.sh&lt;br /&gt;
&lt;br /&gt;
:b. run the script:&lt;br /&gt;
&lt;br /&gt;
      csh cmd mysmiles.smi&lt;br /&gt;
&lt;br /&gt;
What exactly does the qsub submit script do?&lt;br /&gt;
      -l 100 --&amp;gt; submit slices of 100 lines per task to run the script build_database_ligand.sh.   &lt;br /&gt;
      -tc 10 --&amp;gt; run only 10 tasks at any given time. &lt;br /&gt;
      -s $BUILD_ENVIRONMENT --&amp;gt; source the dock environment before each task is run.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
3. The following is a wrapper script (wrapper_queue_build_smiles_ligand_corina.csh) that breaks the smiles file into pieces and submits them to the queue. Users may find this script gives more control than the &amp;quot;qsub-mr-meta&amp;quot; approach, although it is less polished.&lt;br /&gt;
 #! /bin/csh&lt;br /&gt;
 &lt;br /&gt;
 # source environment. &lt;br /&gt;
 source ~tbalius/.cshrc_dbgen_corina&lt;br /&gt;
 &lt;br /&gt;
 #set number_per_db2 = 1000&lt;br /&gt;
 #set number_per_db2 = 5&lt;br /&gt;
 set number_per_db2 = 10&lt;br /&gt;
 set ph = 7.4&lt;br /&gt;
 &lt;br /&gt;
 set file = $1&lt;br /&gt;
 set fileprefix = ${file:r}&lt;br /&gt;
 &lt;br /&gt;
 &lt;br /&gt;
 set pwd = `pwd`&lt;br /&gt;
 set pathdir = $pwd&lt;br /&gt;
 set workdir = $pwd/${fileprefix}&lt;br /&gt;
 &lt;br /&gt;
 echo ${workdir}&lt;br /&gt;
 echo ${file} ${fileprefix}&lt;br /&gt;
 &lt;br /&gt;
 if (-e ${workdir}) then&lt;br /&gt;
  echo &amp;quot;${workdir} exists&amp;quot;&lt;br /&gt;
  exit&lt;br /&gt;
 endif&lt;br /&gt;
 &lt;br /&gt;
 #exit&lt;br /&gt;
 #rm -rf ${workdir}&lt;br /&gt;
 mkdir ${workdir}&lt;br /&gt;
 cd ${workdir}&lt;br /&gt;
 &lt;br /&gt;
 ln -s ../${file} .&lt;br /&gt;
 &lt;br /&gt;
 echo &amp;quot;split --lines=$number_per_db2 --suffix-length=6  ${file} ${fileprefix}_split&amp;quot;&lt;br /&gt;
 &lt;br /&gt;
 split --lines=$number_per_db2 --suffix-length=6  ${file} ${fileprefix}_split&lt;br /&gt;
 &lt;br /&gt;
 &lt;br /&gt;
 foreach splitfile ( ` ls ${fileprefix}_split* | grep -v db2.gz ` ) &lt;br /&gt;
 echo ${splitfile}&lt;br /&gt;
 &lt;br /&gt;
 # make sure that the link is pointing to something.  &lt;br /&gt;
 #set lsoutput = `ls -l sgejob_*/${splitfile}.db2.gz`&lt;br /&gt;
 #echo &amp;quot;WHAT:: $lsoutput&amp;quot;&lt;br /&gt;
 #if (&amp;quot;$lsoutput&amp;quot; == &amp;quot;&amp;quot;) then&lt;br /&gt;
 #    rm ${splitfile}.db2.gz&lt;br /&gt;
 #endif&lt;br /&gt;
 &lt;br /&gt;
 if (-e ${splitfile}.db2.gz) then&lt;br /&gt;
    echo &amp;quot;${splitfile} has already been submitted for generation.&amp;quot; &lt;br /&gt;
    continue&lt;br /&gt;
 endif &lt;br /&gt;
 &lt;br /&gt;
 cat &amp;lt;&amp;lt; EOF &amp;gt;! script_qsub_${splitfile}.csh&lt;br /&gt;
 #\$ -S /bin/csh&lt;br /&gt;
 #\$ -cwd&lt;br /&gt;
 #\$ -q all.q&lt;br /&gt;
 #\$ -o stdout_${splitfile}&lt;br /&gt;
 #\$ -e stderr_${splitfile}&lt;br /&gt;
 &lt;br /&gt;
 # source environment. &lt;br /&gt;
 source ~tbalius/.cshrc_dbgen_corina&lt;br /&gt;
 &lt;br /&gt;
 hostname&lt;br /&gt;
 date&lt;br /&gt;
 &lt;br /&gt;
 set SCRATCH_DIR = /scratch&lt;br /&gt;
 if ( ! -d \$SCRATCH_DIR ) then&lt;br /&gt;
     set SCRATCH_DIR = /tmp&lt;br /&gt;
 endif&lt;br /&gt;
 set username = `whoami`&lt;br /&gt;
 &lt;br /&gt;
 set TASK_DIR = &amp;quot;\$SCRATCH_DIR/\${username}/\$JOB_ID&amp;quot;&lt;br /&gt;
 echo \$TASK_DIR&lt;br /&gt;
 &lt;br /&gt;
 mkdir -p \${TASK_DIR}&lt;br /&gt;
 cd \${TASK_DIR}&lt;br /&gt;
 pwd&lt;br /&gt;
 &lt;br /&gt;
 cp ${workdir}/${splitfile} .&lt;br /&gt;
 &lt;br /&gt;
 # note that the pinning script&#039;s inputs should not be in quotes (&#039;&#039; or &amp;quot;&amp;quot;).&lt;br /&gt;
 /nfs/home/tbalius/zzz.github/DOCK/common/on-one-core - ${DOCKBASE}/ligand/generate/build_database_ligand.sh -H $ph ${splitfile}&lt;br /&gt;
 &lt;br /&gt;
 cd ${workdir}&lt;br /&gt;
 mkdir sgejob_\${JOB_ID}_${splitfile}&lt;br /&gt;
 &lt;br /&gt;
 echo copying&lt;br /&gt;
 ls -l \${TASK_DIR}/finished/*/*.db2.gz&lt;br /&gt;
 ls -l \${TASK_DIR}/finished/*/*.db.gz &lt;br /&gt;
 &lt;br /&gt;
 # copy the finished directory&lt;br /&gt;
 mv \${TASK_DIR}/finished/ sgejob_\${JOB_ID}_${splitfile}&lt;br /&gt;
 rm -r \${TASK_DIR}&lt;br /&gt;
 &lt;br /&gt;
 EOF&lt;br /&gt;
 &lt;br /&gt;
 while ( `qstat -u tbalius | wc -l ` &amp;gt; 10 )&lt;br /&gt;
   sleep 10&lt;br /&gt;
 end&lt;br /&gt;
 &lt;br /&gt;
 &lt;br /&gt;
 qsub script_qsub_${splitfile}.csh&lt;br /&gt;
 &lt;br /&gt;
 #exit&lt;br /&gt;
 &lt;br /&gt;
 end&lt;br /&gt;
&lt;br /&gt;
== set up environment: jchem, mitools, DOCK3.7, ZINC, corina ==&lt;br /&gt;
&lt;br /&gt;
 source /nfs/soft/jchem/current/env.csh&lt;br /&gt;
 source /nfs/soft/mitools/env.csh&lt;br /&gt;
 source /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.csh&lt;br /&gt;
 deactivate&lt;br /&gt;
 source /nfs/soft/www/apps/zinc15/envs/edge/env.csh&lt;br /&gt;
 source /nfs/soft/corina/current/env.csh&lt;br /&gt;
 setenv EMBED_PROTOMERS_3D_EXE $DOCKBASE/ligand/3D/embed3d_corina.sh &lt;br /&gt;
 setenv ZINC_CONFIG_ENV admin&lt;br /&gt;
 setenv ZINC_CONFIG_SETUP_SKIP blueprints &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== step 2. csh cmd mysmiles.ism == &lt;br /&gt;
&lt;br /&gt;
 # put any change in the BUILD_ENVIRONMENT&lt;br /&gt;
 source /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.csh&lt;br /&gt;
 source /nfs/soft/corina/current/env.csh&lt;br /&gt;
 setenv EMBED_PROTOMERS_3D_EXE $DOCKBASE/ligand/3D/embed3d_corina.sh&lt;br /&gt;
 setenv BUILD_ENVIRONMENT /nfs/home/xyz/bin/dockenvNS.sh&lt;br /&gt;
 #&lt;br /&gt;
 /nfs/soft/tools/utils/qsub-slice/qsub-mr-meta -tc 10 -L 100000 --map-instance-script \&lt;br /&gt;
 &amp;quot;/nfs/scratch/A/xyz/protomer/qsub-mr-map.sh&amp;quot; -s $BUILD_ENVIRONMENT \&lt;br /&gt;
 -l 300 $1 $DOCKBASE/ligand/generate/build_database_ligand.sh&lt;br /&gt;
&lt;br /&gt;
If you have any trouble using this procedure (which was used to build ZINC between July 14, 2016 and Nov 14, 2016, and is still in use),&lt;br /&gt;
please contact me. I would be happy to help debug, as that will help us make this procedure more robust.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Curator]]&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Ligand_preparation_-_20170424&amp;diff=10065</id>
		<title>Ligand preparation - 20170424</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Ligand_preparation_-_20170424&amp;diff=10065"/>
		<updated>2017-04-24T22:58:35Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== 3 ways to build ligands: ==&lt;br /&gt;
&lt;br /&gt;
1. Source the dock environment and run the stand-alone script to build ligands for a given smiles file:&lt;br /&gt;
   &lt;br /&gt;
   setenv DOCKBASE /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk     &lt;br /&gt;
&lt;br /&gt;
   source /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.csh&lt;br /&gt;
&lt;br /&gt;
   sh $DOCKBASE/ligand/generate/build_database_ligand.sh mysmiles.smi&lt;br /&gt;
&lt;br /&gt;
2. Source the dock environment and submit jobs with qsub-mr-meta:&lt;br /&gt;
&lt;br /&gt;
:a. create a file with the following commands (in this example, the file is named cmd):&lt;br /&gt;
      &lt;br /&gt;
      setenv DOCKBASE /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk &lt;br /&gt;
&lt;br /&gt;
      setenv BUILD_ENVIRONMENT /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.sh&lt;br /&gt;
&lt;br /&gt;
      /nfs/soft/tools/utils/qsub-slice/qsub-mr-meta -tc 10 --map-instance-script \&lt;br /&gt;
      &amp;quot;/nfs/soft/tools/utils/qsub-slice/qsub-mr-map.sh&amp;quot; -s $BUILD_ENVIRONMENT \&lt;br /&gt;
      -l 100 $1 $DOCKBASE/ligand/generate/build_database_ligand.sh&lt;br /&gt;
&lt;br /&gt;
:b. run the script:&lt;br /&gt;
&lt;br /&gt;
      csh cmd mysmiles.smi&lt;br /&gt;
&lt;br /&gt;
What exactly does the qsub submit script do?&lt;br /&gt;
      -l 100 --&amp;gt; submit slices of 100 lines per task to run the script build_database_ligand.sh.&lt;br /&gt;
      -tc 10 --&amp;gt; run only 10 tasks at any given time.&lt;br /&gt;
      -s $BUILD_ENVIRONMENT --&amp;gt; source the dock environment before each task is run.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
3. The following wrapper script (wrapper_queue_build_smiles_ligand_corina.csh) breaks up the smiles file and submits the pieces to the queue. Users may find that this script gives more control than the &amp;quot;qsub-mr-meta&amp;quot; approach, although it is less polished.&lt;br /&gt;
 #! /bin/csh&lt;br /&gt;
 &lt;br /&gt;
 # source environment.&lt;br /&gt;
 source ~tbalius/.cshrc_dbgen_corina&lt;br /&gt;
 &lt;br /&gt;
 #set number_per_db2 = 1000&lt;br /&gt;
 #set number_per_db2 = 5&lt;br /&gt;
 set number_per_db2 = 10&lt;br /&gt;
 set ph = 7.4&lt;br /&gt;
 &lt;br /&gt;
 set file = $1&lt;br /&gt;
 set fileprefix = ${file:r}&lt;br /&gt;
 &lt;br /&gt;
 &lt;br /&gt;
 set pwd = `pwd`&lt;br /&gt;
 set pathdir = $pwd&lt;br /&gt;
 set workdir = $pwd/${fileprefix}&lt;br /&gt;
 &lt;br /&gt;
 echo ${workdir}&lt;br /&gt;
 echo ${file} ${fileprefix}&lt;br /&gt;
 &lt;br /&gt;
 if (-e ${workdir}) then&lt;br /&gt;
  echo &amp;quot;${workdir} exists&amp;quot;&lt;br /&gt;
  exit&lt;br /&gt;
 endif&lt;br /&gt;
 &lt;br /&gt;
 #exit&lt;br /&gt;
 #rm -rf ${workdir}&lt;br /&gt;
 mkdir ${workdir}&lt;br /&gt;
 cd ${workdir}&lt;br /&gt;
 &lt;br /&gt;
 ln -s ../${file} .&lt;br /&gt;
 &lt;br /&gt;
 echo &amp;quot;split --lines=$number_per_db2 --suffix-length=6  ${file} ${fileprefix}_split&amp;quot;&lt;br /&gt;
 &lt;br /&gt;
 split --lines=$number_per_db2 --suffix-length=6  ${file} ${fileprefix}_split&lt;br /&gt;
 &lt;br /&gt;
 &lt;br /&gt;
 foreach splitfile ( ` ls ${fileprefix}_split* | grep -v db2.gz ` ) &lt;br /&gt;
 echo ${splitfile}&lt;br /&gt;
 &lt;br /&gt;
 # make sure that the link is pointing to something.  &lt;br /&gt;
 #set lsoutput = `ls -l sgejob_*/${splitfile}.db2.gz`&lt;br /&gt;
 #echo &amp;quot;WHAT:: $lsoutput&amp;quot;&lt;br /&gt;
 #if (&amp;quot;$lsoutput&amp;quot; == &amp;quot;&amp;quot;) then&lt;br /&gt;
 #    rm ${splitfile}.db2.gz&lt;br /&gt;
 #endif&lt;br /&gt;
 &lt;br /&gt;
 if (-e ${splitfile}.db2.gz) then&lt;br /&gt;
    echo &amp;quot;${splitfile} has already been submitted for generation.&amp;quot;&lt;br /&gt;
    continue&lt;br /&gt;
 endif &lt;br /&gt;
 &lt;br /&gt;
 cat &amp;lt;&amp;lt; EOF &amp;gt;! script_qsub_${splitfile}.csh&lt;br /&gt;
 #\$ -S /bin/csh&lt;br /&gt;
 #\$ -cwd&lt;br /&gt;
 #\$ -q all.q&lt;br /&gt;
 #\$ -o stdout_${splitfile}&lt;br /&gt;
 #\$ -e stderr_${splitfile}&lt;br /&gt;
 &lt;br /&gt;
 # source environment.&lt;br /&gt;
 source ~tbalius/.cshrc_dbgen_corina&lt;br /&gt;
 &lt;br /&gt;
 hostname&lt;br /&gt;
 date&lt;br /&gt;
 &lt;br /&gt;
 set SCRATCH_DIR = /scratch&lt;br /&gt;
 if ( ! -d \$SCRATCH_DIR ) then&lt;br /&gt;
     set SCRATCH_DIR = /tmp&lt;br /&gt;
 endif&lt;br /&gt;
 set username = `whoami`&lt;br /&gt;
 &lt;br /&gt;
 set TASK_DIR = &amp;quot;\$SCRATCH_DIR/\${username}/\$JOB_ID&amp;quot;&lt;br /&gt;
 echo \$TASK_DIR&lt;br /&gt;
 &lt;br /&gt;
 mkdir -p \${TASK_DIR}&lt;br /&gt;
 cd \${TASK_DIR}&lt;br /&gt;
 pwd&lt;br /&gt;
 &lt;br /&gt;
 cp ${workdir}/${splitfile} .&lt;br /&gt;
 &lt;br /&gt;
 # note that the pinning script&#039;s inputs should not be in quotes (&#039;&#039; or &amp;quot;&amp;quot;).&lt;br /&gt;
 /nfs/home/tbalius/zzz.github/DOCK/common/on-one-core - ${DOCKBASE}/ligand/generate/build_database_ligand.sh -H $ph ${splitfile}&lt;br /&gt;
 &lt;br /&gt;
 cd ${workdir}&lt;br /&gt;
 mkdir sgejob_\${JOB_ID}_${splitfile}&lt;br /&gt;
 &lt;br /&gt;
 echo copying&lt;br /&gt;
 ls -l \${TASK_DIR}/finished/*/*.db2.gz&lt;br /&gt;
 ls -l \${TASK_DIR}/finished/*/*.db.gz &lt;br /&gt;
 &lt;br /&gt;
 # move the finished directory&lt;br /&gt;
 mv \${TASK_DIR}/finished/ sgejob_\${JOB_ID}_${splitfile}&lt;br /&gt;
 rm -r \${TASK_DIR}&lt;br /&gt;
 &lt;br /&gt;
 EOF&lt;br /&gt;
 &lt;br /&gt;
 while ( `qstat -u tbalius | wc -l ` &amp;gt; 10 )&lt;br /&gt;
   sleep 10&lt;br /&gt;
 end&lt;br /&gt;
 &lt;br /&gt;
 &lt;br /&gt;
 qsub script_qsub_${splitfile}.csh&lt;br /&gt;
 &lt;br /&gt;
 #exit&lt;br /&gt;
 &lt;br /&gt;
 end&lt;br /&gt;
&lt;br /&gt;
== set up environment: jchem, mitools, DOCK3.7, ZINC, corina ==&lt;br /&gt;
&lt;br /&gt;
 source /nfs/soft/jchem/current/env.csh&lt;br /&gt;
 source /nfs/soft/mitools/env.csh&lt;br /&gt;
 source /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.csh&lt;br /&gt;
 deactivate&lt;br /&gt;
 source /nfs/soft/www/apps/zinc15/envs/edge/env.csh&lt;br /&gt;
 source /nfs/soft/corina/current/env.csh&lt;br /&gt;
 setenv EMBED_PROTOMERS_3D_EXE $DOCKBASE/ligand/3D/embed3d_corina.sh &lt;br /&gt;
 setenv ZINC_CONFIG_ENV admin&lt;br /&gt;
 setenv ZINC_CONFIG_SETUP_SKIP blueprints &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== step 2. csh cmd mysmiles.ism == &lt;br /&gt;
&lt;br /&gt;
 # put any change in the BUILD_ENVIRONMENT&lt;br /&gt;
 source /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.csh&lt;br /&gt;
 source /nfs/soft/corina/current/env.csh&lt;br /&gt;
 setenv EMBED_PROTOMERS_3D_EXE $DOCKBASE/ligand/3D/embed3d_corina.sh&lt;br /&gt;
 setenv BUILD_ENVIRONMENT /nfs/home/xyz/bin/dockenvNS.sh&lt;br /&gt;
 #&lt;br /&gt;
 /nfs/soft/tools/utils/qsub-slice/qsub-mr-meta -tc 10 -L 100000 --map-instance-script \&lt;br /&gt;
 &amp;quot;/nfs/scratch/A/xyz/protomer/qsub-mr-map.sh&amp;quot; -s $BUILD_ENVIRONMENT \&lt;br /&gt;
 -l 300 $1 $DOCKBASE/ligand/generate/build_database_ligand.sh&lt;br /&gt;
&lt;br /&gt;
If you have any trouble using this procedure (which was used to build ZINC between July 14, 2016 and Nov 14, 2016, and is still in use),&lt;br /&gt;
please contact me. I would be happy to help debug, as that will help us make this procedure more robust.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Curator]]&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Ligand_prep_Irwin_Nov_2016&amp;diff=10064</id>
		<title>Ligand prep Irwin Nov 2016</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Ligand_prep_Irwin_Nov_2016&amp;diff=10064"/>
		<updated>2017-04-24T22:57:27Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: Enkhjargal moved page Ligand prep Irwin Nov 2016 to Ligand preparation - 2017-04&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;#REDIRECT [[Ligand preparation - 2017-04]]&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Ligand_preparation_-_20170424&amp;diff=10063</id>
		<title>Ligand preparation - 20170424</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Ligand_preparation_-_20170424&amp;diff=10063"/>
		<updated>2017-04-24T22:57:26Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: Enkhjargal moved page Ligand prep Irwin Nov 2016 to Ligand preparation - 2017-04&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== 3 ways to build ligands: ==&lt;br /&gt;
&lt;br /&gt;
1. Source the dock environment and run the stand-alone script to build ligands for a given smiles file:&lt;br /&gt;
   &lt;br /&gt;
   setenv DOCKBASE /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk     &lt;br /&gt;
&lt;br /&gt;
   source /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.csh&lt;br /&gt;
&lt;br /&gt;
   sh $DOCKBASE/ligand/generate/build_database_ligand.sh mysmiles.smi&lt;br /&gt;
&lt;br /&gt;
2. Source the dock environment and submit jobs with qsub-mr-meta:&lt;br /&gt;
&lt;br /&gt;
:a. create a file with the following commands (in this example, the file is named cmd):&lt;br /&gt;
      &lt;br /&gt;
      setenv DOCKBASE /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk &lt;br /&gt;
&lt;br /&gt;
      setenv BUILD_ENVIRONMENT /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.sh&lt;br /&gt;
&lt;br /&gt;
      /nfs/soft/tools/utils/qsub-slice/qsub-mr-meta -tc 10 --map-instance-script \&lt;br /&gt;
      &amp;quot;/nfs/soft/tools/utils/qsub-slice/qsub-mr-map.sh&amp;quot; -s $BUILD_ENVIRONMENT \&lt;br /&gt;
      -l 100 $1 $DOCKBASE/ligand/generate/build_database_ligand.sh&lt;br /&gt;
&lt;br /&gt;
:b. run the script:&lt;br /&gt;
&lt;br /&gt;
      csh cmd mysmiles.smi&lt;br /&gt;
&lt;br /&gt;
What exactly does the qsub submit script do?&lt;br /&gt;
      -l 100 --&amp;gt; submit slices of 100 lines per task to run the script build_database_ligand.sh.&lt;br /&gt;
      -tc 10 --&amp;gt; run only 10 tasks at any given time.&lt;br /&gt;
      -s $BUILD_ENVIRONMENT --&amp;gt; source the dock environment before each task is run.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
3. The following wrapper script (wrapper_queue_build_smiles_ligand_corina.csh) breaks up the smiles file and submits the pieces to the queue. Users may find that this script gives more control than the &amp;quot;qsub-mr-meta&amp;quot; approach, although it is less polished.&lt;br /&gt;
 #! /bin/csh&lt;br /&gt;
 &lt;br /&gt;
 # source environment.&lt;br /&gt;
 source ~tbalius/.cshrc_dbgen_corina&lt;br /&gt;
 &lt;br /&gt;
 #set number_per_db2 = 1000&lt;br /&gt;
 #set number_per_db2 = 5&lt;br /&gt;
 set number_per_db2 = 10&lt;br /&gt;
 set ph = 7.4&lt;br /&gt;
 &lt;br /&gt;
 set file = $1&lt;br /&gt;
 set fileprefix = ${file:r}&lt;br /&gt;
 &lt;br /&gt;
 &lt;br /&gt;
 set pwd = `pwd`&lt;br /&gt;
 set pathdir = $pwd&lt;br /&gt;
 set workdir = $pwd/${fileprefix}&lt;br /&gt;
 &lt;br /&gt;
 echo ${workdir}&lt;br /&gt;
 echo ${file} ${fileprefix}&lt;br /&gt;
 &lt;br /&gt;
 if (-e ${workdir}) then&lt;br /&gt;
  echo &amp;quot;${workdir} exists&amp;quot;&lt;br /&gt;
  exit&lt;br /&gt;
 endif&lt;br /&gt;
 &lt;br /&gt;
 #exit&lt;br /&gt;
 #rm -rf ${workdir}&lt;br /&gt;
 mkdir ${workdir}&lt;br /&gt;
 cd ${workdir}&lt;br /&gt;
 &lt;br /&gt;
 ln -s ../${file} .&lt;br /&gt;
 &lt;br /&gt;
 echo &amp;quot;split --lines=$number_per_db2 --suffix-length=6  ${file} ${fileprefix}_split&amp;quot;&lt;br /&gt;
 &lt;br /&gt;
 split --lines=$number_per_db2 --suffix-length=6  ${file} ${fileprefix}_split&lt;br /&gt;
 &lt;br /&gt;
 &lt;br /&gt;
 foreach splitfile ( ` ls ${fileprefix}_split* | grep -v db2.gz ` ) &lt;br /&gt;
 echo ${splitfile}&lt;br /&gt;
 &lt;br /&gt;
 # make sure that the link is pointing to something.  &lt;br /&gt;
 #set lsoutput = `ls -l sgejob_*/${splitfile}.db2.gz`&lt;br /&gt;
 #echo &amp;quot;WHAT:: $lsoutput&amp;quot;&lt;br /&gt;
 #if (&amp;quot;$lsoutput&amp;quot; == &amp;quot;&amp;quot;) then&lt;br /&gt;
 #    rm ${splitfile}.db2.gz&lt;br /&gt;
 #endif&lt;br /&gt;
 &lt;br /&gt;
 if (-e ${splitfile}.db2.gz) then&lt;br /&gt;
    echo &amp;quot;${splitfile} has already been submitted for generation.&amp;quot;&lt;br /&gt;
    continue&lt;br /&gt;
 endif &lt;br /&gt;
 &lt;br /&gt;
 cat &amp;lt;&amp;lt; EOF &amp;gt;! script_qsub_${splitfile}.csh&lt;br /&gt;
 #\$ -S /bin/csh&lt;br /&gt;
 #\$ -cwd&lt;br /&gt;
 #\$ -q all.q&lt;br /&gt;
 #\$ -o stdout_${splitfile}&lt;br /&gt;
 #\$ -e stderr_${splitfile}&lt;br /&gt;
 &lt;br /&gt;
 # source environment.&lt;br /&gt;
 source ~tbalius/.cshrc_dbgen_corina&lt;br /&gt;
 &lt;br /&gt;
 hostname&lt;br /&gt;
 date&lt;br /&gt;
 &lt;br /&gt;
 set SCRATCH_DIR = /scratch&lt;br /&gt;
 if ( ! -d \$SCRATCH_DIR ) then&lt;br /&gt;
     set SCRATCH_DIR = /tmp&lt;br /&gt;
 endif&lt;br /&gt;
 set username = `whoami`&lt;br /&gt;
 &lt;br /&gt;
 set TASK_DIR = &amp;quot;\$SCRATCH_DIR/\${username}/\$JOB_ID&amp;quot;&lt;br /&gt;
 echo \$TASK_DIR&lt;br /&gt;
 &lt;br /&gt;
 mkdir -p \${TASK_DIR}&lt;br /&gt;
 cd \${TASK_DIR}&lt;br /&gt;
 pwd&lt;br /&gt;
 &lt;br /&gt;
 cp ${workdir}/${splitfile} .&lt;br /&gt;
 &lt;br /&gt;
 # note that the pinning script&#039;s inputs should not be in quotes (&#039;&#039; or &amp;quot;&amp;quot;).&lt;br /&gt;
 /nfs/home/tbalius/zzz.github/DOCK/common/on-one-core - ${DOCKBASE}/ligand/generate/build_database_ligand.sh -H $ph ${splitfile}&lt;br /&gt;
 &lt;br /&gt;
 cd ${workdir}&lt;br /&gt;
 mkdir sgejob_\${JOB_ID}_${splitfile}&lt;br /&gt;
 &lt;br /&gt;
 echo copying&lt;br /&gt;
 ls -l \${TASK_DIR}/finished/*/*.db2.gz&lt;br /&gt;
 ls -l \${TASK_DIR}/finished/*/*.db.gz &lt;br /&gt;
 &lt;br /&gt;
 # move the finished directory&lt;br /&gt;
 mv \${TASK_DIR}/finished/ sgejob_\${JOB_ID}_${splitfile}&lt;br /&gt;
 rm -r \${TASK_DIR}&lt;br /&gt;
 &lt;br /&gt;
 EOF&lt;br /&gt;
 &lt;br /&gt;
 while ( `qstat -u tbalius | wc -l ` &amp;gt; 10 )&lt;br /&gt;
   sleep 10&lt;br /&gt;
 end&lt;br /&gt;
 &lt;br /&gt;
 &lt;br /&gt;
 qsub script_qsub_${splitfile}.csh&lt;br /&gt;
 &lt;br /&gt;
 #exit&lt;br /&gt;
 &lt;br /&gt;
 end&lt;br /&gt;
&lt;br /&gt;
== set up environment: jchem, mitools, DOCK3.7, ZINC, corina ==&lt;br /&gt;
&lt;br /&gt;
 source /nfs/soft/jchem/current/env.csh&lt;br /&gt;
 source /nfs/soft/mitools/env.csh&lt;br /&gt;
 source /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.csh&lt;br /&gt;
 deactivate&lt;br /&gt;
 source /nfs/soft/www/apps/zinc15/envs/edge/env.csh&lt;br /&gt;
 source /nfs/soft/corina/current/env.csh&lt;br /&gt;
 setenv EMBED_PROTOMERS_3D_EXE $DOCKBASE/ligand/3D/embed3d_corina.sh &lt;br /&gt;
 setenv ZINC_CONFIG_ENV admin&lt;br /&gt;
 setenv ZINC_CONFIG_SETUP_SKIP blueprints &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== step 2. csh cmd mysmiles.ism == &lt;br /&gt;
&lt;br /&gt;
 # put any change in the BUILD_ENVIRONMENT&lt;br /&gt;
 source /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.csh&lt;br /&gt;
 source /nfs/soft/corina/current/env.csh&lt;br /&gt;
 setenv EMBED_PROTOMERS_3D_EXE $DOCKBASE/ligand/3D/embed3d_corina.sh&lt;br /&gt;
 setenv BUILD_ENVIRONMENT /nfs/home/xyz/bin/dockenvNS.sh&lt;br /&gt;
 #&lt;br /&gt;
 /nfs/soft/tools/utils/qsub-slice/qsub-mr-meta -tc 10 -L 100000 --map-instance-script \&lt;br /&gt;
 &amp;quot;/nfs/scratch/A/xyz/protomer/qsub-mr-map.sh&amp;quot; -s $BUILD_ENVIRONMENT \&lt;br /&gt;
 -l 300 $1 $DOCKBASE/ligand/generate/build_database_ligand.sh&lt;br /&gt;
&lt;br /&gt;
If you have any trouble using this procedure (which was used to build ZINC between July 14, 2016 and Nov 14, 2016, and is still in use),&lt;br /&gt;
please contact me. I would be happy to help debug, as that will help us make this procedure more robust.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Curator]]&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Ligand_preparation_-_20170424&amp;diff=10062</id>
		<title>Ligand preparation - 20170424</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Ligand_preparation_-_20170424&amp;diff=10062"/>
		<updated>2017-04-24T22:56:41Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== 3 ways to build ligands: ==&lt;br /&gt;
&lt;br /&gt;
1. Source the dock environment and run the stand-alone script to build ligands for a given smiles file:&lt;br /&gt;
   &lt;br /&gt;
   setenv DOCKBASE /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk     &lt;br /&gt;
&lt;br /&gt;
   source /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.csh&lt;br /&gt;
&lt;br /&gt;
   sh $DOCKBASE/ligand/generate/build_database_ligand.sh mysmiles.smi&lt;br /&gt;
&lt;br /&gt;
2. Source the dock environment and submit jobs with qsub-mr-meta:&lt;br /&gt;
&lt;br /&gt;
:a. create a file with the following commands (in this example, the file is named cmd):&lt;br /&gt;
      &lt;br /&gt;
      setenv DOCKBASE /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk &lt;br /&gt;
&lt;br /&gt;
      setenv BUILD_ENVIRONMENT /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.sh&lt;br /&gt;
&lt;br /&gt;
      /nfs/soft/tools/utils/qsub-slice/qsub-mr-meta -tc 10 --map-instance-script \&lt;br /&gt;
      &amp;quot;/nfs/soft/tools/utils/qsub-slice/qsub-mr-map.sh&amp;quot; -s $BUILD_ENVIRONMENT \&lt;br /&gt;
      -l 100 $1 $DOCKBASE/ligand/generate/build_database_ligand.sh&lt;br /&gt;
&lt;br /&gt;
:b. run the script:&lt;br /&gt;
&lt;br /&gt;
      csh cmd mysmiles.smi&lt;br /&gt;
&lt;br /&gt;
What exactly does the qsub submit script do?&lt;br /&gt;
      -l 100 --&amp;gt; submit slices of 100 lines per task to run the script build_database_ligand.sh.&lt;br /&gt;
      -tc 10 --&amp;gt; run only 10 tasks at any given time.&lt;br /&gt;
      -s $BUILD_ENVIRONMENT --&amp;gt; source the dock environment before each task is run.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
3. The following wrapper script (wrapper_queue_build_smiles_ligand_corina.csh) breaks up the smiles file and submits the pieces to the queue. Users may find that this script gives more control than the &amp;quot;qsub-mr-meta&amp;quot; approach, although it is less polished.&lt;br /&gt;
 #! /bin/csh&lt;br /&gt;
 &lt;br /&gt;
 # source environment.&lt;br /&gt;
 source ~tbalius/.cshrc_dbgen_corina&lt;br /&gt;
 &lt;br /&gt;
 #set number_per_db2 = 1000&lt;br /&gt;
 #set number_per_db2 = 5&lt;br /&gt;
 set number_per_db2 = 10&lt;br /&gt;
 set ph = 7.4&lt;br /&gt;
 &lt;br /&gt;
 set file = $1&lt;br /&gt;
 set fileprefix = ${file:r}&lt;br /&gt;
 &lt;br /&gt;
 &lt;br /&gt;
 set pwd = `pwd`&lt;br /&gt;
 set pathdir = $pwd&lt;br /&gt;
 set workdir = $pwd/${fileprefix}&lt;br /&gt;
 &lt;br /&gt;
 echo ${workdir}&lt;br /&gt;
 echo ${file} ${fileprefix}&lt;br /&gt;
 &lt;br /&gt;
 if (-e ${workdir}) then&lt;br /&gt;
  echo &amp;quot;${workdir} exists&amp;quot;&lt;br /&gt;
  exit&lt;br /&gt;
 endif&lt;br /&gt;
 &lt;br /&gt;
 #exit&lt;br /&gt;
 #rm -rf ${workdir}&lt;br /&gt;
 mkdir ${workdir}&lt;br /&gt;
 cd ${workdir}&lt;br /&gt;
 &lt;br /&gt;
 ln -s ../${file} .&lt;br /&gt;
 &lt;br /&gt;
 echo &amp;quot;split --lines=$number_per_db2 --suffix-length=6  ${file} ${fileprefix}_split&amp;quot;&lt;br /&gt;
 &lt;br /&gt;
 split --lines=$number_per_db2 --suffix-length=6  ${file} ${fileprefix}_split&lt;br /&gt;
 &lt;br /&gt;
 &lt;br /&gt;
 foreach splitfile ( ` ls ${fileprefix}_split* | grep -v db2.gz ` ) &lt;br /&gt;
 echo ${splitfile}&lt;br /&gt;
 &lt;br /&gt;
 # make sure that the link is pointing to something.  &lt;br /&gt;
 #set lsoutput = `ls -l sgejob_*/${splitfile}.db2.gz`&lt;br /&gt;
 #echo &amp;quot;WHAT:: $lsoutput&amp;quot;&lt;br /&gt;
 #if (&amp;quot;$lsoutput&amp;quot; == &amp;quot;&amp;quot;) then&lt;br /&gt;
 #    rm ${splitfile}.db2.gz&lt;br /&gt;
 #endif&lt;br /&gt;
 &lt;br /&gt;
 if (-e ${splitfile}.db2.gz) then&lt;br /&gt;
    echo &amp;quot;${splitfile} has already been submitted for generation.&amp;quot;&lt;br /&gt;
    continue&lt;br /&gt;
 endif &lt;br /&gt;
 &lt;br /&gt;
 cat &amp;lt;&amp;lt; EOF &amp;gt;! script_qsub_${splitfile}.csh&lt;br /&gt;
 #\$ -S /bin/csh&lt;br /&gt;
 #\$ -cwd&lt;br /&gt;
 #\$ -q all.q&lt;br /&gt;
 #\$ -o stdout_${splitfile}&lt;br /&gt;
 #\$ -e stderr_${splitfile}&lt;br /&gt;
 &lt;br /&gt;
 # source environment.&lt;br /&gt;
 source ~tbalius/.cshrc_dbgen_corina&lt;br /&gt;
 &lt;br /&gt;
 hostname&lt;br /&gt;
 date&lt;br /&gt;
 &lt;br /&gt;
 set SCRATCH_DIR = /scratch&lt;br /&gt;
 if ( ! -d \$SCRATCH_DIR ) then&lt;br /&gt;
     set SCRATCH_DIR = /tmp&lt;br /&gt;
 endif&lt;br /&gt;
 set username = `whoami`&lt;br /&gt;
 &lt;br /&gt;
 set TASK_DIR = &amp;quot;\$SCRATCH_DIR/\${username}/\$JOB_ID&amp;quot;&lt;br /&gt;
 echo \$TASK_DIR&lt;br /&gt;
 &lt;br /&gt;
 mkdir -p \${TASK_DIR}&lt;br /&gt;
 cd \${TASK_DIR}&lt;br /&gt;
 pwd&lt;br /&gt;
 &lt;br /&gt;
 cp ${workdir}/${splitfile} .&lt;br /&gt;
 &lt;br /&gt;
 # note that the pinning script&#039;s inputs should not be in quotes (&#039;&#039; or &amp;quot;&amp;quot;).&lt;br /&gt;
 /nfs/home/tbalius/zzz.github/DOCK/common/on-one-core - ${DOCKBASE}/ligand/generate/build_database_ligand.sh -H $ph ${splitfile}&lt;br /&gt;
 &lt;br /&gt;
 cd ${workdir}&lt;br /&gt;
 mkdir sgejob_\${JOB_ID}_${splitfile}&lt;br /&gt;
 &lt;br /&gt;
 echo copying&lt;br /&gt;
 ls -l \${TASK_DIR}/finished/*/*.db2.gz&lt;br /&gt;
 ls -l \${TASK_DIR}/finished/*/*.db.gz &lt;br /&gt;
 &lt;br /&gt;
 # move the finished directory&lt;br /&gt;
 mv \${TASK_DIR}/finished/ sgejob_\${JOB_ID}_${splitfile}&lt;br /&gt;
 rm -r \${TASK_DIR}&lt;br /&gt;
 &lt;br /&gt;
 EOF&lt;br /&gt;
 &lt;br /&gt;
 while ( `qstat -u tbalius | wc -l ` &amp;gt; 10 )&lt;br /&gt;
   sleep 10&lt;br /&gt;
 end&lt;br /&gt;
 &lt;br /&gt;
 &lt;br /&gt;
 qsub script_qsub_${splitfile}.csh&lt;br /&gt;
 &lt;br /&gt;
 #exit&lt;br /&gt;
 &lt;br /&gt;
 end&lt;br /&gt;
&lt;br /&gt;
== set up environment: jchem, mitools, DOCK3.7, ZINC, corina ==&lt;br /&gt;
&lt;br /&gt;
 source /nfs/soft/jchem/current/env.csh&lt;br /&gt;
 source /nfs/soft/mitools/env.csh&lt;br /&gt;
 source /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.csh&lt;br /&gt;
 deactivate&lt;br /&gt;
 source /nfs/soft/www/apps/zinc15/envs/edge/env.csh&lt;br /&gt;
 source /nfs/soft/corina/current/env.csh&lt;br /&gt;
 setenv EMBED_PROTOMERS_3D_EXE $DOCKBASE/ligand/3D/embed3d_corina.sh &lt;br /&gt;
 setenv ZINC_CONFIG_ENV admin&lt;br /&gt;
 setenv ZINC_CONFIG_SETUP_SKIP blueprints &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== step 2. csh cmd mysmiles.ism == &lt;br /&gt;
&lt;br /&gt;
 # put any change in the BUILD_ENVIRONMENT&lt;br /&gt;
 source /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.csh&lt;br /&gt;
 source /nfs/soft/corina/current/env.csh&lt;br /&gt;
 setenv EMBED_PROTOMERS_3D_EXE $DOCKBASE/ligand/3D/embed3d_corina.sh&lt;br /&gt;
 setenv BUILD_ENVIRONMENT /nfs/home/xyz/bin/dockenvNS.sh&lt;br /&gt;
 #&lt;br /&gt;
 /nfs/soft/tools/utils/qsub-slice/qsub-mr-meta -tc 10 -L 100000 --map-instance-script \&lt;br /&gt;
 &amp;quot;/nfs/scratch/A/xyz/protomer/qsub-mr-map.sh&amp;quot; -s $BUILD_ENVIRONMENT \&lt;br /&gt;
 -l 300 $1 $DOCKBASE/ligand/generate/build_database_ligand.sh&lt;br /&gt;
&lt;br /&gt;
If you have any trouble using this procedure (which was used to build ZINC between July 14, 2016 and Nov 14, 2016, and is still in use),&lt;br /&gt;
please contact me. I would be happy to help debug, as that will help us make this procedure more robust.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Curator]]&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Excipient_server_restart&amp;diff=10061</id>
		<title>Excipient server restart</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Excipient_server_restart&amp;diff=10061"/>
		<updated>2017-04-24T22:10:41Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Excipient Server Restart:]]&lt;br /&gt;
&lt;br /&gt;
  1. ssh gimel&lt;br /&gt;
  2. become www&lt;br /&gt;
  3. activate the production server -&amp;gt;&lt;br /&gt;
     cd /nfs/soft/www/apps/excipients/envs/production/&lt;br /&gt;
     source bin/activate&lt;br /&gt;
  4. gunicorn --access-logfile excipients.ucsf.bkslab.org.acc --max-requests 1000 --bind 10.20.0.31:8093 excipients:app --workers 5 --timeout 1000&lt;br /&gt;
&lt;br /&gt;
[[Excipient Installment:]]&lt;br /&gt;
&lt;br /&gt;
  1. Go to the dev version of Excipients and set the version number in the __init__.py file&lt;br /&gt;
  2. Create the distribution file&lt;br /&gt;
     python setup.py sdist&lt;br /&gt;
  3. ssh gimel&lt;br /&gt;
  4. su - www&lt;br /&gt;
  5. activate the production server -&amp;gt;&lt;br /&gt;
     cd /nfs/soft/www/apps/excipients/envs/production/&lt;br /&gt;
     source bin/activate&lt;br /&gt;
  6. copy over the created dist folder in /nfs/soft/www/apps/excipients/envs/production/ &lt;br /&gt;
  7. Run -&amp;gt; &lt;br /&gt;
     pip install dist/CERSI-Excipients-X.Y.Z.tar.gz&lt;br /&gt;
  8.&lt;br /&gt;
     gunicorn --access-logfile excipients.ucsf.bkslab.org.acc --max-requests 1000 --bind 10.20.0.31:8093 excipients:app --workers 5 --timeout 1000&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
[[HowTos for the new version of Excipients]]&lt;br /&gt;
&lt;br /&gt;
 1. Create a copy of existing production database&lt;br /&gt;
&lt;br /&gt;
 2. In the flask code, change the database url to the newly created db &lt;br /&gt;
&lt;br /&gt;
 3. Run db upgrade + db migrate to create new tables and new columns for existing tables (done automatically by flask db-migrate, which maps the model objects to the postgres db)&lt;br /&gt;
    python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py manage-db upgrade&lt;br /&gt;
    python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py manage-db migrate&lt;br /&gt;
&lt;br /&gt;
 4. Load the openfda regulatory_status data (populates &#039;Status&#039; table in db):&lt;br /&gt;
    -&amp;gt;  create a file named regulatory_status.csv&lt;br /&gt;
        touch regulatory_status.csv&lt;br /&gt;
    -&amp;gt; copy over the regulatory_status definitions from &lt;br /&gt;
    python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py parse_status_definition /home/enkhjargal/PycharmProjects/Excipients/data/regulatory_status.csv&lt;br /&gt;
 &lt;br /&gt;
 5. Load the openfda excipients function data (populates &#039;Function&#039; table in db):&lt;br /&gt;
    python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py parse_function_definition /home/enkhjargal/PycharmProjects/Excipients/data/function_definition.csv&lt;br /&gt;
&lt;br /&gt;
 6. Load the dye relation data (creates the relations between excipients and dye functions):&lt;br /&gt;
    -&amp;gt; https://www.fda.gov/ForIndustry/ColorAdditives/ColorAdditiveInventories/ucm106626.htm&lt;br /&gt;
    -&amp;gt; python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py load_color_additives_relationship /home/enkhjargal/PycharmProjects/Excipients/data/FDA_color_additives_all&lt;br /&gt;
&lt;br /&gt;
 7. Load the FDA general additives data (creates all other function and status relations to excipients):&lt;br /&gt;
    -&amp;gt; create a file named FDA_additives&lt;br /&gt;
       touch FDA_additives&lt;br /&gt;
    -&amp;gt; copy over the additive list from this page (https://www.fda.gov/Food/IngredientsPackagingLabeling/FoodAdditivesIngredients/ucm091048.htm)&lt;br /&gt;
    -&amp;gt; parse the file to load it to db table&lt;br /&gt;
       python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py load_function_and_status_relationship /home/enkhjargal/PycharmProjects/Excipients/data/FDA_additives&lt;br /&gt;
&lt;br /&gt;
 8. Load the openfda drug label data (populates brand and substance tables and their relations to existing excipients):&lt;br /&gt;
      -&amp;gt; https://open.fda.gov/downloads/ (6 json files to download)&lt;br /&gt;
      -&amp;gt; parse each of them:&lt;br /&gt;
         python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py parse_fda_label_data /home/enkhjargal/PycharmProjects/Excipients/data/drug-label-0005-of-0006.json&lt;br /&gt;
 &lt;br /&gt;
 9. Pull zincids for the populated Substance data (populates the zincid and smiles columns in the substance table)&lt;br /&gt;
      -&amp;gt; python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py parse_fda_label_data&lt;br /&gt;
&lt;br /&gt;
 The DB should now be fully populated.&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Excipient_server_restart&amp;diff=10059</id>
		<title>Excipient server restart</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Excipient_server_restart&amp;diff=10059"/>
		<updated>2017-04-24T22:10:13Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: Enkhjargal moved page Deployment to Excipients&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Excipient Server Restart:]]&lt;br /&gt;
&lt;br /&gt;
  1. ssh gimel&lt;br /&gt;
  2. become www&lt;br /&gt;
  3. activate the production server -&amp;gt;&lt;br /&gt;
     cd /nfs/soft/www/apps/excipients/envs/production/&lt;br /&gt;
     source bin/activate&lt;br /&gt;
  4. gunicorn --access-logfile excipients.ucsf.bkslab.org.acc --max-requests 1000 --bind 10.20.0.31:8093 excipients:app --workers 5 --timeout 1000&lt;br /&gt;
&lt;br /&gt;
[[Excipient Installment:]]&lt;br /&gt;
&lt;br /&gt;
  1. Go to the dev version of Excipients and set the version number in the __init__.py file&lt;br /&gt;
  2. Create the distribution file&lt;br /&gt;
     python setup.py sdist&lt;br /&gt;
  3. ssh gimel&lt;br /&gt;
  4. su - www&lt;br /&gt;
  5. activate the production server -&amp;gt;&lt;br /&gt;
     cd /nfs/soft/www/apps/excipients/envs/production/&lt;br /&gt;
     source bin/activate&lt;br /&gt;
  6. copy over the created dist folder in /nfs/soft/www/apps/excipients/envs/production/ &lt;br /&gt;
  7. Run -&amp;gt; &lt;br /&gt;
     pip install dist/CERSI-Excipients-X.Y.Z.tar.gz&lt;br /&gt;
  8.&lt;br /&gt;
     gunicorn --access-logfile excipients.ucsf.bkslab.org.acc --max-requests 1000 --bind 10.20.0.31:8093 excipients:app --workers 5 --timeout 1000&lt;br /&gt;
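Steps 6-7 above can be sketched as a dry run. The scp form, host name, and version number below are illustrative assumptions, not values taken from this page; the commands are echoed rather than executed so they can be reviewed first:

```shell
# Dry-run sketch of the copy-and-install steps.
# VERSION, HOST, and APPDIR are placeholders (assumptions).
VERSION="1.2.3"
HOST="gimel"
APPDIR="/nfs/soft/www/apps/excipients/envs/production"
# Copy the freshly built dist folder to the production env:
echo "scp -r dist/ ${HOST}:${APPDIR}/"
# Then, inside the activated env on the server:
echo "pip install dist/CERSI-Excipients-${VERSION}.tar.gz"
```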
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
[[HowTos for the brand new version of Excipients]]&lt;br /&gt;
&lt;br /&gt;
 1. Create a copy of existing production database&lt;br /&gt;
&lt;br /&gt;
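One way to make that copy, assuming the database is Postgres (as step 3 suggests) and using hypothetical database names; the commands are printed as a dry run so nothing runs by accident:

```shell
# Dry-run sketch: copy the production DB under a new name.
# SRC_DB and NEW_DB are hypothetical names, not from this page.
SRC_DB="excipients_prod"
NEW_DB="excipients_new"
echo "createdb ${NEW_DB}"
echo "pg_dump ${SRC_DB} --no-owner | psql ${NEW_DB}"
```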
 2. In the flask code, change the database url to the newly created db &lt;br /&gt;
&lt;br /&gt;
 3. Run db upgrade + db migrate to create new tables and new columns for existing tables (done automatically by Flask db-migrate, which maps the model objects to the Postgres db):&lt;br /&gt;
    python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py manage-db upgrade&lt;br /&gt;
    python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py manage-db migrate&lt;br /&gt;
&lt;br /&gt;
 4. Load the openfda regulatory_status data (populates &#039;Status&#039; table in db):&lt;br /&gt;
    -&amp;gt;  create a file named regulatory_status.csv&lt;br /&gt;
        touch regulatory_status.csv&lt;br /&gt;
    -&amp;gt; copy over the regulatory_status definitions from &lt;br /&gt;
    python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py parse_status_definition /home/enkhjargal/PycharmProjects/Excipients/data/regulatory_status.csv&lt;br /&gt;
 &lt;br /&gt;
 5. Load the openfda excipients function data (populates &#039;Function&#039; table in db):&lt;br /&gt;
    python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py parse_function_definition /home/enkhjargal/PycharmProjects/Excipients/data/function_definition.csv&lt;br /&gt;
&lt;br /&gt;
 6. Load the dye relation data. (creates the relation for excipients to functions which are dye):&lt;br /&gt;
    -&amp;gt; https://www.fda.gov/ForIndustry/ColorAdditives/ColorAdditiveInventories/ucm106626.htm&lt;br /&gt;
    -&amp;gt; python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py load_color_additives_relationship /home/enkhjargal/PycharmProjects/Excipients/data/FDA_color_additives_all&lt;br /&gt;
&lt;br /&gt;
 7. Load the FDA general additives data (creates all other function and status relations to excipients):&lt;br /&gt;
    -&amp;gt; create a file named FDA_additives&lt;br /&gt;
       touch FDA_additives&lt;br /&gt;
    -&amp;gt; copy over the additive list from this page (https://www.fda.gov/Food/IngredientsPackagingLabeling/FoodAdditivesIngredients/ucm091048.htm)&lt;br /&gt;
    -&amp;gt; parse the file to load it to db table&lt;br /&gt;
       python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py load_function_and_status_relationship /home/enkhjargal/PycharmProjects/Excipients/data/FDA_additives&lt;br /&gt;
&lt;br /&gt;
 8. Load the openfda drug labels (populates the brand and substance tables and their relations to existing excipients):&lt;br /&gt;
      -&amp;gt; https://open.fda.gov/downloads/ (6 json files to download)&lt;br /&gt;
      -&amp;gt; parse each of them:&lt;br /&gt;
         python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py parse_fda_label_data /home/enkhjargal/PycharmProjects/Excipients/data/drug-label-0005-of-0006.json&lt;br /&gt;
 &lt;br /&gt;
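Assuming the six downloads follow the drug-label-NNNN-of-0006.json naming inferred from the one filename shown above, step 8 can be looped; this prints the six commands rather than executing them:

```shell
# Print the parse command for each of the six label files.
# The 0001..0006 numbering is inferred, not confirmed by this page.
DATA_DIR="/home/enkhjargal/PycharmProjects/Excipients/data"
MANAGE="/home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py"
for n in 0001 0002 0003 0004 0005 0006; do
    echo "python ${MANAGE} parse_fda_label_data ${DATA_DIR}/drug-label-${n}-of-0006.json"
done
```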
 9. Pull ZINC IDs for the populated Substance data (populates the zincid and smiles columns in the substance table)&lt;br /&gt;
      -&amp;gt; python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py parse_fda_label_data&lt;br /&gt;
&lt;br /&gt;
 The DB should now be fully populated.&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Excipient_server_restart&amp;diff=10058</id>
		<title>Excipient server restart</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Excipient_server_restart&amp;diff=10058"/>
		<updated>2017-04-24T22:04:04Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Excipient Server Restart:]]&lt;br /&gt;
&lt;br /&gt;
  1. ssh gimel&lt;br /&gt;
  2. become www&lt;br /&gt;
  3. activate the production server -&amp;gt;&lt;br /&gt;
     cd /nfs/soft/www/apps/excipients/envs/production/&lt;br /&gt;
     source bin/activate&lt;br /&gt;
  4. gunicorn --access-logfile excipients.ucsf.bkslab.org.acc --max-requests 1000 --bind 10.20.0.31:8093 excipients:app --workers 5 --timeout 1000&lt;br /&gt;
&lt;br /&gt;
[[Excipient Installment:]]&lt;br /&gt;
&lt;br /&gt;
  1. Go to the dev version of Excipients and set the version number in the __init__.py file&lt;br /&gt;
  2. Create the distribution file&lt;br /&gt;
     python setup.py sdist&lt;br /&gt;
  3. ssh gimel&lt;br /&gt;
  4. su - www&lt;br /&gt;
  5. activate the production server -&amp;gt;&lt;br /&gt;
     cd /nfs/soft/www/apps/excipients/envs/production/&lt;br /&gt;
     source bin/activate&lt;br /&gt;
  6. copy over the created dist folder in /nfs/soft/www/apps/excipients/envs/production/ &lt;br /&gt;
  7. Run -&amp;gt; &lt;br /&gt;
     pip install dist/CERSI-Excipients-X.Y.Z.tar.gz&lt;br /&gt;
  8.&lt;br /&gt;
     gunicorn --access-logfile excipients.ucsf.bkslab.org.acc --max-requests 1000 --bind 10.20.0.31:8093 excipients:app --workers 5 --timeout 1000&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
[[HowTos for the brand new version of Excipients]]&lt;br /&gt;
&lt;br /&gt;
 1. Create a copy of existing production database&lt;br /&gt;
&lt;br /&gt;
 2. In the flask code, change the database url to the newly created db &lt;br /&gt;
&lt;br /&gt;
 3. Run db upgrade + db migrate to create new tables and new columns for existing tables (done automatically by Flask db-migrate, which maps the model objects to the Postgres db):&lt;br /&gt;
    python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py manage-db upgrade&lt;br /&gt;
    python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py manage-db migrate&lt;br /&gt;
&lt;br /&gt;
 4. Load the openfda regulatory_status data (populates &#039;Status&#039; table in db):&lt;br /&gt;
    -&amp;gt;  create a file named regulatory_status.csv&lt;br /&gt;
        touch regulatory_status.csv&lt;br /&gt;
    -&amp;gt; copy over the regulatory_status definitions from &lt;br /&gt;
    python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py parse_status_definition /home/enkhjargal/PycharmProjects/Excipients/data/regulatory_status.csv&lt;br /&gt;
 &lt;br /&gt;
 5. Load the openfda excipients function data (populates &#039;Function&#039; table in db):&lt;br /&gt;
    python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py parse_function_definition /home/enkhjargal/PycharmProjects/Excipients/data/function_definition.csv&lt;br /&gt;
&lt;br /&gt;
 6. Load the dye relation data. (creates the relation for excipients to functions which are dye):&lt;br /&gt;
    -&amp;gt; https://www.fda.gov/ForIndustry/ColorAdditives/ColorAdditiveInventories/ucm106626.htm&lt;br /&gt;
    -&amp;gt; python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py load_color_additives_relationship /home/enkhjargal/PycharmProjects/Excipients/data/FDA_color_additives_all&lt;br /&gt;
&lt;br /&gt;
 7. Load the FDA general additives data (creates all other function and status relations to excipients):&lt;br /&gt;
    -&amp;gt; create a file named FDA_additives&lt;br /&gt;
       touch FDA_additives&lt;br /&gt;
    -&amp;gt; copy over the additive list from this page (https://www.fda.gov/Food/IngredientsPackagingLabeling/FoodAdditivesIngredients/ucm091048.htm)&lt;br /&gt;
    -&amp;gt; parse the file to load it to db table&lt;br /&gt;
       python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py load_function_and_status_relationship /home/enkhjargal/PycharmProjects/Excipients/data/FDA_additives&lt;br /&gt;
&lt;br /&gt;
 8. Load the openfda drug labels (populates the brand and substance tables and their relations to existing excipients):&lt;br /&gt;
      -&amp;gt; https://open.fda.gov/downloads/ (6 json files to download)&lt;br /&gt;
      -&amp;gt; parse each of them:&lt;br /&gt;
         python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py parse_fda_label_data /home/enkhjargal/PycharmProjects/Excipients/data/drug-label-0005-of-0006.json&lt;br /&gt;
 &lt;br /&gt;
 9. Pull ZINC IDs for the populated Substance data (populates the zincid and smiles columns in the substance table)&lt;br /&gt;
      -&amp;gt; python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py parse_fda_label_data&lt;br /&gt;
&lt;br /&gt;
 The DB should now be fully populated.&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Excipient_server_restart&amp;diff=10057</id>
		<title>Excipient server restart</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Excipient_server_restart&amp;diff=10057"/>
		<updated>2017-04-24T22:03:22Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Excipient Server Restart:]]&lt;br /&gt;
&lt;br /&gt;
  1. ssh gimel&lt;br /&gt;
  2. become www&lt;br /&gt;
  3. activate the production server -&amp;gt;&lt;br /&gt;
     cd /nfs/soft/www/apps/excipients/envs/production/&lt;br /&gt;
     source bin/activate&lt;br /&gt;
  4. gunicorn --access-logfile excipients.ucsf.bkslab.org.acc --max-requests 1000 --bind 10.20.0.31:8093 excipients:app --workers 5 --timeout 1000&lt;br /&gt;
&lt;br /&gt;
[[Excipient Installment:]]&lt;br /&gt;
&lt;br /&gt;
  1. Go to the dev version of Excipients and set the version number in the __init__.py file&lt;br /&gt;
  2. Create the distribution file&lt;br /&gt;
     python setup.py sdist&lt;br /&gt;
  3. ssh gimel&lt;br /&gt;
  4. su - www&lt;br /&gt;
  5. activate the production server -&amp;gt;&lt;br /&gt;
     cd /nfs/soft/www/apps/excipients/envs/production/&lt;br /&gt;
     source bin/activate&lt;br /&gt;
  6. copy over the created dist folder in /nfs/soft/www/apps/excipients/envs/production/ &lt;br /&gt;
  7. Run -&amp;gt; &lt;br /&gt;
     pip install dist/CERSI-Excipients-X.Y.Z.tar.gz&lt;br /&gt;
  8.&lt;br /&gt;
     gunicorn --access-logfile excipients.ucsf.bkslab.org.acc --max-requests 1000 --bind 10.20.0.31:8093 excipients:app --workers 5 --timeout 1000&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
[[HowTos for the brand new version of Excipients]]&lt;br /&gt;
&lt;br /&gt;
 1. Create a copy of existing production database&lt;br /&gt;
&lt;br /&gt;
 2. In the flask code, change the database url to the newly created db &lt;br /&gt;
&lt;br /&gt;
 3. Run db upgrade + db migrate to create new tables and new columns for existing tables (done automatically by Flask db-migrate, which maps the model objects to the Postgres db):&lt;br /&gt;
    python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py manage-db upgrade&lt;br /&gt;
    python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py manage-db migrate&lt;br /&gt;
&lt;br /&gt;
 4. Load the openfda regulatory_status data (populates &#039;Status&#039; table in db):&lt;br /&gt;
    -&amp;gt;  create a file named regulatory_status.csv&lt;br /&gt;
        touch regulatory_status.csv&lt;br /&gt;
    -&amp;gt; copy over the regulatory_status definitions from &lt;br /&gt;
    python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py parse_status_definition /home/enkhjargal/PycharmProjects/Excipients/data/regulatory_status.csv&lt;br /&gt;
 &lt;br /&gt;
 5. Load the openfda excipients function data (populates &#039;Function&#039; table in db):&lt;br /&gt;
    python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py parse_function_definition /home/enkhjargal/PycharmProjects/Excipients/data/function_definition.csv&lt;br /&gt;
&lt;br /&gt;
 6. Load the dye relation data. (creates the relation for excipients to functions which are dye):&lt;br /&gt;
    -&amp;gt; https://www.fda.gov/ForIndustry/ColorAdditives/ColorAdditiveInventories/ucm106626.htm&lt;br /&gt;
    -&amp;gt; python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py load_color_additives_relationship /home/enkhjargal/PycharmProjects/Excipients/data/FDA_color_additives_all&lt;br /&gt;
&lt;br /&gt;
 7. Load the FDA general additives data (creates all other function and status relations to excipients):&lt;br /&gt;
    -&amp;gt; create a file named FDA_additives&lt;br /&gt;
       touch FDA_additives&lt;br /&gt;
    -&amp;gt; copy over the additive list from this page (https://www.fda.gov/Food/IngredientsPackagingLabeling/FoodAdditivesIngredients/ucm091048.htm)&lt;br /&gt;
    -&amp;gt; parse the file to load it to db table&lt;br /&gt;
       python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py load_function_and_status_relationship /home/enkhjargal/PycharmProjects/Excipients/data/FDA_additives&lt;br /&gt;
&lt;br /&gt;
 8. Load the openfda drug labels (populates the brand and substance tables and their relations to existing excipients):&lt;br /&gt;
      -&amp;gt; https://open.fda.gov/downloads/ (6 json files to download)&lt;br /&gt;
      -&amp;gt; parse each of them:&lt;br /&gt;
         python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py parse_fda_label_data /home/enkhjargal/PycharmProjects/Excipients/data/drug-label-0005-of-0006.json&lt;br /&gt;
 &lt;br /&gt;
 9. Pull ZINC IDs for the populated Substance data (populates the zincid and smiles columns in the substance table)&lt;br /&gt;
      -&amp;gt; python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py parse_fda_label_data&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Excipient_server_restart&amp;diff=10056</id>
		<title>Excipient server restart</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Excipient_server_restart&amp;diff=10056"/>
		<updated>2017-04-24T21:53:48Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Excipient Server Restart:]]&lt;br /&gt;
&lt;br /&gt;
  1. ssh gimel&lt;br /&gt;
  2. become www&lt;br /&gt;
  3. activate the production server -&amp;gt;&lt;br /&gt;
     cd /nfs/soft/www/apps/excipients/envs/production/&lt;br /&gt;
     source bin/activate&lt;br /&gt;
  4. gunicorn --access-logfile excipients.ucsf.bkslab.org.acc --max-requests 1000 --bind 10.20.0.31:8093 excipients:app --workers 5 --timeout 1000&lt;br /&gt;
&lt;br /&gt;
[[Excipient Installment:]]&lt;br /&gt;
&lt;br /&gt;
  1. Go to the dev version of Excipients and set the version number in the __init__.py file&lt;br /&gt;
  2. Create the distribution file&lt;br /&gt;
     python setup.py sdist&lt;br /&gt;
  3. ssh gimel&lt;br /&gt;
  4. su - www&lt;br /&gt;
  5. activate the production server -&amp;gt;&lt;br /&gt;
     cd /nfs/soft/www/apps/excipients/envs/production/&lt;br /&gt;
     source bin/activate&lt;br /&gt;
  6. copy over the created dist folder in /nfs/soft/www/apps/excipients/envs/production/ &lt;br /&gt;
  7. Run -&amp;gt; &lt;br /&gt;
     pip install dist/CERSI-Excipients-X.Y.Z.tar.gz&lt;br /&gt;
  8.&lt;br /&gt;
     gunicorn --access-logfile excipients.ucsf.bkslab.org.acc --max-requests 1000 --bind 10.20.0.31:8093 excipients:app --workers 5 --timeout 1000&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
[[HowTos for the brand new version of Excipients]]&lt;br /&gt;
&lt;br /&gt;
 1. Create a copy of existing production database&lt;br /&gt;
 2. In the flask code, change the database url to the newly created db &lt;br /&gt;
 3. Run db upgrade + db migrate to create new tables and new columns for existing tables (done automatically by Flask db-migrate, which maps the model objects to the Postgres db):&lt;br /&gt;
    python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py manage-db upgrade&lt;br /&gt;
    python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py manage-db migrate&lt;br /&gt;
 4. Load the openfda regulatory_status data (creates &#039;Status&#039; table in db)&lt;br /&gt;
    -&amp;gt;  create a file named regulatory_status.csv&lt;br /&gt;
        touch regulatory_status.csv&lt;br /&gt;
    -&amp;gt; copy over the regulatory_status definitions from &lt;br /&gt;
    python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py parse_status_definition /home/enkhjargal/PycharmProjects/Excipients/data/regulatory_status.csv&lt;br /&gt;
 5. Load the openfda excipients function data (creates &#039;Function&#039; table in db)&lt;br /&gt;
    python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py parse_function_definition /home/enkhjargal/PycharmProjects/Excipients/data/function_definition.csv&lt;br /&gt;
 6. Load the dye relation data:&lt;br /&gt;
    -&amp;gt; https://www.fda.gov/ForIndustry/ColorAdditives/ColorAdditiveInventories/ucm106626.htm&lt;br /&gt;
    -&amp;gt; python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py load_color_additives_relationship /home/enkhjargal/PycharmProjects/Excipients/data/FDA_color_additives_all&lt;br /&gt;
 7. Load the fda general additives data&lt;br /&gt;
    -&amp;gt; create a file named FDA_additives&lt;br /&gt;
       touch FDA_additives&lt;br /&gt;
    -&amp;gt; copy over the additive list from this page (https://www.fda.gov/Food/IngredientsPackagingLabeling/FoodAdditivesIngredients/ucm091048.htm)&lt;br /&gt;
    -&amp;gt; parse the file to load it to db table&lt;br /&gt;
       python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py load_function_and_status_relationship /home/enkhjargal/PycharmProjects/Excipients/data/FDA_additives&lt;br /&gt;
 8. Load the openfda drug label:&lt;br /&gt;
      -&amp;gt; https://open.fda.gov/downloads/ (6 json files to download)&lt;br /&gt;
      -&amp;gt; parse each of them:&lt;br /&gt;
         python /home/enkhjargal/PycharmProjects/Excipients/manage-excipients.py parse_fda_label_data /home/enkhjargal/PycharmProjects/Excipients/data/drug-label-0005-of-0006.json&lt;br /&gt;
 9.&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Excipient_server_restart&amp;diff=10055</id>
		<title>Excipient server restart</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Excipient_server_restart&amp;diff=10055"/>
		<updated>2017-04-24T20:35:19Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Excipient Server Restart:]]&lt;br /&gt;
&lt;br /&gt;
  1. ssh gimel&lt;br /&gt;
  2. become www&lt;br /&gt;
  3. activate the production server -&amp;gt;&lt;br /&gt;
     cd /nfs/soft/www/apps/excipients/envs/production/&lt;br /&gt;
     source bin/activate&lt;br /&gt;
  4. gunicorn --access-logfile excipients.ucsf.bkslab.org.acc --max-requests 1000 --bind 10.20.0.31:8093 excipients:app --workers 5 --timeout 1000&lt;br /&gt;
&lt;br /&gt;
[[Excipient Installment:]]&lt;br /&gt;
&lt;br /&gt;
  1. Go to the dev version of Excipients and set the version number in the __init__.py file&lt;br /&gt;
  2. Create the distribution file&lt;br /&gt;
     python setup.py sdist&lt;br /&gt;
  3. ssh gimel&lt;br /&gt;
  4. su - www&lt;br /&gt;
  5. activate the production server -&amp;gt;&lt;br /&gt;
     cd /nfs/soft/www/apps/excipients/envs/production/&lt;br /&gt;
     source bin/activate&lt;br /&gt;
  6. copy over the created dist folder in /nfs/soft/www/apps/excipients/envs/production/ &lt;br /&gt;
  7. Run -&amp;gt; &lt;br /&gt;
     pip install dist/CERSI-Excipients-X.Y.Z.tar.gz&lt;br /&gt;
  8.&lt;br /&gt;
     gunicorn --access-logfile excipients.ucsf.bkslab.org.acc --max-requests 1000 --bind 10.20.0.31:8093 excipients:app --workers 5 --timeout 1000&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
 [[HowTos for the brand new version of Excipients]]&lt;br /&gt;
&lt;br /&gt;
 1. How to load openfda data to the database:&lt;br /&gt;
&lt;br /&gt;
1.&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Excipient_server_restart&amp;diff=10054</id>
		<title>Excipient server restart</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Excipient_server_restart&amp;diff=10054"/>
		<updated>2017-04-24T20:32:39Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Excipient Server Restart:&lt;br /&gt;
&lt;br /&gt;
  1. ssh gimel&lt;br /&gt;
  2. become www&lt;br /&gt;
  3. activate the production server -&amp;gt;&lt;br /&gt;
     cd /nfs/soft/www/apps/excipients/envs/production/&lt;br /&gt;
     source bin/activate&lt;br /&gt;
  4. gunicorn --access-logfile excipients.ucsf.bkslab.org.acc --max-requests 1000 --bind 10.20.0.31:8093 excipients:app --workers 5 --timeout 1000&lt;br /&gt;
&lt;br /&gt;
[[Excipient Installment:]]&lt;br /&gt;
&lt;br /&gt;
  1. Go to the dev version of Excipients and set the version number in the __init__.py file&lt;br /&gt;
  2. Create the distribution file&lt;br /&gt;
     python setup.py sdist&lt;br /&gt;
  3. ssh gimel&lt;br /&gt;
  4. su - www&lt;br /&gt;
  5. activate the production server -&amp;gt;&lt;br /&gt;
     cd /nfs/soft/www/apps/excipients/envs/production/&lt;br /&gt;
     source bin/activate&lt;br /&gt;
  6. copy over the created dist folder in /nfs/soft/www/apps/excipients/envs/production/ &lt;br /&gt;
  7. Run -&amp;gt; &lt;br /&gt;
     pip install dist/CERSI-Excipients-X.Y.Z.tar.gz&lt;br /&gt;
  8.&lt;br /&gt;
     gunicorn --access-logfile excipients.ucsf.bkslab.org.acc --max-requests 1000 --bind 10.20.0.31:8093 excipients:app --workers 5 --timeout 1000&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Excipient_server_restart&amp;diff=10053</id>
		<title>Excipient server restart</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Excipient_server_restart&amp;diff=10053"/>
		<updated>2017-04-24T20:29:49Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Excipient Server Restart:&lt;br /&gt;
&lt;br /&gt;
  1. ssh gimel&lt;br /&gt;
  2. become www&lt;br /&gt;
  3. activate the production server -&amp;gt;&lt;br /&gt;
     cd /nfs/soft/www/apps/excipients/envs/production/&lt;br /&gt;
     source bin/activate&lt;br /&gt;
  4. gunicorn --access-logfile excipients.ucsf.bkslab.org.acc --max-requests 1000 --bind 10.20.0.31:8093 excipients:app --workers 5 --timeout 1000&lt;br /&gt;
&lt;br /&gt;
[[Excipient Installment:]]&lt;br /&gt;
&lt;br /&gt;
  1. Go to the dev version of Excipients and set the version number in the __init__.py file&lt;br /&gt;
&lt;br /&gt;
  2. Create the distribution file&lt;br /&gt;
     python setup.py sdist&lt;br /&gt;
&lt;br /&gt;
  3. ssh gimel&lt;br /&gt;
&lt;br /&gt;
  4. su - www&lt;br /&gt;
&lt;br /&gt;
  5. activate the production server -&amp;gt;&lt;br /&gt;
     cd /nfs/soft/www/apps/excipients/envs/production/&lt;br /&gt;
      source bin/activate&lt;br /&gt;
&lt;br /&gt;
  6. copy over the created dist folder in /nfs/soft/www/apps/excipients/envs/production/ &lt;br /&gt;
&lt;br /&gt;
  7. Run -&amp;gt; &lt;br /&gt;
     pip install dist/CERSI-Excipients-X.Y.Z.tar.gz&lt;br /&gt;
&lt;br /&gt;
  8.&lt;br /&gt;
     gunicorn --access-logfile excipients.ucsf.bkslab.org.acc --max-requests 1000 --bind 10.20.0.31:8093 excipients:app --workers 5 --timeout 1000&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Sea16_restart&amp;diff=10048</id>
		<title>Sea16 restart</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Sea16_restart&amp;diff=10048"/>
		<updated>2017-04-24T16:35:10Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
# restart conda_sea16 server&lt;br /&gt;
&lt;br /&gt;
  ssh s_enkhee@n-1-110&lt;br /&gt;
  cd /nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/src/seaware-academic/&lt;br /&gt;
  export PATH=/nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/bin:$PATH&lt;br /&gt;
  export PATH=/nfs/home/momeara/opt/bin:$PATH&lt;br /&gt;
  source activate sea16 &lt;br /&gt;
  cd SEAserver&lt;br /&gt;
  sh scripts/run-sea-server.sh&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
# update conda_sea16&lt;br /&gt;
&lt;br /&gt;
  ssh s_enkhee@gimel&lt;br /&gt;
  cd /nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/src/seaware-academic/&lt;br /&gt;
  export PATH=/nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/bin:$PATH&lt;br /&gt;
  export PATH=/nfs/home/momeara/opt/bin:$PATH&lt;br /&gt;
  source activate sea16&lt;br /&gt;
  git pull&lt;br /&gt;
  make clean&lt;br /&gt;
  delete sea related libraries from your site-packages folder under the conda env (rm -rf /nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/lib/python2.7/site-packages/seaserver)&lt;br /&gt;
  make all&lt;br /&gt;
  kill all sea-server related processes e.g. from htop&lt;br /&gt;
  make SEAserver-start&lt;br /&gt;
 &lt;br /&gt;
# run the test to check:&lt;br /&gt;
     -&amp;gt; export SEA_APP_ROOT=$CONDA_PREFIX/var/seaserver&lt;br /&gt;
     -&amp;gt; export SEA_RUN_FOLDER=$SEA_APP_ROOT/run&lt;br /&gt;
     -&amp;gt; export SEA_DATA_FOLDER=$SEA_APP_ROOT/data&lt;br /&gt;
     -&amp;gt;  python -m unittest test.test_illustrate&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
# update sea16&lt;br /&gt;
ssh xyz@gimel&lt;br /&gt;
cd /nfs/soft/www/apps/sea/sea16/src/seaware-academic&lt;br /&gt;
source ../../env.csh&lt;br /&gt;
git pull&lt;br /&gt;
rm -rf /nfs/soft/www/apps/sea/sea16/lib/python2.7/site-packages/seaserver&lt;br /&gt;
cd /nfs/soft/www/apps/sea/sea16/src/seaware-academic/SEAserver&lt;br /&gt;
python setup.py install&lt;br /&gt;
&lt;br /&gt;
# restart server&lt;br /&gt;
&lt;br /&gt;
ssh www@n-1-110&lt;br /&gt;
cd /nfs/soft/www/apps/sea/sea16/src/seaware-academic&lt;br /&gt;
source ../../env.csh&lt;br /&gt;
cd SEAserver&lt;br /&gt;
sh scripts/run-sea-server.sh&lt;br /&gt;
&lt;br /&gt;
# restart server&lt;br /&gt;
&lt;br /&gt;
ssh &amp;lt;superuser&amp;gt;@n-1-110&lt;br /&gt;
sudo -i&lt;br /&gt;
screen -r&lt;br /&gt;
screen -dR Sea (switch to sea screen)&lt;br /&gt;
sh scripts/run-sea-server.sh&lt;br /&gt;
&lt;br /&gt;
# how to save the old queue data&lt;br /&gt;
 &lt;br /&gt;
cd /nfs/soft/www/apps/sea/sea16/var/seaserver/queue &lt;br /&gt;
mv jobs jobs.save&lt;br /&gt;
mv tasks.sqlite tasks.sqlite.save &lt;br /&gt;
restart sea server on n-1-110.  &lt;br /&gt;
&lt;br /&gt;
(basically, it had too much history and that was what was slowing it down)&lt;br /&gt;
(do on the first day of the month and rename the old one to a month version)&lt;br /&gt;
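The first-of-the-month rotation above can be sketched with a dated suffix in place of the bare .save; the commands are echoed as a dry run so they can be checked before use:

```shell
# Dry-run sketch of monthly queue rotation with a YYYY-MM suffix.
# QUEUE_DIR is the path given above; the suffix scheme is a suggestion.
QUEUE_DIR="/nfs/soft/www/apps/sea/sea16/var/seaserver/queue"
MONTH=$(date +%Y-%m)
echo "mv ${QUEUE_DIR}/jobs ${QUEUE_DIR}/jobs.${MONTH}"
echo "mv ${QUEUE_DIR}/tasks.sqlite ${QUEUE_DIR}/tasks.sqlite.${MONTH}"
# then restart the sea server on n-1-110
```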
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Category:Curator]]&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Ligand_preparation_-_20170424&amp;diff=10004</id>
		<title>Ligand preparation - 20170424</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Ligand_preparation_-_20170424&amp;diff=10004"/>
		<updated>2017-04-10T23:01:06Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== 3 ways to build ligands: ==&lt;br /&gt;
&lt;br /&gt;
1. Source the environment and run the stand-alone script:&lt;br /&gt;
  &lt;br /&gt;
   source /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.sh&lt;br /&gt;
&lt;br /&gt;
   sh $DOCKBASE/ligand/generate/build_database_ligand.sh mysmiles.smi&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
2. Source the environment and submit jobs:&lt;br /&gt;
&lt;br /&gt;
   a. put these commands in cmd file:&lt;br /&gt;
&lt;br /&gt;
      setenv DOCKBASE /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk &lt;br /&gt;
&lt;br /&gt;
      setenv BUILD_ENVIRONMENT /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.sh&lt;br /&gt;
&lt;br /&gt;
      /nfs/soft/tools/utils/qsub-slice/qsub-mr-meta -tc 10 -L 100000 --map-instance-script \&lt;br /&gt;
      &amp;quot;/nfs/scratch/A/xyz/protomer/qsub-mr-map.sh&amp;quot; -s $BUILD_ENVIRONMENT \&lt;br /&gt;
      -l 100 $1 $DOCKBASE/ligand/generate/build_database_ligand.sh&lt;br /&gt;
&lt;br /&gt;
   b. run the script:&lt;br /&gt;
&lt;br /&gt;
      csh cmd mysmiles.smi&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
3. &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== set up environment: jchem mitools, DOCK3.7, ZINC, corina == &lt;br /&gt;
&lt;br /&gt;
 source /nfs/soft/jchem/current/env.csh&lt;br /&gt;
 source /nfs/soft/mitools/env.csh&lt;br /&gt;
 source /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.csh&lt;br /&gt;
 deactivate&lt;br /&gt;
 source /nfs/soft/www/apps/zinc15/envs/edge/env.csh&lt;br /&gt;
 source /nfs/soft/corina/current/env.csh&lt;br /&gt;
 setenv EMBED_PROTOMERS_3D_EXE $DOCKBASE/ligand/3D/embed3d_corina.sh &lt;br /&gt;
 setenv ZINC_CONFIG_ENV admin&lt;br /&gt;
 setenv ZINC_CONFIG_SETUP_SKIP blueprints &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== step 2. csh cmd mysmiles.ism == &lt;br /&gt;
&lt;br /&gt;
 # put any change in the BUILD_ENVIRONMENT&lt;br /&gt;
 source /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.csh&lt;br /&gt;
 source /nfs/soft/corina/current/env.csh&lt;br /&gt;
 setenv EMBED_PROTOMERS_3D_EXE $DOCKBASE/ligand/3D/embed3d_corina.sh&lt;br /&gt;
 setenv BUILD_ENVIRONMENT /nfs/home/xyz/bin/dockenvNS.sh&lt;br /&gt;
 #&lt;br /&gt;
 /nfs/soft/tools/utils/qsub-slice/qsub-mr-meta -tc 10 -L 100000 --map-instance-script \&lt;br /&gt;
 &amp;quot;/nfs/scratch/A/xyz/protomer/qsub-mr-map.sh&amp;quot; -s $BUILD_ENVIRONMENT \&lt;br /&gt;
 -l 300 $1 $DOCKBASE/ligand/generate/build_database_ligand.sh&lt;br /&gt;
&lt;br /&gt;
If you have any trouble using this procedure (the one used to build ZINC between July 14, 2016 and Nov 14, 2016, and ongoing), &lt;br /&gt;
please contact me. I would be happy to help debug, as it will help us to make this procedure more robust.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Curator]]&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Ligand_preparation_-_20170424&amp;diff=10003</id>
		<title>Ligand preparation - 20170424</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Ligand_preparation_-_20170424&amp;diff=10003"/>
		<updated>2017-04-10T22:54:10Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== 3 ways to build ligands: ==&lt;br /&gt;
&lt;br /&gt;
1. Source the environment and run the stand-alone script:&lt;br /&gt;
  &lt;br /&gt;
   source /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.sh&lt;br /&gt;
&lt;br /&gt;
   $DOCKBASE/ligand/generate/build_database_ligand.sh mysmiles.smi&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
2. Source the environment and submit jobs:&lt;br /&gt;
&lt;br /&gt;
   a. put these commands in cmd file:&lt;br /&gt;
&lt;br /&gt;
      setenv DOCKBASE /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk &lt;br /&gt;
      setenv BUILD_ENVIRONMENT /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.sh&lt;br /&gt;
&lt;br /&gt;
      /nfs/soft/tools/utils/qsub-slice/qsub-mr-meta -tc 10 -L 100000 --map-instance-script \&lt;br /&gt;
      &amp;quot;/nfs/scratch/A/xyz/protomer/qsub-mr-map.sh&amp;quot; -s $BUILD_ENVIRONMENT \&lt;br /&gt;
      -l 100 $1 $DOCKBASE/ligand/generate/build_database_ligand.sh&lt;br /&gt;
&lt;br /&gt;
   b. run the script:&lt;br /&gt;
&lt;br /&gt;
      csh cmd mysmiles.smi&lt;br /&gt;
&lt;br /&gt;
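For a quick smoke test of either route, you can first generate a tiny SMILES file (the molecule names below are illustrative placeholders, not real ZINC IDs):&lt;br /&gt;

```shell
# Create a two-molecule SMILES file to exercise the build pipeline.
# Assumed format: one molecule per line, SMILES string then an identifier.
printf 'CCO test_ethanol\nc1ccccc1 test_benzene\n' > mysmiles.smi
wc -l mysmiles.smi
```

Running the stand-alone script on a file this small finishes quickly and confirms the environment is set up before submitting large cluster jobs.&lt;br /&gt;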
&lt;br /&gt;
3. &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== set up environment: jchem mitools, DOCK3.7, ZINC, corina == &lt;br /&gt;
&lt;br /&gt;
 source /nfs/soft/jchem/current/env.csh&lt;br /&gt;
 source /nfs/soft/mitools/env.csh&lt;br /&gt;
 source /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.csh&lt;br /&gt;
 deactivate&lt;br /&gt;
 source /nfs/soft/www/apps/zinc15/envs/edge/env.csh&lt;br /&gt;
 source /nfs/soft/corina/current/env.csh&lt;br /&gt;
 setenv EMBED_PROTOMERS_3D_EXE $DOCKBASE/ligand/3D/embed3d_corina.sh &lt;br /&gt;
 setenv ZINC_CONFIG_ENV admin&lt;br /&gt;
 setenv ZINC_CONFIG_SETUP_SKIP blueprints &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== step 2. csh cmd mysmiles.ism == &lt;br /&gt;
&lt;br /&gt;
 # put any change in the BUILD_ENVIRONMENT&lt;br /&gt;
 source /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.csh&lt;br /&gt;
 source /nfs/soft/corina/current/env.csh&lt;br /&gt;
 setenv EMBED_PROTOMERS_3D_EXE $DOCKBASE/ligand/3D/embed3d_corina.sh&lt;br /&gt;
 setenv BUILD_ENVIRONMENT /nfs/home/xyz/bin/dockenvNS.sh&lt;br /&gt;
 #&lt;br /&gt;
 /nfs/soft/tools/utils/qsub-slice/qsub-mr-meta -tc 10 -L 100000 --map-instance-script \&lt;br /&gt;
 &amp;quot;/nfs/scratch/A/xyz/protomer/qsub-mr-map.sh&amp;quot; -s $BUILD_ENVIRONMENT \&lt;br /&gt;
 -l 300 $1 $DOCKBASE/ligand/generate/build_database_ligand.sh&lt;br /&gt;
&lt;br /&gt;
If you have any trouble using this procedure (the one used to build ZINC from July 14, 2016 to Nov 14, 2016, and ongoing), &lt;br /&gt;
please contact me. I would be happy to help debug, as it will help us make this procedure more robust.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Curator]]&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Ligand_preparation_-_20170424&amp;diff=10002</id>
		<title>Ligand preparation - 20170424</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Ligand_preparation_-_20170424&amp;diff=10002"/>
		<updated>2017-04-10T22:41:00Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== 3 ways to build ligands: ==&lt;br /&gt;
&lt;br /&gt;
1. Run the stand-alone script:&lt;br /&gt;
  &lt;br /&gt;
   source /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.sh&lt;br /&gt;
   $DOCKBASE/ligand/generate/build_database_ligand.sh&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
2. Submit jobs:&lt;br /&gt;
&lt;br /&gt;
   setenv DOCKBASE /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk &lt;br /&gt;
   setenv BUILD_ENVIRONMENT /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.sh&lt;br /&gt;
&lt;br /&gt;
   /nfs/soft/tools/utils/qsub-slice/qsub-mr-meta -tc 10 -L 100000 --map-instance-script \&lt;br /&gt;
   &amp;quot;/nfs/scratch/A/xyz/protomer/qsub-mr-map.sh&amp;quot; -s $BUILD_ENVIRONMENT \&lt;br /&gt;
   -l 100 $1 $DOCKBASE/ligand/generate/build_database_ligand.sh&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== set up environment: jchem mitools, DOCK3.7, ZINC, corina == &lt;br /&gt;
&lt;br /&gt;
 source /nfs/soft/jchem/current/env.csh&lt;br /&gt;
 source /nfs/soft/mitools/env.csh&lt;br /&gt;
 source /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.csh&lt;br /&gt;
 deactivate&lt;br /&gt;
 source /nfs/soft/www/apps/zinc15/envs/edge/env.csh&lt;br /&gt;
 source /nfs/soft/corina/current/env.csh&lt;br /&gt;
 setenv EMBED_PROTOMERS_3D_EXE $DOCKBASE/ligand/3D/embed3d_corina.sh &lt;br /&gt;
 setenv ZINC_CONFIG_ENV admin&lt;br /&gt;
 setenv ZINC_CONFIG_SETUP_SKIP blueprints &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== step 2. csh cmd mysmiles.ism == &lt;br /&gt;
&lt;br /&gt;
 # put any change in the BUILD_ENVIRONMENT&lt;br /&gt;
 source /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.csh&lt;br /&gt;
 source /nfs/soft/corina/current/env.csh&lt;br /&gt;
 setenv EMBED_PROTOMERS_3D_EXE $DOCKBASE/ligand/3D/embed3d_corina.sh&lt;br /&gt;
 setenv BUILD_ENVIRONMENT /nfs/home/xyz/bin/dockenvNS.sh&lt;br /&gt;
 #&lt;br /&gt;
 /nfs/soft/tools/utils/qsub-slice/qsub-mr-meta -tc 10 -L 100000 --map-instance-script \&lt;br /&gt;
 &amp;quot;/nfs/scratch/A/xyz/protomer/qsub-mr-map.sh&amp;quot; -s $BUILD_ENVIRONMENT \&lt;br /&gt;
 -l 300 $1 $DOCKBASE/ligand/generate/build_database_ligand.sh&lt;br /&gt;
&lt;br /&gt;
If you have any trouble using this procedure (the one used to build ZINC from July 14, 2016 to Nov 14, 2016, and ongoing), &lt;br /&gt;
please contact me. I would be happy to help debug, as it will help us make this procedure more robust.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Curator]]&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Ligand_preparation_-_20170424&amp;diff=10001</id>
		<title>Ligand preparation - 20170424</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Ligand_preparation_-_20170424&amp;diff=10001"/>
		<updated>2017-04-10T21:47:06Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: cd ..&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== 3 ways to build ligands: ==&lt;br /&gt;
&lt;br /&gt;
1. Run the stand-alone script:&lt;br /&gt;
  &lt;br /&gt;
   source /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.sh&lt;br /&gt;
   $DOCKBASE/ligand/generate/build_database_ligand.sh&lt;br /&gt;
&lt;br /&gt;
2. Submit jobs:&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== set up environment: jchem mitools, DOCK3.7, ZINC, corina == &lt;br /&gt;
&lt;br /&gt;
 source /nfs/soft/jchem/current/env.csh&lt;br /&gt;
 source /nfs/soft/mitools/env.csh&lt;br /&gt;
 source /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.csh&lt;br /&gt;
 deactivate&lt;br /&gt;
 source /nfs/soft/www/apps/zinc15/envs/edge/env.csh&lt;br /&gt;
 source /nfs/soft/corina/current/env.csh&lt;br /&gt;
 setenv EMBED_PROTOMERS_3D_EXE $DOCKBASE/ligand/3D/embed3d_corina.sh &lt;br /&gt;
 setenv ZINC_CONFIG_ENV admin&lt;br /&gt;
 setenv ZINC_CONFIG_SETUP_SKIP blueprints &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== step 2. csh cmd mysmiles.ism == &lt;br /&gt;
&lt;br /&gt;
 # put any change in the BUILD_ENVIRONMENT&lt;br /&gt;
 source /nfs/soft/dock/versions/dock37/DOCK-3.7-trunk/env.csh&lt;br /&gt;
 source /nfs/soft/corina/current/env.csh&lt;br /&gt;
 setenv EMBED_PROTOMERS_3D_EXE $DOCKBASE/ligand/3D/embed3d_corina.sh&lt;br /&gt;
 setenv BUILD_ENVIRONMENT /nfs/home/xyz/bin/dockenvNS.sh&lt;br /&gt;
 #&lt;br /&gt;
 /nfs/soft/tools/utils/qsub-slice/qsub-mr-meta -tc 10 -L 100000 --map-instance-script \&lt;br /&gt;
 &amp;quot;/nfs/scratch/A/xyz/protomer/qsub-mr-map.sh&amp;quot; -s $BUILD_ENVIRONMENT \&lt;br /&gt;
 -l 300 $1 $DOCKBASE/ligand/generate/build_database_ligand.sh&lt;br /&gt;
&lt;br /&gt;
If you have any trouble using this procedure (the one used to build ZINC from July 14, 2016 to Nov 14, 2016, and ongoing), &lt;br /&gt;
please contact me. I would be happy to help debug, as it will help us make this procedure more robust.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Curator]]&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Updating_Rdkit&amp;diff=9999</id>
		<title>Updating Rdkit</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Updating_Rdkit&amp;diff=9999"/>
		<updated>2017-04-10T18:34:14Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Updating RDKIT:&lt;br /&gt;
&lt;br /&gt;
1. Download RDKit Tarball&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
2. Create a python virtual env:&lt;br /&gt;
&lt;br /&gt;
    /nfs/soft/python/install/scripts/create-virtualenv.sh -s /nfs/soft/python/versions/python-2.7.11 /nfs/soft/python/envs/rdkit/python-2.7.11-rdkit-2016_03_01&lt;br /&gt;
&lt;br /&gt;
3. Install the new version of RDKIT in the python env:&lt;br /&gt;
&lt;br /&gt;
   /nfs/soft/python/install/scripts/install-rdkit.sh /nfs/soft/python/install/extra/RDKit_2016_03_1.tgz /nfs/soft/python/envs/rdkit/python-2.7.11-rdkit-2016_03_01&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
If you need to update Boost:&lt;br /&gt;
&lt;br /&gt;
1. Download the new version (http://www.boost.org/)&lt;br /&gt;
&lt;br /&gt;
2. Extract the tar file and put it in /usr/local/src&lt;br /&gt;
&lt;br /&gt;
3. Prepare Boost for building the libraries:&lt;br /&gt;
   ./bootstrap.sh --with-python=/nfs/soft/python/versions/python-2.7.11/bin/python&lt;br /&gt;
&lt;br /&gt;
4. Build and install Boost:&lt;br /&gt;
   ./b2 --prefix=path_to_python_virtual_env/local/lib&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Notes =  See [[RDKit]]&lt;br /&gt;
&lt;br /&gt;
Each version is now installed in $PYTHON_ROOT/local/rdkit-VERSION. The current version is now a symlink named $PYTHON_ROOT/local/rdkit.&lt;br /&gt;
&lt;br /&gt;
It&#039;s worth noting how this works. There is a directory in $PYTHON_ROOT named $PYTHON_ROOT/local. In addition to containing the build of RDKit as described above ($PYTHON_ROOT/local/rdkit), it also contains $PYTHON_ROOT/lib, which is simply a directory of symlinks to important shared libraries. &lt;br /&gt;
&lt;br /&gt;
When compiling Python, set LD_RUN_PATH to $ORIGIN/../lib:$ORIGIN/../local/lib so that it always looks in those two directories ($ORIGIN expands to the directory containing the python binary). This way python can find important shared libraries, such as RDKit, modeller, etc. When we create a new virtualenv, we create a symlink to the original local directory in the virtualenv, so that we can always find the right shared libraries.&lt;br /&gt;
&lt;br /&gt;
To make sure we always have the right python package installed there is a file created in:&lt;br /&gt;
$PYTHON_ROOT/lib/python2.7/site-packages/rdkit.pth&lt;br /&gt;
In Python package resolution, .pth files allow you to specify an additional location to look for a module. This file includes one line:&lt;br /&gt;
../../../local/rdkit&lt;br /&gt;
This line looks for the rdkit package in the (symlink to the) current version of RDKit. Note that the RDKit python package has the path:&lt;br /&gt;
$PYTHON_ROOT/local/rdkit/rdkit (two rdkits)&lt;br /&gt;
&lt;br /&gt;
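The .pth mechanism described above can be demonstrated with only the standard library; the directory names here are stand-ins for the real $PYTHON_ROOT layout:&lt;br /&gt;

```python
# Sketch of the rdkit.pth trick: a .pth file in a site directory whose one
# line is a relative path pointing back into a "local" package directory.
import os
import site
import tempfile

base = tempfile.mkdtemp()

# Stand-in for $PYTHON_ROOT/local/rdkit (the package lives here).
pkg_dir = os.path.join(base, "local", "mypkg")
os.makedirs(pkg_dir)
with open(os.path.join(pkg_dir, "hello.py"), "w") as f:
    f.write("GREETING = 'hi'\n")

# Stand-in for $PYTHON_ROOT/lib/python2.7/site-packages.
site_dir = os.path.join(base, "site-packages")
os.makedirs(site_dir)
with open(os.path.join(site_dir, "mypkg.pth"), "w") as f:
    f.write(os.path.join("..", "local", "mypkg") + "\n")

# addsitedir() processes .pth files the same way site-packages does:
# each line is joined to the site dir and appended to sys.path if it exists.
site.addsitedir(site_dir)
import hello
print(hello.GREETING)
```

This mirrors how a one-line relative path in rdkit.pth makes whichever versioned build the local/rdkit symlink points at importable, without reinstalling anything.&lt;br /&gt;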
PTH files notes:&lt;br /&gt;
There are actually two other pth files I have created: modeller.pth and hask.pth&lt;br /&gt;
It would probably be best to simply create one pth file to handle these packages stored in the &amp;quot;local&amp;quot; directory:&lt;br /&gt;
Create a directory called local/packages or something and put symlinks to rdkit, haks, modeller, etc. in there. Then create a single pth file with the line:&lt;br /&gt;
../../../local/packages&lt;br /&gt;
&lt;br /&gt;
= Install procedure =&lt;br /&gt;
* 0) Have a Python installation you wish to install for (This should be the base installation for virtualenvs)&lt;br /&gt;
* 1) Download RDKit Tarball&lt;br /&gt;
* 2) Extract RDKit Tarball: $RDKIT_SRC&lt;br /&gt;
* 3) Download INCHI support: cd $RDKIT_SRC/External/INCHI-API; ./download-inchi.sh&lt;br /&gt;
* 4) Create build directory: cd $RDKIT_SRC; mkdir build; cd build&lt;br /&gt;
* 5) Configure with cmake (This is a long command! and note the version specific directories/files)&lt;br /&gt;
&lt;br /&gt;
 cmake \&lt;br /&gt;
 -DPYTHON_EXECUTABLE=$PYTHON_PREFIX/bin/python \&lt;br /&gt;
 -DPYTHON_INCLUDE_PATH=$PYTHON_PREFIX/include/python2.7 \&lt;br /&gt;
 -DPYTHON_LIBRARY=$PYTHON_PREFIX/lib/libpython2.7.so \&lt;br /&gt;
 -DPYTHON_NUMPY_INCLUDE_PATH=$PYTHON_PREFIX/lib/python2.7/site-packages/numpy/core/include \&lt;br /&gt;
 -DRDK_BUILD_INCHI_SUPPORT=ON \&lt;br /&gt;
 -DINCHI_LIBRARY=../External/INCHI-API \&lt;br /&gt;
 -DINCHI_INCLUDE_DIR=../External/INCHI-API/src \&lt;br /&gt;
 -DCMAKE_INSTALL_PREFIX=$PYTHON_PREFIX/local/rdkit-2013.09 \&lt;br /&gt;
&lt;br /&gt;
* 6) Build RDKit: make -j4 (build in parallel) OR make (build serial)&lt;br /&gt;
* 7) Wait for a while&lt;br /&gt;
* 8) Install RDKit: make install&lt;br /&gt;
* 9) Run python install: cd $RDKIT_SRC; $PYTHON_ROOT/bin/python setup.py install&lt;br /&gt;
* 10) OK so the install doesn&#039;t actually work: do it manually: &lt;br /&gt;
&lt;br /&gt;
 cd $RDKIT_SRC&lt;br /&gt;
 mkdir $PYTHON_PREFIX/local/rdkit-2013.09&lt;br /&gt;
 cp -rv lib $PYTHON_PREFIX/local/rdkit-2013.09/lib&lt;br /&gt;
 cp -rv rdkit $PYTHON_PREFIX/local/rdkit-2013.09/rdkit&lt;br /&gt;
&lt;br /&gt;
* 11) Update the symlinks: ln -svfn $PYTHON_PREFIX/local/rdkit-2013.09 $PYTHON_PREFIX/local/rdkit&lt;br /&gt;
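The version switch in step 11 can be tried in isolation; the directory names here are throwaway stand-ins for $PYTHON_PREFIX/local:&lt;br /&gt;

```shell
# ln -sfn: -f replaces any existing link, -n treats an existing
# symlink-to-directory as a plain file so it is replaced, not descended into.
mkdir -p demo/local/rdkit-2013.09
ln -sfn rdkit-2013.09 demo/local/rdkit
readlink demo/local/rdkit
```

Pointing a stable name (local/rdkit) at a versioned directory is what lets the rdkit.pth file stay unchanged across upgrades.&lt;br /&gt;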
&lt;br /&gt;
This process works but of course doesn&#039;t cover all of the details of potential installations. For one, this doesn&#039;t even touch on the upgrading of the PostgreSQL cartridge, which is a nasty black art. I&#039;ll explain that once I have enough goats to sacrifice.&lt;br /&gt;
&lt;br /&gt;
[[Category:Sysadmin]]&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Updating_Rdkit&amp;diff=9998</id>
		<title>Updating Rdkit</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Updating_Rdkit&amp;diff=9998"/>
		<updated>2017-04-10T18:21:50Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Download RDKit Tarball&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Create a python virtual env:&lt;br /&gt;
&lt;br /&gt;
    /nfs/soft/python/install/scripts/create-virtualenv.sh -s /nfs/soft/python/versions/python-2.7.11 /nfs/soft/python/envs/rdkit/python-2.7.11-rdkit-2016_03_01&lt;br /&gt;
&lt;br /&gt;
Install the new version of RDKIT in the python env:&lt;br /&gt;
&lt;br /&gt;
   /nfs/soft/python/install/scripts/install-rdkit.sh /nfs/soft/python/install/extra/RDKit_2016_03_1.tgz /nfs/soft/python/envs/rdkit/python-2.7.11-rdkit-2016_03_01&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Notes =  See [[RDKit]]&lt;br /&gt;
&lt;br /&gt;
Each version is now installed in $PYTHON_ROOT/local/rdkit-VERSION. The current version is now a symlink named $PYTHON_ROOT/local/rdkit.&lt;br /&gt;
&lt;br /&gt;
It&#039;s worth noting how this works. There is a directory in $PYTHON_ROOT named $PYTHON_ROOT/local. In addition to containing the build of RDKit as described above ($PYTHON_ROOT/local/rdkit), it also contains $PYTHON_ROOT/lib, which is simply a directory of symlinks to important shared libraries. &lt;br /&gt;
&lt;br /&gt;
When compiling Python, set LD_RUN_PATH to $ORIGIN/../lib:$ORIGIN/../local/lib so that it always looks in those two directories ($ORIGIN expands to the directory containing the python binary). This way python can find important shared libraries, such as RDKit, modeller, etc. When we create a new virtualenv, we create a symlink to the original local directory in the virtualenv, so that we can always find the right shared libraries.&lt;br /&gt;
&lt;br /&gt;
To make sure we always have the right python package installed there is a file created in:&lt;br /&gt;
$PYTHON_ROOT/lib/python2.7/site-packages/rdkit.pth&lt;br /&gt;
In Python package resolution, .pth files allow you to specify an additional location to look for a module. This file includes one line:&lt;br /&gt;
../../../local/rdkit&lt;br /&gt;
This line looks for the rdkit package in the (symlink to the) current version of RDKit. Note that the RDKit python package has the path:&lt;br /&gt;
$PYTHON_ROOT/local/rdkit/rdkit (two rdkits)&lt;br /&gt;
&lt;br /&gt;
PTH files notes:&lt;br /&gt;
There are actually two other pth files I have created: modeller.pth and hask.pth&lt;br /&gt;
It would probably be best to simply create one pth file to handle these packages stored in the &amp;quot;local&amp;quot; directory:&lt;br /&gt;
Create a directory called local/packages or something and put symlinks to rdkit, haks, modeller, etc. in there. Then create a single pth file with the line:&lt;br /&gt;
../../../local/packages&lt;br /&gt;
&lt;br /&gt;
= Install procedure =&lt;br /&gt;
* 0) Have a Python installation you wish to install for (This should be the base installation for virtualenvs)&lt;br /&gt;
* 1) Download RDKit Tarball&lt;br /&gt;
* 2) Extract RDKit Tarball: $RDKIT_SRC&lt;br /&gt;
* 3) Download INCHI support: cd $RDKIT_SRC/External/INCHI-API; ./download-inchi.sh&lt;br /&gt;
* 4) Create build directory: cd $RDKIT_SRC; mkdir build; cd build&lt;br /&gt;
* 5) Configure with cmake (This is a long command! and note the version specific directories/files)&lt;br /&gt;
&lt;br /&gt;
 cmake \&lt;br /&gt;
 -DPYTHON_EXECUTABLE=$PYTHON_PREFIX/bin/python \&lt;br /&gt;
 -DPYTHON_INCLUDE_PATH=$PYTHON_PREFIX/include/python2.7 \&lt;br /&gt;
 -DPYTHON_LIBRARY=$PYTHON_PREFIX/lib/libpython2.7.so \&lt;br /&gt;
 -DPYTHON_NUMPY_INCLUDE_PATH=$PYTHON_PREFIX/lib/python2.7/site-packages/numpy/core/include \&lt;br /&gt;
 -DRDK_BUILD_INCHI_SUPPORT=ON \&lt;br /&gt;
 -DINCHI_LIBRARY=../External/INCHI-API \&lt;br /&gt;
 -DINCHI_INCLUDE_DIR=../External/INCHI-API/src \&lt;br /&gt;
 -DCMAKE_INSTALL_PREFIX=$PYTHON_PREFIX/local/rdkit-2013.09 \&lt;br /&gt;
&lt;br /&gt;
* 6) Build RDKit: make -j4 (build in parallel) OR make (build serial)&lt;br /&gt;
* 7) Wait for a while&lt;br /&gt;
* 8) Install RDKit: make install&lt;br /&gt;
* 9) Run python install: cd $RDKIT_SRC; $PYTHON_ROOT/bin/python setup.py install&lt;br /&gt;
* 10) OK so the install doesn&#039;t actually work: do it manually: &lt;br /&gt;
&lt;br /&gt;
 cd $RDKIT_SRC&lt;br /&gt;
 mkdir $PYTHON_PREFIX/local/rdkit-2013.09&lt;br /&gt;
 cp -rv lib $PYTHON_PREFIX/local/rdkit-2013.09/lib&lt;br /&gt;
 cp -rv rdkit $PYTHON_PREFIX/local/rdkit-2013.09/rdkit&lt;br /&gt;
&lt;br /&gt;
* 11) Update the symlinks: ln -svfn $PYTHON_PREFIX/local/rdkit-2013.09 $PYTHON_PREFIX/local/rdkit&lt;br /&gt;
&lt;br /&gt;
This process works but of course doesn&#039;t cover all of the details of potential installations. For one, this doesn&#039;t even touch on the upgrading of the PostgreSQL cartridge, which is a nasty black art. I&#039;ll explain that once I have enough goats to sacrifice.&lt;br /&gt;
&lt;br /&gt;
[[Category:Sysadmin]]&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Updating_Rdkit&amp;diff=9997</id>
		<title>Updating Rdkit</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Updating_Rdkit&amp;diff=9997"/>
		<updated>2017-04-10T18:08:27Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Create a python virtual env:&lt;br /&gt;
&lt;br /&gt;
    /nfs/soft/python/install/scripts/create-virtualenv.sh -s /nfs/soft/python/versions/python-2.7.11 /nfs/soft/python/envs/rdkit/python-2.7.11-rdkit-2016_03_01&lt;br /&gt;
&lt;br /&gt;
Install the new version of RDKIT in the python env:&lt;br /&gt;
&lt;br /&gt;
   /nfs/soft/python/install/scripts/install-rdkit.sh /nfs/soft/python/install/extra/RDKit_2016_03_1.tgz /nfs/soft/python/envs/rdkit/python-2.7.11-rdkit-2016_03_01&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Notes =  See [[RDKit]]&lt;br /&gt;
&lt;br /&gt;
Each version is now installed in $PYTHON_ROOT/local/rdkit-VERSION. The current version is now a symlink named $PYTHON_ROOT/local/rdkit.&lt;br /&gt;
&lt;br /&gt;
It&#039;s worth noting how this works. There is a directory in $PYTHON_ROOT named $PYTHON_ROOT/local. In addition to containing the build of RDKit as described above ($PYTHON_ROOT/local/rdkit), it also contains $PYTHON_ROOT/lib, which is simply a directory of symlinks to important shared libraries. &lt;br /&gt;
&lt;br /&gt;
When compiling Python, set LD_RUN_PATH to $ORIGIN/../lib:$ORIGIN/../local/lib so that it always looks in those two directories ($ORIGIN expands to the directory containing the python binary). This way python can find important shared libraries, such as RDKit, modeller, etc. When we create a new virtualenv, we create a symlink to the original local directory in the virtualenv, so that we can always find the right shared libraries.&lt;br /&gt;
&lt;br /&gt;
To make sure we always have the right python package installed there is a file created in:&lt;br /&gt;
$PYTHON_ROOT/lib/python2.7/site-packages/rdkit.pth&lt;br /&gt;
In Python package resolution, .pth files allow you to specify an additional location to look for a module. This file includes one line:&lt;br /&gt;
../../../local/rdkit&lt;br /&gt;
This line looks for the rdkit package in the (symlink to the) current version of RDKit. Note that the RDKit python package has the path:&lt;br /&gt;
$PYTHON_ROOT/local/rdkit/rdkit (two rdkits)&lt;br /&gt;
&lt;br /&gt;
PTH files notes:&lt;br /&gt;
There are actually two other pth files I have created: modeller.pth and hask.pth&lt;br /&gt;
It would probably be best to simply create one pth file to handle these packages stored in the &amp;quot;local&amp;quot; directory:&lt;br /&gt;
Create a directory called local/packages or something and put symlinks to rdkit, haks, modeller, etc. in there. Then create a single pth file with the line:&lt;br /&gt;
../../../local/packages&lt;br /&gt;
&lt;br /&gt;
= Install procedure =&lt;br /&gt;
* 0) Have a Python installation you wish to install for (This should be the base installation for virtualenvs)&lt;br /&gt;
* 1) Download RDKit Tarball&lt;br /&gt;
* 2) Extract RDKit Tarball: $RDKIT_SRC&lt;br /&gt;
* 3) Download INCHI support: cd $RDKIT_SRC/External/INCHI-API; ./download-inchi.sh&lt;br /&gt;
* 4) Create build directory: cd $RDKIT_SRC; mkdir build; cd build&lt;br /&gt;
* 5) Configure with cmake (This is a long command! and note the version specific directories/files)&lt;br /&gt;
&lt;br /&gt;
 cmake \&lt;br /&gt;
 -DPYTHON_EXECUTABLE=$PYTHON_PREFIX/bin/python \&lt;br /&gt;
 -DPYTHON_INCLUDE_PATH=$PYTHON_PREFIX/include/python2.7 \&lt;br /&gt;
 -DPYTHON_LIBRARY=$PYTHON_PREFIX/lib/libpython2.7.so \&lt;br /&gt;
 -DPYTHON_NUMPY_INCLUDE_PATH=$PYTHON_PREFIX/lib/python2.7/site-packages/numpy/core/include \&lt;br /&gt;
 -DRDK_BUILD_INCHI_SUPPORT=ON \&lt;br /&gt;
 -DINCHI_LIBRARY=../External/INCHI-API \&lt;br /&gt;
 -DINCHI_INCLUDE_DIR=../External/INCHI-API/src \&lt;br /&gt;
 -DCMAKE_INSTALL_PREFIX=$PYTHON_PREFIX/local/rdkit-2013.09 \&lt;br /&gt;
&lt;br /&gt;
* 6) Build RDKit: make -j4 (build in parallel) OR make (build serial)&lt;br /&gt;
* 7) Wait for a while&lt;br /&gt;
* 8) Install RDKit: make install&lt;br /&gt;
* 9) Run python install: cd $RDKIT_SRC; $PYTHON_ROOT/bin/python setup.py install&lt;br /&gt;
* 10) OK so the install doesn&#039;t actually work: do it manually: &lt;br /&gt;
&lt;br /&gt;
 cd $RDKIT_SRC&lt;br /&gt;
 mkdir $PYTHON_PREFIX/local/rdkit-2013.09&lt;br /&gt;
 cp -rv lib $PYTHON_PREFIX/local/rdkit-2013.09/lib&lt;br /&gt;
 cp -rv rdkit $PYTHON_PREFIX/local/rdkit-2013.09/rdkit&lt;br /&gt;
&lt;br /&gt;
* 11) Update the symlinks: ln -svfn $PYTHON_PREFIX/local/rdkit-2013.09 $PYTHON_PREFIX/local/rdkit&lt;br /&gt;
&lt;br /&gt;
This process works but of course doesn&#039;t cover all of the details of potential installations. For one, this doesn&#039;t even touch on the upgrading of the PostgreSQL cartridge, which is a nasty black art. I&#039;ll explain that once I have enough goats to sacrifice.&lt;br /&gt;
&lt;br /&gt;
[[Category:Sysadmin]]&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Adverse&amp;diff=9976</id>
		<title>Adverse</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Adverse&amp;diff=9976"/>
		<updated>2017-03-23T17:05:16Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;main web page -&amp;gt; &lt;br /&gt;
 index.html:&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
splashes -&amp;gt;&lt;br /&gt;
 Images/Icons/Android&lt;br /&gt;
 Images/Icons/iOS/Splash/&lt;br /&gt;
&lt;br /&gt;
themes -&amp;gt;&lt;br /&gt;
 css files&lt;br /&gt;
&lt;br /&gt;
organ objects:&lt;br /&gt;
 int-organs.js&lt;br /&gt;
 ext-organs.js&lt;br /&gt;
 int-spots.js&lt;br /&gt;
 ext-spots.js&lt;br /&gt;
&lt;br /&gt;
body diagram interactions, functions using organ objects:&lt;br /&gt;
 interact-script.js&lt;br /&gt;
&lt;br /&gt;
objects of all adverse event terms:&lt;br /&gt;
 meddra.js&lt;br /&gt;
&lt;br /&gt;
search in openfda and draw graph with loader.js:&lt;br /&gt;
 fdajson.js &lt;br /&gt;
&lt;br /&gt;
user profile&lt;br /&gt;
 profile.js &lt;br /&gt;
&lt;br /&gt;
phonegap build config (important: change the version number for each build)&lt;br /&gt;
 config.xml&lt;br /&gt;
&lt;br /&gt;
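For reference, a minimal PhoneGap-style config.xml showing the version attribute that needs bumping on each release (the widget id and name here are made-up placeholders):&lt;br /&gt;

```xml
&amp;lt;?xml version="1.0" encoding="UTF-8"?&amp;gt;
&amp;lt;widget id="org.example.adverse" version="1.0.1"
        xmlns="http://www.w3.org/ns/widgets"&amp;gt;
  &amp;lt;name&amp;gt;Adverse&amp;lt;/name&amp;gt;
&amp;lt;/widget&amp;gt;
```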
how to build the app:&lt;br /&gt;
 https://build.phonegap.com/&lt;br /&gt;
   -&amp;gt; create an account&lt;br /&gt;
   -&amp;gt; zip the code folder and upload on this website&lt;br /&gt;
   -&amp;gt; .apk is for android&lt;br /&gt;
   -&amp;gt; .ipa is for ios&lt;br /&gt;
   create a special key for each OS&lt;br /&gt;
&lt;br /&gt;
how to upload the app:&lt;br /&gt;
  developers.google.com/apps&lt;br /&gt;
  itunes connect&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Adverse&amp;diff=9975</id>
		<title>Adverse</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Adverse&amp;diff=9975"/>
		<updated>2017-03-23T16:22:59Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;index.html:&lt;br /&gt;
 main web page. &lt;br /&gt;
&lt;br /&gt;
splashes -&amp;gt;&lt;br /&gt;
 Images/Icons/Android&lt;br /&gt;
 Images/Icons/iOS/Splash/&lt;br /&gt;
&lt;br /&gt;
themes -&amp;gt;&lt;br /&gt;
 css files&lt;br /&gt;
&lt;br /&gt;
organ objects:&lt;br /&gt;
 int-organs.js&lt;br /&gt;
 ext-organs.js&lt;br /&gt;
 int-spots.js&lt;br /&gt;
 ext-spots.js&lt;br /&gt;
&lt;br /&gt;
body diagram interactions, functions using organ objects:&lt;br /&gt;
 interact-script.js&lt;br /&gt;
&lt;br /&gt;
objects of all adverse event terms:&lt;br /&gt;
 meddra.js&lt;br /&gt;
&lt;br /&gt;
search in openfda and draw graph with loader.js:&lt;br /&gt;
 fdajson.js &lt;br /&gt;
&lt;br /&gt;
user profile&lt;br /&gt;
 profile.js &lt;br /&gt;
&lt;br /&gt;
phonegap build config (important: change the version number for each build)&lt;br /&gt;
 config.xml&lt;br /&gt;
&lt;br /&gt;
how to build the app:&lt;br /&gt;
 https://build.phonegap.com/&lt;br /&gt;
   -&amp;gt; create an account&lt;br /&gt;
   -&amp;gt; zip the code folder and upload on this website&lt;br /&gt;
   -&amp;gt; .apk is for android&lt;br /&gt;
   -&amp;gt; .ipa is for ios&lt;br /&gt;
   create a special key for each OS&lt;br /&gt;
&lt;br /&gt;
how to upload the app:&lt;br /&gt;
  developers.google.com/apps&lt;br /&gt;
  itunes connect&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Adverse&amp;diff=9973</id>
		<title>Adverse</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Adverse&amp;diff=9973"/>
		<updated>2017-03-22T22:16:51Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: Created page with &amp;quot;index.html:  main web page.   splashes -&amp;gt;  Images/Icons/Android  Images/Icons/iOS/Splash/  themes -&amp;gt;  css files  organ objects:  int-organs.js  ext-organs.js  int-spots.js  ex...&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;index.html:&lt;br /&gt;
 main web page. &lt;br /&gt;
&lt;br /&gt;
splashes -&amp;gt;&lt;br /&gt;
 Images/Icons/Android&lt;br /&gt;
 Images/Icons/iOS/Splash/&lt;br /&gt;
&lt;br /&gt;
themes -&amp;gt;&lt;br /&gt;
 css files&lt;br /&gt;
&lt;br /&gt;
organ objects:&lt;br /&gt;
 int-organs.js&lt;br /&gt;
 ext-organs.js&lt;br /&gt;
 int-spots.js&lt;br /&gt;
 ext-spots.js&lt;br /&gt;
&lt;br /&gt;
body diagram interactions, functions using organ objects:&lt;br /&gt;
 interact-script.js&lt;br /&gt;
&lt;br /&gt;
objects of all adverse event terms:&lt;br /&gt;
 meddra.js&lt;br /&gt;
&lt;br /&gt;
search in openfda and draw graph with loader.js:&lt;br /&gt;
 fdajson.js &lt;br /&gt;
&lt;br /&gt;
user profile&lt;br /&gt;
 profile.js &lt;br /&gt;
&lt;br /&gt;
phonegap build config (important: change the version number for each build)&lt;br /&gt;
 config.xml&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=DOCK3.7_INDOCK_Minimization_Parameter&amp;diff=9969</id>
		<title>DOCK3.7 INDOCK Minimization Parameter</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=DOCK3.7_INDOCK_Minimization_Parameter&amp;diff=9969"/>
		<updated>2017-03-17T20:48:59Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The best scored ligand conformation can be minimized using the following parameters.&lt;br /&gt;
&lt;br /&gt;
Add following section to existing INDOCK after SCORING section.&lt;br /&gt;
&lt;br /&gt;
 ##################################################### &lt;br /&gt;
 #                    MINIMIZATION&lt;br /&gt;
 minimize                      yes&lt;br /&gt;
 sim_itmax                     500&lt;br /&gt;
 sim_trnstep                   0.2&lt;br /&gt;
 sim_rotstep                   5.0&lt;br /&gt;
 sim_need_to_restart           1.0&lt;br /&gt;
 sim_cnvrge                    0.1&lt;br /&gt;
 min_cut                       1.0e15&lt;br /&gt;
 iseed                         777&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The default for minimize is no. Yes turns on the minimization.&lt;br /&gt;
 minimize                      yes&lt;br /&gt;
&lt;br /&gt;
How many iterations of minimization to do. More means longer run times, but potentially better poses. &lt;br /&gt;
 &lt;br /&gt;
 sim_itmax                     500&lt;br /&gt;
&lt;br /&gt;
This is the initial distance, in Angstroms, by which the molecule is translated. (This is used to initialize the simplex.)&lt;br /&gt;
&lt;br /&gt;
 sim_trnstep                   0.2&lt;br /&gt;
&lt;br /&gt;
How many degrees of initial rotation are applied. (This is used to initialize the simplex.)&lt;br /&gt;
&lt;br /&gt;
 sim_rotstep                   5.0&lt;br /&gt;
&lt;br /&gt;
If the energy changes by more than this amount between two subsequent runs, restart the minimizer from the newest position for another round (here, a change of more than 1 kcal triggers a restart).&lt;br /&gt;
 sim_need_to_restart           1.0&lt;br /&gt;
&lt;br /&gt;
How much the total energy can change and still be considered converged. Within one run, the minimization is considered converged when the energies of the lowest- and highest-energy states differ by only 0.1 kcal/mol. Setting this higher will stop the run sooner; setting it lower will cause more iterations before converging (or potentially hitting the iteration maximum above).&lt;br /&gt;
 sim_cnvrge                    0.1&lt;br /&gt;
&lt;br /&gt;
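As a rough illustration only (not DOCK's actual implementation), the restart criterion amounts to: keep launching minimization rounds from the newest pose while each round still improves the energy by more than sim_need_to_restart:&lt;br /&gt;

```python
# Illustrative sketch only, not DOCK 3.7 source code.
def minimize_with_restarts(run_round, energy, x0, need_to_restart=1.0):
    """Re-launch the minimizer from the newest position while each
    round still improves the energy by more than need_to_restart."""
    x, e = x0, energy(x0)
    while True:
        x_new = run_round(x)          # one full minimization round
        e_new = energy(x_new)
        if e - e_new > need_to_restart:
            x, e = x_new, e_new       # big improvement: restart from here
        else:
            return x_new              # gains too small: stop restarting

# Toy model: a quadratic energy and a round that halves the coordinate.
energy = lambda x: x * x
round_ = lambda x: x / 2.0
print(minimize_with_restarts(round_, energy, 10.0))
```

Within each round, sim_cnvrge plays the analogous role of deciding when that single run has converged.&lt;br /&gt;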
Don&#039;t minimize molecules that score above this large number. &lt;br /&gt;
 min_cut                       1.0e15&lt;br /&gt;
&lt;br /&gt;
To initialize the simplex, this iseed value is used to generate random numbers (e.g., for the translation- and rotation-step calculations).&lt;br /&gt;
 iseed                         777&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Sea16_restart&amp;diff=9968</id>
		<title>Sea16 restart</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Sea16_restart&amp;diff=9968"/>
		<updated>2017-03-16T21:14:34Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
# update conda_sea16&lt;br /&gt;
&lt;br /&gt;
  ssh s_enkhee@gimel&lt;br /&gt;
  cd /nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/src/seaware-academic/&lt;br /&gt;
  source activate sea16 (if it does not work: export PATH=/nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/bin:$PATH)&lt;br /&gt;
  git pull&lt;br /&gt;
  make clean&lt;br /&gt;
  delete SEA-related libraries from the site-packages folder under the conda env (rm -rf /nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/lib/python2.7/site-packages/seaserver)&lt;br /&gt;
  make all&lt;br /&gt;
  kill all sea-server-related processes, e.g. from htop&lt;br /&gt;
  make SEAserver-start&lt;br /&gt;
  cd /nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/src/seaware-academic/SEAserver&lt;br /&gt;
  python setup.py install&lt;br /&gt;
&lt;br /&gt;
# restart server&lt;br /&gt;
&lt;br /&gt;
  ssh s_enkhee@n-1-110&lt;br /&gt;
  cd /nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/src/seaware-academic/&lt;br /&gt;
  source activate sea16 (if that does not work: export PATH=/nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/bin:$PATH)&lt;br /&gt;
  cd SEAserver&lt;br /&gt;
  sh scripts/run-sea-server.sh&lt;br /&gt;
  &lt;br /&gt;
  &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
# update sea16&lt;br /&gt;
ssh xyz@gimel&lt;br /&gt;
cd /nfs/soft/www/apps/sea/sea16/src/seaware-academic&lt;br /&gt;
source ../../env.csh&lt;br /&gt;
git pull&lt;br /&gt;
rm -rf /nfs/soft/www/apps/sea/sea16/lib/python2.7/site-packages/seaserver&lt;br /&gt;
cd /nfs/soft/www/apps/sea/sea16/src/seaware-academic/SEAserver&lt;br /&gt;
python setup.py install&lt;br /&gt;
&lt;br /&gt;
# restart server&lt;br /&gt;
&lt;br /&gt;
ssh www@n-1-110&lt;br /&gt;
cd /nfs/soft/www/apps/sea/sea16/src/seaware-academic&lt;br /&gt;
source ../../env.csh&lt;br /&gt;
cd SEAserver&lt;br /&gt;
sh scripts/run-sea-server.sh&lt;br /&gt;
&lt;br /&gt;
# restart server&lt;br /&gt;
&lt;br /&gt;
ssh &amp;lt;superuser&amp;gt;@n-1-110&lt;br /&gt;
sudo -i&lt;br /&gt;
screen -r&lt;br /&gt;
screen -dR Sea (switch to sea screen)&lt;br /&gt;
sh scripts/run-sea-server.sh&lt;br /&gt;
&lt;br /&gt;
# how to save the old queue data&lt;br /&gt;
 &lt;br /&gt;
cd /nfs/soft/www/apps/sea/sea16/var/seaserver/queue &lt;br /&gt;
mv jobs jobs.save&lt;br /&gt;
mv tasks.sqlite tasks.sqlite.save &lt;br /&gt;
restart the SEA server on n-1-110.  &lt;br /&gt;
&lt;br /&gt;
(basically, the queue had accumulated too much history, and that was what was slowing the server down)&lt;br /&gt;
(do this on the first day of each month, renaming the old files with a month suffix)&lt;br /&gt;
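The monthly cleanup above can be wrapped in a small helper; archive_queue is a hypothetical sketch (not an existing script), using the queue paths and the month-suffix convention from the notes:

```shell
# Hypothetical helper: month-stamp the SEA queue files so the server starts
# with a fresh queue (paths/filenames as in the notes above).
archive_queue() {
  # $1 = queue directory, e.g. /nfs/soft/www/apps/sea/sea16/var/seaserver/queue
  stamp=$(date +%Y-%m)
  mv "$1/jobs" "$1/jobs.$stamp"
  mv "$1/tasks.sqlite" "$1/tasks.sqlite.$stamp"
}
```

After running it, restart the SEA server on n-1-110 as described above.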
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Category:Curator]]&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Sea16_restart&amp;diff=9967</id>
		<title>Sea16 restart</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Sea16_restart&amp;diff=9967"/>
		<updated>2017-03-15T16:55:26Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
# update conda_sea16&lt;br /&gt;
&lt;br /&gt;
  ssh s_enkhee@gimel&lt;br /&gt;
  cd /nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/src/seaware-academic/&lt;br /&gt;
  source activate sea16 (if that does not work: export PATH=/nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/bin:$PATH)&lt;br /&gt;
  git pull&lt;br /&gt;
  rm -rf /nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/lib/python2.7/site-packages/seaserver&lt;br /&gt;
  &lt;br /&gt;
    make clean&lt;br /&gt;
    delete sea related libraries from your site-packages folder under the conda env&lt;br /&gt;
    make all&lt;br /&gt;
    kill all sea-server related processes e.g. from htop&lt;br /&gt;
    make SEAserver-start&lt;br /&gt;
  cd /nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/src/seaware-academic/SEAserver&lt;br /&gt;
  python setup.py install&lt;br /&gt;
&lt;br /&gt;
# restart server&lt;br /&gt;
&lt;br /&gt;
  ssh s_enkhee@n-1-110&lt;br /&gt;
  cd /nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/src/seaware-academic/&lt;br /&gt;
  source activate sea16 (if that does not work: export PATH=/nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/bin:$PATH)&lt;br /&gt;
  cd SEAserver&lt;br /&gt;
  sh scripts/run-sea-server.sh&lt;br /&gt;
  &lt;br /&gt;
  &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
# update sea16&lt;br /&gt;
ssh xyz@gimel&lt;br /&gt;
cd /nfs/soft/www/apps/sea/sea16/src/seaware-academic&lt;br /&gt;
source ../../env.csh&lt;br /&gt;
git pull&lt;br /&gt;
rm -rf /nfs/soft/www/apps/sea/sea16/lib/python2.7/site-packages/seaserver&lt;br /&gt;
cd /nfs/soft/www/apps/sea/sea16/src/seaware-academic/SEAserver&lt;br /&gt;
python setup.py install&lt;br /&gt;
&lt;br /&gt;
# restart server&lt;br /&gt;
&lt;br /&gt;
ssh www@n-1-110&lt;br /&gt;
cd /nfs/soft/www/apps/sea/sea16/src/seaware-academic&lt;br /&gt;
source ../../env.csh&lt;br /&gt;
cd SEAserver&lt;br /&gt;
sh scripts/run-sea-server.sh&lt;br /&gt;
&lt;br /&gt;
# restart server&lt;br /&gt;
&lt;br /&gt;
ssh &amp;lt;superuser&amp;gt;@n-1-110&lt;br /&gt;
sudo -i&lt;br /&gt;
screen -r&lt;br /&gt;
screen -dR Sea (switch to sea screen)&lt;br /&gt;
sh scripts/run-sea-server.sh&lt;br /&gt;
&lt;br /&gt;
# how to save the old queue data&lt;br /&gt;
 &lt;br /&gt;
cd /nfs/soft/www/apps/sea/sea16/var/seaserver/queue &lt;br /&gt;
mv jobs jobs.save&lt;br /&gt;
mv tasks.sqlite tasks.sqlite.save &lt;br /&gt;
restart sea server on n-1-110.  &lt;br /&gt;
&lt;br /&gt;
(basically, the queue had accumulated too much history, and that was what was slowing the server down)&lt;br /&gt;
(do this on the first day of each month, renaming the old files with a month suffix)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Category:Curator]]&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Sea16_restart&amp;diff=9963</id>
		<title>Sea16 restart</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Sea16_restart&amp;diff=9963"/>
		<updated>2017-03-09T19:51:59Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
# update conda_sea16&lt;br /&gt;
&lt;br /&gt;
  ssh s_enkhee@gimel&lt;br /&gt;
  cd /nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/src/seaware-academic/&lt;br /&gt;
  source activate sea16 (if that does not work: export PATH=/nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/bin:$PATH)&lt;br /&gt;
  git pull&lt;br /&gt;
  rm -rf /nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/lib/python2.7/site-packages/seaserver&lt;br /&gt;
  cd /nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/src/seaware-academic/SEAserver&lt;br /&gt;
  python setup.py install&lt;br /&gt;
&lt;br /&gt;
# restart server&lt;br /&gt;
&lt;br /&gt;
  ssh s_enkhee@n-1-110&lt;br /&gt;
  cd /nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/src/seaware-academic/&lt;br /&gt;
  source activate sea16 (if that does not work: export PATH=/nfs/soft/www/apps/sea/conda_sea16/anaconda2/envs/sea16/bin:$PATH)&lt;br /&gt;
  cd SEAserver&lt;br /&gt;
  sh scripts/run-sea-server.sh&lt;br /&gt;
  &lt;br /&gt;
  &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
# update sea16&lt;br /&gt;
ssh xyz@gimel&lt;br /&gt;
cd /nfs/soft/www/apps/sea/sea16/src/seaware-academic&lt;br /&gt;
source ../../env.csh&lt;br /&gt;
git pull&lt;br /&gt;
rm -rf /nfs/soft/www/apps/sea/sea16/lib/python2.7/site-packages/seaserver&lt;br /&gt;
cd /nfs/soft/www/apps/sea/sea16/src/seaware-academic/SEAserver&lt;br /&gt;
python setup.py install&lt;br /&gt;
&lt;br /&gt;
# restart server&lt;br /&gt;
&lt;br /&gt;
ssh www@n-1-110&lt;br /&gt;
cd /nfs/soft/www/apps/sea/sea16/src/seaware-academic&lt;br /&gt;
source ../../env.csh&lt;br /&gt;
cd SEAserver&lt;br /&gt;
sh scripts/run-sea-server.sh&lt;br /&gt;
&lt;br /&gt;
# restart server&lt;br /&gt;
&lt;br /&gt;
ssh &amp;lt;superuser&amp;gt;@n-1-110&lt;br /&gt;
sudo -i&lt;br /&gt;
screen -r&lt;br /&gt;
screen -dR Sea (switch to sea screen)&lt;br /&gt;
sh scripts/run-sea-server.sh&lt;br /&gt;
&lt;br /&gt;
# how to save the old queue data&lt;br /&gt;
 &lt;br /&gt;
cd /nfs/soft/www/apps/sea/sea16/var/seaserver/queue &lt;br /&gt;
mv jobs jobs.save&lt;br /&gt;
mv tasks.sqlite tasks.sqlite.save &lt;br /&gt;
restart sea server on n-1-110.  &lt;br /&gt;
&lt;br /&gt;
(basically, the queue had accumulated too much history, and that was what was slowing the server down)&lt;br /&gt;
(do this on the first day of each month, renaming the old files with a month suffix)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Category:Curator]]&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=AWS_General_Notes&amp;diff=9936</id>
		<title>AWS General Notes</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=AWS_General_Notes&amp;diff=9936"/>
		<updated>2017-02-27T22:31:21Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: Created page with &amp;quot;&amp;#039;&amp;#039;&amp;#039;Modifying an EBS Volume from the Console:&amp;#039;&amp;#039;&amp;#039;  The following procedure shows how to apply available volume modifications from the Amazon EC2 console.      1. Open the Amazon...&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&#039;&#039;&#039;Modifying an EBS Volume from the Console:&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The following procedure shows how to apply available volume modifications from the Amazon EC2 console.&lt;br /&gt;
&lt;br /&gt;
    1. Open the Amazon EC2 console at https://console.aws.amazon.com/ec2/.&lt;br /&gt;
&lt;br /&gt;
    2. Choose Volumes, select the volume to modify and then choose Actions, Modify Volume.&lt;br /&gt;
&lt;br /&gt;
    3. The Modify Volume window displays the volume ID and the volume&#039;s current configuration, including type, size, and IOPS. You can change any or all of these settings in a single action. Set new configuration values as follows:&lt;br /&gt;
&lt;br /&gt;
        -&amp;gt; To modify the type, choose a value for Volume Type.&lt;br /&gt;
&lt;br /&gt;
        -&amp;gt; To modify the size, enter an allowed integer value for Size.&lt;br /&gt;
&lt;br /&gt;
        -&amp;gt; If you chose Provisioned IOPS (IO1) as your volume type, enter an allowed integer value for IOPS.&lt;br /&gt;
&lt;br /&gt;
    4. After you have specified all of the modifications to apply, choose Modify, Yes. &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Modifying volume size has no practical effect until you also extend the volume&#039;s file system to make use of the new storage capacity.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;To extend a Linux file system:&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
1. Log in to your Linux instance using an SSH client.&lt;br /&gt;
&lt;br /&gt;
2. Use the df -h command to see the existing file system disk space usage.&lt;br /&gt;
   df -h &lt;br /&gt;
&lt;br /&gt;
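The resize command in step 3 below depends on the filesystem type; grow_cmd is a hypothetical helper (the ext4 vs. XFS distinction is general Linux practice, not stated in these notes):

```shell
# Hypothetical helper: pick the command that grows a filesystem to fill its
# (already enlarged) EBS volume. ext* filesystems use resize2fs on the device;
# XFS grows via the mount point.
grow_cmd() {
  # $1 = filesystem type, $2 = device (e.g. /dev/xvda1), $3 = mount point
  case "$1" in
    ext2|ext3|ext4) echo "resize2fs $2" ;;
    xfs)            echo "xfs_growfs -d $3" ;;
    *)              echo "unsupported: $1" ;;
  esac
}
```

Check the filesystem type first with df -T or lsblk -f.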
3. Use a file system-specific command to resize each file system to the new volume capacity.  &lt;br /&gt;
   sudo resize2fs /dev/xvda1&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Excipient_server_restart&amp;diff=9932</id>
		<title>Excipient server restart</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Excipient_server_restart&amp;diff=9932"/>
		<updated>2017-02-21T21:26:58Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Excipient Deployment:&lt;br /&gt;
&lt;br /&gt;
1. Go to the dev version of Excipients and set the version number in the __init__.py file&lt;br /&gt;
&lt;br /&gt;
2. Create the distribution file&lt;br /&gt;
  python setup.py sdist&lt;br /&gt;
&lt;br /&gt;
3. &lt;br /&gt;
  ssh &amp;lt;super_user&amp;gt;@gimel&lt;br /&gt;
&lt;br /&gt;
4. &lt;br /&gt;
  sudo -i&lt;br /&gt;
&lt;br /&gt;
5. &lt;br /&gt;
  su - www&lt;br /&gt;
&lt;br /&gt;
6. activate the production server -&amp;gt;&lt;br /&gt;
  cd /nfs/soft/www/apps/excipients/envs/production/&lt;br /&gt;
  source bin/activate&lt;br /&gt;
&lt;br /&gt;
7. copy the created dist folder into /nfs/soft/www/apps/excipients/envs/production/ &lt;br /&gt;
&lt;br /&gt;
8. Run -&amp;gt; &lt;br /&gt;
  pip install dist/CERSI-Excipients-X.Y.Z.tar.gz&lt;br /&gt;
&lt;br /&gt;
9. find the excipients screen -&amp;gt; &lt;br /&gt;
   screen -ls&lt;br /&gt;
&lt;br /&gt;
10. switch to the excipients screen -&amp;gt; &lt;br /&gt;
   screen -dR excip&lt;br /&gt;
&lt;br /&gt;
11. CTRL-C to kill the running server&lt;br /&gt;
&lt;br /&gt;
12. &amp;lt;UP&amp;gt; &amp;lt;ENTER&amp;gt; to rerun&lt;br /&gt;
   (this will start the gunicorn server: &lt;br /&gt;
    gunicorn --access-logfile excipients.ucsf.bkslab.org.acc --max-requests 1000 --bind 10.20.0.31:8093 excipients:app --workers 5 --timeout 1000)&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
	<entry>
		<id>http://wiki.docking.org/index.php?title=Excipient_server_restart&amp;diff=9923</id>
		<title>Excipient server restart</title>
		<link rel="alternate" type="text/html" href="http://wiki.docking.org/index.php?title=Excipient_server_restart&amp;diff=9923"/>
		<updated>2017-02-09T19:56:35Z</updated>

		<summary type="html">&lt;p&gt;Enkhjargal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Excipient Deployment:&lt;br /&gt;
&lt;br /&gt;
1. Go to the dev version of Excipients and set the version number in the __init__.py file&lt;br /&gt;
&lt;br /&gt;
2. Create the distribution file&lt;br /&gt;
  python setup.py sdist&lt;br /&gt;
&lt;br /&gt;
3. &lt;br /&gt;
  ssh &amp;lt;super_user&amp;gt;@gimel&lt;br /&gt;
&lt;br /&gt;
4. &lt;br /&gt;
  sudo -i&lt;br /&gt;
&lt;br /&gt;
5. &lt;br /&gt;
  su - www&lt;br /&gt;
&lt;br /&gt;
6. activate the production server -&amp;gt;&lt;br /&gt;
  cd /nfs/soft/www/apps/excipients/envs/production/&lt;br /&gt;
  source bin/activate&lt;br /&gt;
&lt;br /&gt;
7. copy the created dist folder into /nfs/soft/www/apps/excipients/envs/production/ &lt;br /&gt;
&lt;br /&gt;
8. Run -&amp;gt; &lt;br /&gt;
  pip install dist/CERSI-Excipients-X.Y.Z.tar.gz&lt;br /&gt;
&lt;br /&gt;
9. find the excipients screen -&amp;gt; &lt;br /&gt;
   screen -ls&lt;br /&gt;
&lt;br /&gt;
10. switch to the excipients screen -&amp;gt; &lt;br /&gt;
   screen -dR excip&lt;br /&gt;
&lt;br /&gt;
11. CTRL-C to kill the running server&lt;br /&gt;
&lt;br /&gt;
12. &amp;lt;UP&amp;gt; &amp;lt;ENTER&amp;gt; to rerun&lt;br /&gt;
   (this will start the gunicorn server: &lt;br /&gt;
    gunicorn --access-logfile excipients.ucsf.bkslab.org.acc --max-requests 1000 --bind 10.20.0.31:8093 excipients:app --workers 5)&lt;/div&gt;</summary>
		<author><name>Enkhjargal</name></author>
	</entry>
</feed>