OCS ePS results processing + AF-BLMs + test ADMs (28K/July 2022)

22/07/22

Added a group/loop over t-points to allow AFBLM calculation without thresholding, plus quick save routines (Pickle only - still need to debug other methods).

20/07/22

Updating from OCS_orbs8-11_AFBLMs_VM-ADMs_140122-JAKE_tidy-replot-280122.ipynb with LF results and updated ADMs.

Note LF all-orb plotting from http://jake/jupyter/user/paul/doc/tree/ePS/OCS/epsman2021/OCS_orb11_proc_131221-JAKE.ipynb - this was not previously uploaded!

For methods: https://epsproc.readthedocs.io/en/dev/demos/ePSproc_class_demo_161020.html

Additional notes below

Setup

In [1]:
# Quick hack to override default HTML template
# NOT required in some JLab versions.
# https://stackoverflow.com/a/63777508
# https://stackoverflow.com/questions/21971449/how-do-i-increase-the-cell-width-of-the-jupyter-ipython-notebook-in-my-browser
from IPython.core.display import display, HTML
display(HTML("<style>.container { width:100% !important; }</style>"))
In [2]:
!hostname
jake
In [3]:
!conda env list
# conda environments:
#
base                     /home/paul/anaconda3
automation               /home/paul/anaconda3/envs/automation
baseBK-250821            /home/paul/anaconda3/envs/baseBK-250821
baseBK-280321            /home/paul/anaconda3/envs/baseBK-280321
dataTests2022            /home/paul/anaconda3/envs/dataTests2022
ePSproc-v1.2             /home/paul/anaconda3/envs/ePSproc-v1.2
ePSproc-v1.2-dev         /home/paul/anaconda3/envs/ePSproc-v1.2-dev
epsTestConda             /home/paul/anaconda3/envs/epsTestConda
epsTestPip               /home/paul/anaconda3/envs/epsTestPip
epsTestPypi              /home/paul/anaconda3/envs/epsTestPypi
epsTestSetup             /home/paul/anaconda3/envs/epsTestSetup
epsdev                *  /home/paul/anaconda3/envs/epsdev
epsdev-040821            /home/paul/anaconda3/envs/epsdev-040821
epsdev-040821-YMLnb      /home/paul/anaconda3/envs/epsdev-040821-YMLnb
epsdev-040821-YMLnb-bk     /home/paul/anaconda3/envs/epsdev-040821-YMLnb-bk
epsdev-shared-100122     /home/paul/anaconda3/envs/epsdev-shared-100122
epsdev-xr13              /home/paul/anaconda3/envs/epsdev-xr13
epsdev-xr15              /home/paul/anaconda3/envs/epsdev-xr15
epsdev-xr17              /home/paul/anaconda3/envs/epsdev-xr17
epsdevTestYML            /home/paul/anaconda3/envs/epsdevTestYML
epsman                   /home/paul/anaconda3/envs/epsman
epsman-dev-shared-310122     /home/paul/anaconda3/envs/epsman-dev-shared-310122
fibre-sim                /home/paul/anaconda3/envs/fibre-sim
frog                     /home/paul/anaconda3/envs/frog
jbookTest                /home/paul/anaconda3/envs/jbookTest
jbookTestv2              /home/paul/anaconda3/envs/jbookTestv2
matlab                   /home/paul/anaconda3/envs/matlab
qe-mini-example          /home/paul/anaconda3/envs/qe-mini-example
tmo-dev                  /home/paul/anaconda3/envs/tmo-dev

In [4]:
import sys
import os
from pathlib import Path
import numpy as np
# import epsproc as ep
import xarray as xr

import matplotlib.pyplot as plt

from datetime import datetime as dt
timeString = dt.now()

import epsproc as ep

# Plotters
import hvplot.xarray
from epsproc.plot import hvPlotters

# Multijob class dev code
from epsproc.classes.multiJob import ePSmultiJob

plotWidth = 700
hvPlotters.setPlotters(width = plotWidth, snsStyle='whitegrid')
hvPlotters.opts.defaults(hvPlotters.opts.Curve(height=plotWidth))   # Fix plot height! Currently not set by default.
* sparse not found, sparse matrix forms not available. 
* natsort not found, some sorting functions not available. 
* Setting plotter defaults with epsproc.basicPlotters.setPlotters(). Run directly to modify, or change options in local env.
* Set Holoviews with bokeh.
* pyevtk not found, VTK export not available. 
* Set Holoviews with bokeh.
In [5]:
xr.__version__
Out[5]:
'0.19.0'
In [6]:
ep.__file__
Out[6]:
'/home/paul/github/ePSproc/epsproc/__init__.py'
In [7]:
# For class, above settings don't take, not sure why, something to do with namespaces/calling sequence?
# Overriding snsStyle does work however... although NOT CONSISTENTLY????
# AH, looks like ordering matters - set_style LAST (.set seems to override)
import seaborn as sns

sns.set(rc={'figure.figsize':(10,6)})  # Set figure size in inches
sns.set_context("paper")
sns.set_style("whitegrid")  # Set plot style
sns.set_palette("Paired")   # Set colour mapping

# Try direct fig type setting for PDF output figs
from IPython.display import set_matplotlib_formats
# set_matplotlib_formats('png', 'pdf')
set_matplotlib_formats('svg', 'pdf')
In [8]:
# xr.set_options(display_style='html')

Load data

In [9]:
import warnings
# warnings.filterwarnings('once')   # Skip repeated numpy deprecation warnings in current build (xr15 env)
warnings.filterwarnings('ignore')   # Skip repeated numpy deprecation warnings in current build (xr15 env)
In [10]:
# # Scan for subdirs, based on existing routine in getFiles()

fileBase = Path('/home/paul/ePS/OCS/OCS_survey')  # Data dir on Stimpy
In [11]:
# TODO: fix orb label here, currently relies on (different) fixed format

data = ePSmultiJob(fileBase, verbose = 0)

data.scanFiles()
data.jobsSummary()
*** Warning: Missing records, expected 64, found 48.
*** Warning: Found 16 blank sets of matrix elements, symmetries ['A2']
*** Warning: Missing records, expected 64, found 48.
*** Warning: Found 16 blank sets of matrix elements, symmetries ['A2']
*** Warning: Missing records, expected 64, found 48.
*** Warning: Found 16 blank sets of matrix elements, symmetries ['A2']
*** Warning: Missing records, expected 64, found 48.
*** Warning: Found 16 blank sets of matrix elements, symmetries ['A2']
Found 4 directories, with 8 files.

*** Job orb11 details
Key: orb11
Dir /home/paul/ePS/OCS/OCS_survey/orb11, 2 file(s).
{   'batch': 'ePS OCS, batch OCS_survey, orbital orb11',
    'event': ' orb 11 ionization, basic survey run.',
    'orbE': -11.35803261907539,
    'orbLabel': '# OCS, orb 11 ionization, basic survey run.'}

*** Job orb9 details
Key: orb9
Dir /home/paul/ePS/OCS/OCS_survey/orb9, 2 file(s).
{   'batch': 'ePS OCS, batch OCS_survey, orbital orb9',
    'event': 'orb 9 (E1/P) ionization, basic survey run.',
    'orbE': -18.34863774566971,
    'orbLabel': 'E1/P'}

*** Job orb10 details
Key: orb10
Dir /home/paul/ePS/OCS/OCS_survey/orb10, 2 file(s).
{   'batch': 'ePS OCS, batch OCS_survey, orbital orb10',
    'event': 'orb 10 (A1/S) ionization, basic survey run.',
    'orbE': -17.32548962282056,
    'orbLabel': 'A1/S'}

*** Job orb8 details
Key: orb8
Dir /home/paul/ePS/OCS/OCS_survey/orb8, 2 file(s).
{   'batch': 'ePS OCS, batch OCS_survey, orbital orb8',
    'event': 'orb 8 (A1/S) ionization, basic survey run.',
    'orbE': -21.51332196607811,
    'orbLabel': 'A1/S'}
In [12]:
# Fix labels for plots - currently have some mis-named files!

# Set state labels
v = ['C','B','A','X']
sDict = {n:v[k] for k,n in enumerate(range(9,13))}

for key in data.data.keys():
    data.data[key]['jobNotes']['orbKey'] = key  # Existing orb key, from filename
    data.data[key]['jobNotes']['orbGroup'] = int(key.strip('orb')) + 1   # Corrected orb group
#     data.data[key]['jobNotes']['orbLabel'] = f"HOMO - {12 - int(key.strip('orb')) - 1}"
    data.data[key]['jobNotes']['orbLabel'] = sDict[data.data[key]['jobNotes']['orbGroup']]
    
#     print(data.data[key]['jobNotes']['orbLabel'])

System properties

Note orbital numbering in table below:

  • Orb is the Gamess output orbital numbering, ordered by energy but not grouped by degeneracy.
  • OrbGrp is the numbering grouped by degeneracy, as also used by ePolyScat, and is used for labels etc. below.

BUT - the file names are 0-indexed (oops), so they correspond to OrbGrp-1 here. Sorry.

In [13]:
data.molSummary()
*** Molecular structure
[Matplotlib figure: molecular structure plot]
*** Molecular orbital list (from ePS output file)
EH = Energy (Hartrees), E = Energy (eV), NOrbGrp, OrbGrp, GrpDegen = degeneracies and corresponding orbital numbering by group in ePS, NormInt = single centre expansion convergence (should be ~1.0).
orb  Sym        EH  Occ             E  NOrbGrp  OrbGrp  GrpDegen   NormInt
  1    S  -91.9901  2.0  -2503.178142      1.0     1.0       1.0  0.660569
  2    S  -20.6748  2.0   -562.589968      1.0     2.0       1.0  0.963189
  3    S  -11.4451  2.0   -311.437037      1.0     3.0       1.0  0.999998
  4    S   -8.9936  2.0   -244.728323      1.0     4.0       1.0  0.951187
  5    S   -6.6764  2.0   -181.674099      1.0     5.0       1.0  0.993503
  6    P   -6.6721  2.0   -181.557090      1.0     6.0       2.0  0.978550
  7    P   -6.6721  2.0   -181.557090      2.0     6.0       2.0  0.978550
  8    S   -1.5373  2.0    -41.832064      1.0     7.0       1.0  0.998241
  9    S   -1.0876  2.0    -29.595104      1.0     8.0       1.0  0.997329
 10    S   -0.7906  2.0    -21.513322      1.0     9.0       1.0  0.999269
 11    P   -0.6743  2.0    -18.348638      1.0    10.0       2.0  0.999887
 12    P   -0.6743  2.0    -18.348638      2.0    10.0       2.0  0.999887
 13    S   -0.6367  2.0    -17.325490      1.0    11.0       1.0  0.998566
 14    P   -0.4174  2.0    -11.358033      1.0    12.0       2.0  0.998839
 15    P   -0.4174  2.0    -11.358033      2.0    12.0       2.0  0.998839
*** Warning: some orbital convergences outside single-center expansion convergence tolerance (0.01):
[[1.         0.66056934]
 [2.         0.96318872]
 [4.         0.95118653]
 [6.         0.9785498 ]
 [7.         0.9785498 ]]

Plot cross-sections and betas

These are from ePolyScat's getCro function, and are LF (unaligned ensemble) results. This provides a good, if general, overview.

Cross-sections (all symmetries, length gauge)

In [14]:
# NEED TO SET AGAIN AFTER CLASS IMPORT!
# import warnings
# warnings.filterwarnings('once')   # Skip repeated numpy deprecation warnings in current build (xr15 env)
# warnings.filterwarnings('ignore')   # Skip repeated numpy deprecation warnings in current build (xr15 env)
In [15]:
# Comparative plot over datasets (all symmetries only)

Etype = 'Eke'  # Set for Eke or Ehv energy scale
pGauge = 'L'
Erange=[5, 20]  # Plot range (full range if not passed to function below)

# data.plotGetCroComp(pType='SIGMA', Etype = Etype, Erange = Erange, backend = 'hv')
data.plotGetCroComp(pType='SIGMA', pGauge = pGauge, Etype = Etype, Erange = Erange)
[Matplotlib figure: cross-section comparison plot]

LF-$\beta_{LM}$ (all symmetries, length gauge)

In [16]:
# Comparative plot over datasets (all symmetries only)
data.plotGetCroComp(pType='BETA', Etype=Etype, Erange = Erange)
[Matplotlib figure: LF-beta comparison plot]
In [17]:
# Possibly may want to replot with hv backend - needs some work still. Cf. XeF2 plotting
# data.plotGetCro(selDims={'Sym':'All'})    #, 'Type':'L'})  #, backend='hv')
# data.plotGetCroComp(pType='BETA', Etype=Etype, Erange = Erange, backend='hv')

AF-$\beta_{LM}$

20/07/22 PH

Revisiting with updated ADMs from Varun's fitting.

14/01/22 PH

With additional realistic ADMs from VM.

13/12/21 PH

Revisiting AF cases with (hopefully) better match to experimental alignment.

  • Add in cos^2(theta) metric.
  • Ballpark ADMs assuming Gaussian case.
  • Calculate & plot AF-BLMs for these cases.

Alignment metrics

Here we'll start with the molecular alignment defined in terms of axis distribution moments (ADMs), given by parameters $A^K_{Q,S}(t)$:

$ P(\Omega,t) = \sum_{K,Q,S} A^K_{Q,S}(t)D^K_{Q,S}(\Omega)$

Where $P(\Omega,t)$ is the full axis distribution probability, expanded for Euler angles $\Omega$.

This reduces to the 2D case if $S=0$:

$ P(\theta,\phi,t) = \sum_{K,Q} A^K_{Q,0}(t)D^K_{Q,0}(\Omega) = \sum_{K,Q} A^K_{Q}(t)Y_{K,Q}(\theta,\phi)$

In the current code (v1.3.0, 16/08/21) the ADMs can be set directly and expanded trivially for the 2D case; the 3D case is supported for AF-BLM calculations, but not yet for direct expansion & visualisation.

Expectation values $\langle\cos^n\theta\rangle(t)$ can be calculated directly from the $P(\theta,t)$ distributions (including the $\sin\theta$ Jacobian and normalisation, as implemented in computeCNs() below):

$\langle\cos^n\theta\rangle(t) = \frac{\int_{\theta} \cos^n(\theta)\,P(\theta,t)\sin\theta\,d\theta}{\int_{\theta} P(\theta,t)\sin\theta\,d\theta}$

See, e.g.:

[1] P. Hockett, "General phenomenology of ionization from aligned molecular ensembles," New Journal of Physics, vol. 17, no. 2, p. 023069, Feb. 2015, doi: 10.1088/1367-2630/17/2/023069.
[2] K. L. Reid, "Accessing the molecular frame through strong-field alignment of distributions of gas phase molecules," Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, vol. 376, no. 2115, p. 20170158, 2018, doi: 10.1098/rsta.2017.0158.
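
As a quick numerical illustration of the formulas above, the 2D ($Q=S=0$) distribution can be expanded on a $\theta$ grid using Legendre polynomials (since $Y_{K,0}\propto P_K(\cos\theta)$), and $\langle\cos^n\theta\rangle$ evaluated by direct integration. This is a minimal sketch only, with made-up $A_{K,0}$ values and a simple normalisation choice; it does not attempt to match the ADM conventions used in the main code below.

import numpy as np
from scipy.special import eval_legendre

# Illustrative (made-up) ADMs for K = 0, 2, 4.
Aillustrative = {0: 1.0, 2: 0.4, 4: 0.1}

theta = np.linspace(0, np.pi, 200)

# Expand P(theta) = sum_K A_K0 * (2K+1)/2 * P_K(cos(theta)),
# normalised so the K=0 term alone integrates to 1 over sin(theta) d(theta).
Ptheta = sum(A * (2*K + 1)/2 * eval_legendre(K, np.cos(theta)) for K, A in Aillustrative.items())

# <cos^n(theta)> by direct integration, including the sin(theta) Jacobian.
def cosN(Ptheta, theta, n=2):
    norm = np.trapz(Ptheta * np.sin(theta), theta)
    return np.trapz(np.cos(theta)**n * Ptheta * np.sin(theta), theta) / norm

print(f"<cos^2> = {cosN(Ptheta, theta, 2):.3f}")   # 0.600 for these test values (1/3 + 2/3*A_20 under this normalisation)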

Set test ADMs (from file)

In [18]:
# Load test ADMs from .mat files.
# Code adapted from N2 case, https://epsproc.readthedocs.io/en/latest/methods/geometric_method_dev_pt3_AFBLM_090620_010920_dev_bk100920.html#Test-compared-to-experimental-N2-AF-results...
#
# To check individual file contents just use `loadmat(file)`
# 

ADMtype = 'dat'
ADMscaleFactor = 1  # Try quick SF to circumvent thresholding issues in current code - should be renormed out in final Blms
                     # Tested for 100 - still some weird stuff happening, although very different weird stuff!
                     # 10 maybe a bit better? Or 2...?
                     # May also be issue with complex form for ADMs...? Testing setting real part only later

from scipy.io import loadmat

if ADMtype == 'mat':

    # Original ADMs 14/01/22 - matlab files
    ADMdataDir = Path('~/ePS/OCS/OCS_alignment_ADMs_VM_140122')
    renorm = True  # Additional renormalisation by (2*K+1)/8*pi^2
    
else:
    # Updated ADMs from dat files
    # Note these are also Matlab hdf5, but different var labels.
    ADMdataDir = Path('~/ePS/OCS/OCS_ADMs_28K_VM_070722')
    renorm = False  # Additional renormalisation by (2*K+1)/8*pi^2
    
fList = ep.getFiles(fileBase =  ADMdataDir.expanduser(), fType='.'+ADMtype)

ADMs = []
ADMLabels = []

for f in fList:
    
    if ADMtype == 'mat':
        item = Path(f).name.rstrip('ocs.mat')  # Get & set variable name (note rstrip() strips a character set, not a literal suffix - works for these filenames, but fragile)

    else:
        item = Path(f).name.split('_')[0]  # Get & set variable name
    
        
    if item == 'time':
        if ADMtype == 'mat':
            item = 'timeocs'
            
        t = loadmat(f)[item][0,:]
        
    elif item == 'c2t':
        c2t = loadmat(f)[item][:,0]
        
    else:
        # Should use re here (see the sketch after this cell)!
        if ADMtype == 'mat':
            K = int(item.strip('D'))
            item = item + 'ave'
        else:
#             try:
            K = int(item.strip('A')[0])
#             except:
#                 pass   # For now just skip cos^2 t term
            

#         ADMs.append(np.asarray([K,0,0,loadmat(f)[item][:,0]]))
#         ADMs.append([K,0,0,loadmat(f)[item][:,0]])
#         ADMs.append(loadmat(f)[item][:,0])
        ADMs.append(loadmat(f)[item][:,0]*ADMscaleFactor)
        ADMLabels.append([K,0,0])
        
            
            
# Add K = 0 (population) term?
# NOTE - may need additional renorm?
addPop = True
if addPop:
    if renorm:
        ADMs.append(np.ones(t.size) * np.sqrt(4*np.pi))
    else:
#         ADMs.append(np.ones(t.size))  # * np.sqrt(4*np.pi))
        ADMs.append(np.ones(t.size) * 1/(8*np.pi**2))
        
    ADMLabels.append([0,0,0])
        
ADMLabels = np.array(ADMLabels)
*** Scanning dir
/home/paul/ePS/OCS/OCS_ADMs_28K_VM_070722
Found 5 .dat file(s)
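
As flagged in the comment in the loop above, the K-label parsing could be made more robust with a regular expression instead of string stripping. A hedged sketch, assuming variable names of the form 'D2' (mat case) or 'A200...' (dat case), where the digit(s) following the leading letter encode K; for the dat case only the first digit is taken, mirroring the strip()[0] logic above. The parseK() helper is illustrative only, not part of the package.

import re

def parseK(item, ADMtype='dat'):
    """Extract K from an ADM variable name (illustrative helper, mirrors the strip() logic above)."""
    if ADMtype == 'mat':
        match = re.match(r'D(\d+)', item)    # e.g. 'D2' -> K = 2
    else:
        match = re.match(r'A(\d)', item)     # e.g. 'A200' -> K = 2 (first digit only)
    return int(match.group(1)) if match else None

print([parseK(f'A{K}00') for K in (2, 4, 6)])    # [2, 4, 6]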

In [19]:
1/(8*np.pi**2)
Out[19]:
0.012665147955292222
In [20]:
# Test files

# fList
# loadmat(fList[3])['c2t'][:,0]
In [21]:
# Set ADMs for increasing alignment...
# tPoints = 10
# inputADMs = np.asarray([[0,0,0, *np.ones(10)], [2,0,0, *np.linspace(0,1,tPoints)], [4,0,0, *np.linspace(0,0.5,tPoints)]])
# inputADMs = [[0,0,0, *np.ones(10) * np.sqrt(4*np.pi)], 
#              [2,0,0, *np.linspace(1,2,tPoints)], 
#              [4,0,0, *np.linspace(0.3,2.8,tPoints)],
#              [6,0,0, *np.linspace(0,3,tPoints)],
#              [8,0,0, *np.linspace(0,3,tPoints)]]

# Ballpark ADMs from Fig. 1 in [2]
# inputADMs = [[0,0,0, *np.ones(10) * np.sqrt(4*np.pi)], 
#              [2,0,0, *np.linspace(0.3,2.2,tPoints)], 
#              [4,0,0, *np.linspace(0,1.5,tPoints)],
#              [6,0,0, *np.linspace(0,1.5,tPoints)],
#              [8,0,0, *np.linspace(0,1.5,tPoints)]]


# warnings.filterwarnings('ignore')   # Skip repeated numpy deprecation warnings in current build (xr15 env)

# For manual sets
# ADMs = ep.setADMs(ADMs = inputADMs)  # TODO WRAP TO CLASS (IF NOT ALREADY!)

# For ADMs from VM - set to Xarray using setADMs() routine
ADMs = ep.setADMs(ADMs = ADMs, KQSLabels = ADMLabels, t=t)  # TODO WRAP TO CLASS (IF NOT ALREADY!)
attrCpy = ADMs.attrs

if renorm:
    ADMs = ADMs * (2*ADMs.K+1)/(8*np.pi**2)  # Additional renormalisation by (2*K+1)/8*pi^2

# Set to separate dataset for testing
ADMs.attrs = attrCpy  # Propagate attrs
ADMs.attrs['jobLabel'] = 'Alignment data testing'
data.data['ADM'] = {'ADM': ADMs}

ADMs
# ep.lmPlot(ADMs, xDim='t');
Out[21]:
<xarray.DataArray 'ADM' (ADM: 4, t: 1775)>
array([[-5.80459547e-18+0.00000000e+00j,  2.63378028e-06-1.25495050e-20j,
         1.37709790e-05-9.19819503e-21j, ...,
         1.12561226e-03+4.84574136e-20j,  1.12714584e-03+1.11353461e-19j,
         1.12868145e-03+1.53291402e-19j],
       [-1.29688712e-16+0.00000000e+00j,  4.00254508e-07+2.92449521e-20j,
        -6.00749358e-07-1.98638089e-19j, ...,
         1.28722813e-03-6.25514671e-19j,  1.28846006e-03-7.25436870e-19j,
         1.28969079e-03-1.59287749e-18j],
       [ 8.04681155e-16+0.00000000e+00j, -4.08078122e-07+4.30320074e-19j,
         5.67772384e-07+9.96767522e-19j, ...,
         1.06504993e-04+2.81117260e-18j,  1.06639451e-04+2.19612106e-18j,
         1.06773647e-04+5.01793120e-18j],
       [ 1.26651480e-02+0.00000000e+00j,  1.26651480e-02+0.00000000e+00j,
         1.26651480e-02+0.00000000e+00j, ...,
         1.26651480e-02+0.00000000e+00j,  1.26651480e-02+0.00000000e+00j,
         1.26651480e-02+0.00000000e+00j]])
Coordinates:
  * ADM      (ADM) MultiIndex
  - K        (ADM) int64 2 4 6 0
  - Q        (ADM) int64 0 0 0 0
  - S        (ADM) int64 0 0 0 0
  * t        (t) float64 0.0 0.02917 0.05833 0.0875 ... 85.03 85.03 85.03 85.03
Attributes:
    dataType:   ADM
    long_name:  Axis distribution moments
    units:      arb
    jobLabel:   Alignment data testing
In [22]:
ADMs.unstack().squeeze().real.hvplot.line(x='t').overlay('K')
# ADMs.unstack().squeeze().pipe(np.abs).hvplot.line(x='t').overlay('K')
Out[22]:
In [23]:
# Selection & downsampling - adapted from https://epsproc.readthedocs.io/en/latest/methods/geometric_method_dev_pt3_AFBLM_090620_010920_dev_bk100920.html#Test-compared-to-experimental-N2-AF-results...
# See PEMtk for updated routines, https://pemtk.readthedocs.io/en/latest/fitting/PEMtk_fitting_basic_demo_030621-full.html#Subselect-data
# See Xarray docs for basics https://xarray.pydata.org/en/stable/user-guide/indexing.html#indexing-with-dimension-names

trange=[38, 44]  # Set range in ps for calc
tStep=2  # Set tStep for downsampling

# SLICE version - was working previously, but not as of July 2022 - not sure if it's a data type or Xarray version issue? Just getting KeyErrors on slice (see the index check after this cell).
# data.data['ADM'] = {'ADM': ADMs.sel(t=slice(trange[0],trange[1], tStep))}   # Set and update

# Inds/mask version - seems more robust?
tMask = (ADMs.t>trange[0]) & (ADMs.t<trange[1])
# ind = np.nonzero(tMask)  #[0::tStep]
# At = ADMs['time'][:,ind].squeeze()
# ADMin = ADMs['ADM'][:,ind]

data.data['ADM'] = {'ADM': ADMs[:,tMask][:,::tStep]}   # Set and update


print(f"Selecting {data.data['ADM']['ADM'].t.size} points")

# plt.plot(At, np.real(ADMin.T))
# plt.legend(ADMs['ADMlist'])
# plt.xlabel('t/ps')
# plt.ylabel('ADM')
# plt.show()

# # Set in Xarray
# ADMX = ep.setADMs(ADMs = ADMs['ADM'][:,ind], t=At, KQSLabels = ADMs['ADMlist'], addS = True)
# # ADMX

data.data['ADM']['ADM'].unstack().squeeze().real.hvplot.line(x='t').overlay('K')
Selecting 61 points
Out[23]:
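
One possible cause of the slice KeyErrors noted above (a guess, not verified here): label-based selection with .sel(t=slice(...)) requires a monotonic index, and the t coordinate repr above shows repeated values near the end of the scan (85.03 appears several times), so the index may also be non-unique. A quick check using standard pandas/xarray index properties:

# Index sanity check (illustrative) - the boolean-mask route used above sidesteps any index issues.
tIndex = ADMs.indexes['t']
print(f"t monotonic: {tIndex.is_monotonic_increasing}, unique: {tIndex.is_unique}")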
In [24]:
# Check imag values - not sure if these are causing issues later?
data.data['ADM']['ADM'].unstack().squeeze().imag.hvplot.line(x='t').overlay('K')
Out[24]:
In [25]:
data.data['ADM']['ADM']
Out[25]:
<xarray.DataArray 'ADM' (ADM: 4, t: 61)>
array([[ 2.17231103e-03-8.58189774e-20j,  2.07978422e-03+4.30494130e-20j,
         1.96760536e-03-1.77957119e-19j,  1.83206183e-03-3.17304358e-21j,
         1.66967682e-03-3.61388138e-20j,  1.47780422e-03+1.14371551e-19j,
         1.25544314e-03-5.35693112e-20j,  1.00426399e-03-2.38663075e-19j,
         7.29799307e-04+2.72356306e-20j,  4.42703495e-04-1.16930710e-19j,
         1.59930576e-04+3.81512327e-20j, -9.43732733e-05+1.20489368e-19j,
        -2.88505883e-04+1.86556764e-21j, -3.83632485e-04+2.04246439e-19j,
        -3.35232897e-04-2.67746008e-20j, -9.57405048e-05+2.86520802e-19j,
         3.81578145e-04+1.46229253e-20j,  1.13761982e-03-1.20142991e-19j,
         2.20163150e-03-6.33436470e-20j,  3.58510751e-03-7.41908781e-22j,
         5.27609701e-03+9.12233667e-21j,  7.23474082e-03+5.97164398e-20j,
         9.39087819e-03-2.95049386e-21j,  1.16444637e-02-1.16939660e-19j,
         1.38693012e-02+1.86103282e-21j,  1.59202492e-02+7.71798557e-21j,
         1.76436113e-02-1.61868264e-20j,  1.88899586e-02-2.03205660e-19j,
         1.95281997e-02+1.18662764e-19j,  1.94594002e-02-6.69460708e-20j,
         1.86287137e-02-2.60630926e-19j,  1.70338647e-02-1.67715542e-19j,
         1.47289332e-02-2.89975202e-19j,  1.18226921e-02-9.88259219e-20j,
         8.47139770e-03+8.64559987e-20j,  4.86661931e-03+1.60492731e-19j,
         1.21933519e-03-7.73481448e-20j, -2.25799150e-03+3.75099950e-20j,
        -5.36938160e-03-2.49832489e-19j, -7.95138129e-03+2.26525887e-19j,
...
         1.26651480e-02+0.00000000e+00j,  1.26651480e-02+0.00000000e+00j,
         1.26651480e-02+0.00000000e+00j,  1.26651480e-02+0.00000000e+00j,
         1.26651480e-02+0.00000000e+00j,  1.26651480e-02+0.00000000e+00j,
         1.26651480e-02+0.00000000e+00j,  1.26651480e-02+0.00000000e+00j,
         1.26651480e-02+0.00000000e+00j,  1.26651480e-02+0.00000000e+00j,
         1.26651480e-02+0.00000000e+00j,  1.26651480e-02+0.00000000e+00j,
         1.26651480e-02+0.00000000e+00j,  1.26651480e-02+0.00000000e+00j,
         1.26651480e-02+0.00000000e+00j,  1.26651480e-02+0.00000000e+00j,
         1.26651480e-02+0.00000000e+00j,  1.26651480e-02+0.00000000e+00j,
         1.26651480e-02+0.00000000e+00j,  1.26651480e-02+0.00000000e+00j,
         1.26651480e-02+0.00000000e+00j,  1.26651480e-02+0.00000000e+00j,
         1.26651480e-02+0.00000000e+00j,  1.26651480e-02+0.00000000e+00j,
         1.26651480e-02+0.00000000e+00j,  1.26651480e-02+0.00000000e+00j,
         1.26651480e-02+0.00000000e+00j,  1.26651480e-02+0.00000000e+00j,
         1.26651480e-02+0.00000000e+00j,  1.26651480e-02+0.00000000e+00j,
         1.26651480e-02+0.00000000e+00j,  1.26651480e-02+0.00000000e+00j,
         1.26651480e-02+0.00000000e+00j,  1.26651480e-02+0.00000000e+00j,
         1.26651480e-02+0.00000000e+00j,  1.26651480e-02+0.00000000e+00j,
         1.26651480e-02+0.00000000e+00j,  1.26651480e-02+0.00000000e+00j,
         1.26651480e-02+0.00000000e+00j]])
Coordinates:
  * ADM      (ADM) MultiIndex
  - K        (ADM) int64 2 4 6 0
  - Q        (ADM) int64 0 0 0 0
  - S        (ADM) int64 0 0 0 0
  * t        (t) float64 38.01 38.11 38.21 38.31 ... 43.67 43.77 43.87 43.97
Attributes:
    dataType:   ADM
    long_name:  Axis distribution moments
    units:      arb
    jobLabel:   Alignment data testing

Compute $P(\theta)$ distributions + metrics

In [26]:
# Axis distributions P(theta) vs. t (summed over phi)
# Pt = data.padPlot(keys = 'ADM', dataType='ADM', Etype = 't', pStyle='grid', reducePhi='sum', returnFlag = True)  #, facetDims=['t', 'Eke'])
Pt = data.padPlot(keys = 'ADM', dataType='ADM', Etype = 't', pType='r', pStyle='grid', reducePhi='sum', returnFlag = True)  #, facetDims=['t', 'Eke'])  # Real plot to check values OK.

# 22/07/22 - NOTICE SOME -ve VALUES HERE WITH Re plot and updated ADMs. Issue with renorm somewhere still?
Using default sph betas.
Summing over dims: set()
Plotting from self.data[ADM][ADM], facetDims=['Labels', 't'], pType=r with backend=mpl.
Grid plot: Alignment data testing, dataType: ADM, plotType: r
Set plot to self.data['ADM']['plots']['ADM']['grid']
[Matplotlib figure: P(theta,t) grid plot]
In [27]:
# The returned data array contains the plotted values
Pt = data.data['ADM']['plots']['ADM']['pData']
Pt
Out[27]:
<xarray.DataArray (t: 61, Theta: 50)>
array([[0.36062076, 0.35777445, 0.34939939, ..., 0.34939939, 0.35777445,
        0.36062076],
       [0.35829998, 0.35546011, 0.34710468, ..., 0.34710468, 0.35546011,
        0.35829998],
       [0.35530345, 0.35247586, 0.34415727, ..., 0.34415727, 0.35247586,
        0.35530345],
       ...,
       [0.47053593, 0.46661976, 0.45508433, ..., 0.45508433, 0.46661976,
        0.47053593],
       [0.46108648, 0.45734347, 0.44631444, ..., 0.44631444, 0.45734347,
        0.46108648],
       [0.45025753, 0.44668689, 0.43616281, ..., 0.43616281, 0.44668689,
        0.45025753]])
Coordinates:
  * t        (t) float64 38.01 38.11 38.21 38.31 ... 43.67 43.77 43.87 43.97
  * Theta    (Theta) float64 0.0 0.06411 0.1282 0.1923 ... 3.013 3.077 3.142
Attributes:
    dataType:      ADM
    long_name:     Axis distribution moments
    units:         arb
    jobLabel:      Alignment data testing
    pType:         r
    pTypeDetails:  {'Type': 'real', 'Formula': 'np.real(dataPlot)', 'Exec': <...
In [28]:
# Check no negative values...
print(Pt.min() > 0)
<xarray.DataArray ()>
array(False)
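
Since the check above returns False, a quick follow-up to quantify how negative the distribution gets, and where, using standard xarray reductions (illustrative only):

# Quantify the negative values flagged above.
negFraction = float((Pt < 0).mean())             # Fraction of (t, Theta) grid points below zero
worstVal = float(Pt.min())                       # Most negative value
worstT = float(Pt.min('Theta').idxmin('t'))      # t-point with the most negative P(theta)
print(f"{negFraction:.1%} of points negative; min = {worstVal:.3g} at t = {worstT:.2f} ps")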
In [29]:
def computeCNs(Pt, N = None, Nrange = [0,10], Nstep = 2):
    """
    Function to compute <cos^N> expectation values from P(theta) distributions.
    
    NOTE: currently assumes 1D case. Should generalise!
    
    """
    
    if N is None:
        N = np.arange(Nrange[0], Nrange[1], Nstep)
    
    # Generate cos^N basis
    #  cn = xr.ones_like(Pt) * (np.cos(Pt.Theta)**N)  # Use multiplication to keep t coord - this fails for array-like N with broadcast errors
    cn = xr.ones_like(Pt) * np.cos(Pt.Theta).expand_dims({'N':N}).T.pipe(np.power,N)  # OK with transpose
    
    Jt = np.sin(Pt.Theta) # Jacobian
    
#     norm = (Pt).sum('Theta')   # * np.sqrt(2)
    norm = (Pt * Jt).sum('Theta') # * (4*np.pi)   # * np.sqrt(2)
    print(norm)
    cnMetric = (Pt * cn * Jt).sum('Theta')/norm
    
#     cnMetric = Pt.dot(cn, dims='Theta') # Multiply and sum over Theta - dot version (no Jacobian or renorm)
#     cnMetric = (Pt.dot(cn * Jt, dims='Theta')/Pt.dot(Jt, dims='Theta'))  # Dot version with renorm
    
    return cnMetric
In [30]:
# Compute and plot for a range of N
PtNorm = Pt  #/Pt.sum('Theta')  # Norm integral to 1 - now set in function
cnMetric = computeCNs(PtNorm)

# cnMetric.plot()  # Basic plot
# sf = np.sqrt(2)  # Seem to need additional sqrt(2) normalisation? Probably SPH definition and/or integration domain and/or distribution generation params.
sf = 1
ep.lmPlot(cnMetric*sf, xDim='t');  # With lmPlot()
<xarray.DataArray (t: 61)>
array([5.568662  , 5.56868681, 5.56871884, 5.56876007, 5.56881259,
       5.56887845, 5.56895934, 5.56905616, 5.56916847, 5.56929386,
       5.56942735, 5.56956079, 5.5696826 , 5.56977789, 5.56982919,
       5.56981782, 5.56972609, 5.56953975, 5.56925011, 5.56885473,
       5.56835557, 5.56775466, 5.56704904, 5.5662276 , 5.56527185,
       5.56416181, 5.56289341, 5.56152295, 5.56024395, 5.5594455 ,
       5.55963649, 5.56116599, 5.56388424, 5.56707325, 5.56982663,
       5.57161107, 5.57250366, 5.5729224 , 5.57319494, 5.57339912,
       5.57350965, 5.5735587 , 5.57361863, 5.57369362, 5.5736859 ,
       5.57346851, 5.57297656, 5.57224017, 5.57135657, 5.57044089,
       5.56958914, 5.56886279, 5.56828987, 5.56787328, 5.56760012,
       5.56744916, 5.56739592, 5.56741585, 5.56748625, 5.56758742,
       5.56770332])
Coordinates:
  * t        (t) float64 38.01 38.11 38.21 38.31 ... 43.67 43.77 43.87 43.97
Set dataType (No dataType)
Plotting data (No filename), pType=a, thres=0.01, with Seaborn
No handles with labels found to put in legend.
[Matplotlib figure: lmPlot of cos^N metrics vs. t]
In [31]:
# Seem to need additional sqrt(2) normalisation? Probably SPH definition?
# (cnMetric*np.sqrt(2)).sel(N=2).plot()
# (cnMetric*np.sqrt(2)).plot.line(x='t')

# Testing exact values
# sf = np.sqrt(2)
# sf = 2
# (cnMetric*sf).plot.line(x='t')
# (ADMs).plot.line(x='t')

(cnMetric*sf).fillna(0).hvplot.line(x='t').overlay('N')
Out[31]:
In [32]:
cnMetric
Out[32]:
<xarray.DataArray (t: 61, N: 5)>
array([[ 1.        ,  0.38404376,  0.25878646,  0.20006142,  0.16449936],
       [ 1.        ,  0.38186689,  0.25701095,  0.19862913,  0.16331169],
       [ 1.        ,  0.37922792,  0.25483726,  0.19686284,  0.16183891],
       [ 1.        ,  0.37603962,  0.25218211,  0.19468846,  0.16001532],
       [ 1.        ,  0.37222036,  0.24896411,  0.19203196,  0.15777443],
       [ 1.        ,  0.36770808,  0.24511575,  0.18882933,  0.1550573 ],
       [ 1.        ,  0.36247946,  0.24060019,  0.18504075,  0.15182469],
       [ 1.        ,  0.35657402,  0.23543282,  0.18066905,  0.14807307],
       [ 1.        ,  0.35012211,  0.22970663,  0.17578171,  0.14385382],
       [ 1.        ,  0.3433745 ,  0.2236194 ,  0.17053471,  0.13929412],
       [ 1.        ,  0.33672997,  0.21749955,  0.16519529,  0.13461727],
       [ 1.        ,  0.33075628,  0.21182605,  0.16016   ,  0.13015879],
       [ 1.        ,  0.32619864,  0.20723722,  0.15596284,  0.12637394],
       [ 1.        ,  0.32396976,  0.20452209,  0.15326815,  0.12383176],
       [ 1.        ,  0.32511589,  0.20458888,  0.15284298,  0.12319079],
       [ 1.        ,  0.33075498,  0.2084068 ,  0.1555054 ,  0.12515314],
       [ 1.        ,  0.34198585,  0.21692138,  0.16204954,  0.13039806],
       [ 1.        ,  0.35977085,  0.23094973,  0.17315486,  0.13950222],
       [ 1.        ,  0.38479947,  0.25107037,  0.18929594,  0.15286353],
       [ 1.        ,  0.41734474,  0.27752885,  0.21067706,  0.17065228],
...
       [ 1.        ,  0.07207567, -0.00959752, -0.02497041, -0.02759819],
       [ 1.        ,  0.06043892, -0.019896  , -0.03335812, -0.03444289],
       [ 1.        ,  0.06493269, -0.01910284, -0.03406416, -0.03573482],
       [ 1.        ,  0.0835703 , -0.00674792, -0.02565118, -0.0296866 ],
       [ 1.        ,  0.1135986 ,  0.01650425, -0.00766989, -0.01528345],
       [ 1.        ,  0.15182188,  0.04870846,  0.01881946,  0.0069644 ],
       [ 1.        ,  0.1949197 ,  0.08695328,  0.0514194 ,  0.03506698],
       [ 1.        ,  0.23972617,  0.12792464,  0.0870344 ,  0.06619541],
       [ 1.        ,  0.28345454,  0.16847621,  0.12259344,  0.09746027],
       [ 1.        ,  0.32385643,  0.20604051,  0.1555766 ,  0.12648277],
       [ 1.        ,  0.35930975,  0.23882561,  0.18425666,  0.15165116],
       [ 1.        ,  0.38883605,  0.26582806,  0.20770601,  0.17212302],
       [ 1.        ,  0.41205637,  0.28673012,  0.22566893,  0.18768903],
       [ 1.        ,  0.42910049,  0.30174593,  0.23838711,  0.19859508],
       [ 1.        ,  0.44048802,  0.31146021,  0.24643027,  0.20537663],
       [ 1.        ,  0.44699935,  0.31668144,  0.25055332,  0.20872488],
       [ 1.        ,  0.44955195,  0.31831833,  0.25158393,  0.20938577],
       [ 1.        ,  0.44909362,  0.3172821 ,  0.25033805,  0.20808658],
       [ 1.        ,  0.44651941,  0.31441481,  0.24756009,  0.20548522],
       [ 1.        ,  0.44261485,  0.31044264,  0.24388509,  0.20213884]])
Coordinates:
  * t        (t) float64 38.01 38.11 38.21 38.31 ... 43.67 43.77 43.87 43.97
  * N        (N) int64 0 2 4 6 8
In [33]:
1/np.sqrt(8*np.pi**2)
Out[33]:
0.11253953951963826
In [34]:
np.sqrt(np.pi)
Out[34]:
1.7724538509055159
In [35]:
1/np.sqrt(2)
Out[35]:
0.7071067811865475
In [36]:
# Plot ref cos^2 to check
# CURRENTLY NOT MATCHING REF VALUES - not sure if it's ADM renorm issue, or cos^2 calculator issue!
c2tXR = xr.DataArray(c2t, coords=[("t", t)])
c2tXR.name = 'cos2'

c2tXR.real.hvplot.line(x='t')
Out[36]:
In [37]:
c2tXR
Out[37]:
<xarray.DataArray 'cos2' (t: 1775)>
array([0.33333333+0.00000000e+00j, 0.33336106-1.32115893e-19j,
       0.33347831-9.68347159e-20j, ..., 0.3451833 +5.10139203e-19j,
       0.34519945+1.17228225e-18j, 0.34521562+1.61378720e-18j])
Coordinates:
  * t        (t) float64 0.0 0.02917 0.05833 0.0875 ... 85.03 85.03 85.03 85.03
In [38]:
# Check from ADMs

K = 2
c2ADM = (1/3) + (2/3)*(8*np.pi**2 / (2*K + 1) )* data.data['ADM']['ADM'].sel(K=2).unstack().squeeze(drop=True)
c2ADM.real.hvplot()
Out[38]:
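
For reference, the check above uses the identity $\cos^2\theta = \frac{1}{3}(1 + 2P_2(\cos\theta))$, hence

$\langle\cos^2\theta\rangle(t) = \frac{1}{3} + \frac{2}{3}\langle P_2(\cos\theta)\rangle(t)$

The $8\pi^2/(2K+1)$ factor in the cell above is then the conversion from the $A^2_{0,0}(t)$ moment to $\langle P_2(\cos\theta)\rangle$ under the normalisation convention assumed for these ADMs; if the reference $\cos^2\theta$ values don't match, this conversion factor is the first thing to check.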

AFBLMs for aligned case

New plotting code to update current package versions!

TODO:

  • Need to fix thresholding issues with the AFBLM code: it gives inconsistent results vs. t, and the calculation blows up for too low a threshold.
    • Improve routine? Early sums to reduce array size?
    • Loop over t-points (see the sketch after this list).
    • Add dim checking to the thresDims variable to allow for optional t-dim tests (to keep thresholding consistent).
    • Simplify thresholding and add additional thresholds to the routine? Currently thresholds multiple times with the same values.
  • Could also...
    • Check renorm & masking; currently getting some singularities, probably from renormalisation by 0/NaN cases.
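
One way to tackle the memory and thresholding issues listed above, and a simpler alternative to the 0.5 ps binning used in the next cell, is to loop over individual t-points, calling the class AFBLM routine with a single-point ADM slice and concatenating afterwards. A rough, unbenchmarked sketch only, using the same data.AFBLM() call signature as the cell below; it trades speed (the basis is rebuilt at every t-point) for a minimal memory footprint.

# Per-t-point AFBLM loop (sketch only - slow, but low memory).
keys = list(set(data._keysCheck(None)) - set(['ADM']))
perTlist = {key: [] for key in keys}

for tPoint in data.data['ADM']['ADM'].t.values:
    ADMslice = data.data['ADM']['ADM'].sel(t=[tPoint])   # Keep t as a length-1 dim for concat later
    data.AFBLM(keys=keys, AKQS=ADMslice, selDims={'Type': 'L'}, thres=None)

    for key in keys:
        perTlist[key].append(data.data[key]['AFBLM'].copy())

for key in keys:
    data.data[key]['AFBLM'] = xr.concat(perTlist[key], 't')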
In [39]:
# Compute AFBLMs for aligned case
keys = data._keysCheck(None)
keys = list(set(keys) - set(['ADM']))


#*************** Throw everything in method - breaks for large datasets (RAM issues)
# data.AFBLM(keys = keys, AKQS = ADMs, selDims = {'Type':'L'})  # NOTE THIS USES independently set ADMs!

# data.AFBLM(keys = keys, AKQS = data.data['ADM']['ADM'].real, selDims = {'Type':'L'}, thres=1e-3) # OK for 61 t-points (x2 downsample)
# data.AFBLM(keys = keys, AKQS = data.data['ADM']['ADM'].real, selDims = {'Type':'L'}, thres=1e-3, thresDims = ['Eke','t'])   # NEED TO ADD DIM CHECKING TO thresDims
# data.AFBLM(keys = keys, AKQS = data.data['ADM']['ADM'].real, selDims = {'Type':'L'}, thres=1e-4)   # 1e-4 working OK for 30 t-points (x4 downsample)
# data.AFBLM(keys = keys, AKQS = data.data['ADM']['ADM'].real, selDims = {'Type':'L'}, thres=1e-5)   # 1e-5 working OK for 16 t-points (x8 downsample)
# data.AFBLM(keys = keys, AKQS = data.data['ADM']['ADM'].real, selDims = {'Type':'L'}, thres=1e-6)   # 1e-6 working OK for 16 t-points (x8 downsample)
# data.AFBLM(keys = keys, AKQS = data.data['ADM']['ADM'].real, selDims = {'Type':'L'}, thres=None)   # FAILS for 16 t-points (x8 downsample) at 60Gb RAM on Jake (Swap file size issue????)
                                                                                                   # OK for 8 t-points (x16 downsample) at ~55Gb RAM peak on Jake.
# thres=1e-4)  BREAKS KERNEL #, thresDims=['Eke','t'])  # TODO: currently thresDims must be in matE only for AFBLM routine.

# 21/07/22 version - above only gives L=2 max, missing L=4,6. But bumping threshold to 1e-4 kills calculation...
# data.AFBLM(keys = keys,  selDims = {'Type':'L'})  # Default case... sets/uses ADMs per key
# UPDATE: testing with scale factor for ADMs, and forcing to real - might be better?


#*********** Using grouping to avoid memory issues 
#        QUITE slow with current code, but more robust
#        TODO: make this better with basis set recycle (cf. fitting code)
#
import pickle

def calcAFBLMGroup(data,ADMs,keys,AFBLMdict):
    """Wrap for XR groupby"""
    
    # Class routine - returns to main data structure
    data.AFBLM(keys = keys, AKQS = ADMs, selDims = {'Type':'L'}, thres=None)  #.map(standardize)
    
    # Copy to new structure & restack
#     keys = data._keysCheck(keys)

#     AFBLMdict = {}
    # Restack data to lists for merge
    for key in keys:
        AFBLMdict[key]['AFBLMlist'].append(data.data[key]['AFBLM'].copy())
        
    return AFBLMdict
        

AFBLMdict = {k:{'AFBLMlist':[]} for k in keys}
    
# tBins = np.arange(38,39.1,0.5)   # Test range
tBins = np.arange(trange[0], trange[1] + 0.1, 0.5)
# trange

import time

for item in data.data['ADM']['ADM'].groupby_bins("t", tBins):
    start = time.time()
    print(f'Calculating for group: {item[0]}, {item[1].t.size} t-points.')
    AFBLMdict = calcAFBLMGroup(data, item[1], keys, AFBLMdict)
    end = time.time()
    print(f'Delta: {end - start}s')

# Concat XR lists
for key in keys:
    AFBLMdict[key]['full'] = xr.concat(AFBLMdict[key]['AFBLMlist'],"t")
    AFBLMdict[key]['full'].name = f'{key}_AFBLM'
    
    # Push back to main class
    data.data[key]['AFBLM'] = AFBLMdict[key]['full'].copy()
    
    # Save object with Pickle
    with open(f'OCS_{key}_AFBLM_220722.pickle', 'wb') as handle:
        print(f'Saving {key} with pickle')
        pickle.dump(AFBLMdict[key]['full'], handle, protocol=pickle.HIGHEST_PROTOCOL)
    
# Save full class with Pickle
# NOW GIVING ERRORS ON RERUN (although worked previously), AttributeError: Can't pickle local object 'plotTypeSelector.<locals>.<lambda>'
# NOT SURE WHERE THIS IS COMING FROM AS YET
# UPDATE: IN data.data['ADM']['plots']['ADM']['pData'].attrs['pTypeDetails']
# TODO: GENERAL DEBUG FOR ACCIDENTAL FUNCTION PASSING HERE!!!! (See the attrs-scrubbing sketch after this cell.)
# import pickle
with open(f'OCS_AFBLM_220722.pickle', 'wb') as handle:
    data.data['ADM']['plots']['ADM']['pData'].attrs['pTypeDetails'] = []  # UGH, this breaks pickle unless removed.
    pickle.dump(data, handle, protocol=pickle.HIGHEST_PROTOCOL)
    print(f'Saved data (full class) with pickle')
Calculating for group: (38.0, 38.5], 5 t-points.
Delta: 79.47455739974976s
Calculating for group: (38.5, 39.0], 5 t-points.
Delta: 76.17200112342834s
Calculating for group: (39.0, 39.5], 5 t-points.
Delta: 76.8993592262268s
Calculating for group: (39.5, 40.0], 6 t-points.
Delta: 82.81936311721802s
Calculating for group: (40.0, 40.5], 5 t-points.
Delta: 76.35259532928467s
Calculating for group: (40.5, 41.0], 5 t-points.
Delta: 76.26187920570374s
Calculating for group: (41.0, 41.5], 5 t-points.
Delta: 76.27736186981201s
Calculating for group: (41.5, 42.0], 5 t-points.
Delta: 77.49206280708313s
Calculating for group: (42.0, 42.5], 5 t-points.
Delta: 76.72563934326172s
Calculating for group: (42.5, 43.0], 5 t-points.
Delta: 76.81541442871094s
Calculating for group: (43.0, 43.5], 5 t-points.
Delta: 76.82414698600769s
Calculating for group: (43.5, 44.0], 5 t-points.
Delta: 76.77179551124573s
Saving orb9 with pickle
Saving orb8 with pickle
Saving orb11 with pickle
Saving orb10 with pickle
Saved data (full class) with pickle
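
The pTypeDetails workaround in the cell above (a lambda stored in the Xarray attrs breaks pickle) could be generalised with a small helper that drops any non-picklable attrs entries before saving. A hedged sketch only; scrubAttrs() is a hypothetical helper, not part of the package, and simply walks the class data dictionary testing each attrs entry with pickle.dumps().

def scrubAttrs(dataDict):
    """Drop non-picklable .attrs entries throughout a nested data dict (illustrative helper only)."""
    for item in dataDict.values():
        if hasattr(item, 'attrs'):
            for k in list(item.attrs.keys()):
                try:
                    pickle.dumps(item.attrs[k])
                except (pickle.PicklingError, AttributeError, TypeError):
                    item.attrs[k] = None   # Or pop the key entirely
        elif isinstance(item, dict):
            scrubAttrs(item)   # Recurse into nested dicts, e.g. data.data[key]['plots']

# Usage sketch: scrub before pickling the full class.
# scrubAttrs(data.data)
# with open('OCS_AFBLM_220722.pickle', 'wb') as handle:
#     pickle.dump(data, handle, protocol=pickle.HIGHEST_PROTOCOL)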
In [40]:
# for key in keys:
#     AFBLMdict[key]['full'] = xr.concat(AFBLMdict[key]['AFBLMlist'],"t")
#     AFBLMdict[key]['full'].name = f'{key}_AFBLM'
In [41]:
# with open(f'OCS_AFBLM_220722.pickle', 'wb') as handle:
#     data.data['ADM']['plots']['ADM']['pData'].attrs['pTypeDetails'] = []  # UGH, this breaks pickle unless removed.
#     pickle.dump(data, handle, protocol=pickle.HIGHEST_PROTOCOL)
#     print(f'Saved data (full class) with pickle')
In [42]:
# AFBLMdict[key]['full'] = xr.concat(AFBLMdict['orb10'], "t")
# AFBLMdict['orb10']['full'].unstack('BLM').sel({'m':0}).squeeze().real.hvplot.line(x='t')
In [43]:
# Quick stack to dict & XR dataset
dataType = 'AFBLM'
thres = None
selDims = None
Etype = 'Eke'

pDict = []

for key in keys:
#     subset = ep.matEleSelector(data.data[key][dataType], thres=thres, inds = selDims, dims = [Etype, 't'], sq = True)
    subset = ep.matEleSelector(data.data[key][dataType], thres=thres, inds = selDims, dims = 't', sq = True)
    subset.name = 'BLM'

    subset = ep.plotTypeSelector(subset, pType = 'r')

#     pDict.append(subset.expand_dims({'Orb':[key]}))
#     pDict.append(subset.expand_dims({'Orb':[data.data[key][dataType].attrs['jobLabel']]}))
    pDict.append(subset.expand_dims({'Orb':[data.data[key]['jobNotes']['orbLabel']]}))

    
#     hvDS = hv.Dataset(subset.unstack('BLM'))
    
#     display(hvDS.to(hv.Curve, kdims=Etype).overlay(['l','m']))

#     pDict[key] = hvDS.to(hv.Curve, kdims=Etype).overlay(['l','m'])
    
# hv.HoloMap(pDict, kdims = ['Orb'])   # Individual plots OK, but won't stack?

xrAF = xr.concat(pDict, 'Orb')
In [44]:
# xrAF.unstack().dropna(dim = 't', how = 'all').drop('m')  #.fillna(0)

New plots 28/01/22 - heatmaps

NOTE: use the histogram at the side to change the colour mapping scale.

For old line plots see https://phockett.github.io/ePSdata/OCS-preliminary/OCS_orbs8-11_AFBLMs_VM-ADMs_140122-JAKE_tidy.html


In [45]:
# Heatmaps for cross-sections

# Set opts to match sizes - should be able to link plots or set gridspace to handle this?
hvPlotters.setPlotDefaults(fSize = [750,300], imgSize = 600)
cmap = 'coolwarm'  # See http://holoviews.org/user_guide/Colormaps.html

# Get cross-sections
XS = np.abs(xrAF.XSrescaled)  # Should be in MB
XS.name = 'XS/MB'
hvXS = hvPlotters.hv.Dataset(XS.fillna(0))

# Plot heatmap + ADMs
(hvXS.to(hvPlotters.hv.HeatMap, kdims=['t',Etype]).opts(cmap=cmap).hist()  + data.data['ADM']['ADM'].unstack().squeeze().real.hvplot.line(x='t').overlay('K')).cols(1)
WARNING:param.HeatMapPlot11861: Due to internal constraints, when aspect and width/height is set, the bokeh backend uses those values as frame_width/frame_height instead. This ensures the aspect is respected, but means that the plot might be slightly larger than anticipated. Set the frame_width/frame_height explicitly to suppress this warning.
Out[45]:
In [46]:
# Heatmap for l=2,4,6

# BLM = np.abs(xrAF.XSrescaled)  # Should be in MB
# XS.name = 'XS/MB'
hvDS = hvPlotters.hv.Dataset(xrAF.unstack('BLM').fillna(0).squeeze(drop=True))

# Plot heatmap + ADMs
(hvDS.select(l=[2,4], m=0).to(hvPlotters.hv.HeatMap, kdims=['t',Etype]).opts(cmap=cmap).hist() + data.data['ADM']['ADM'].unstack().squeeze().real.hvplot.line(x='t').overlay('K')).cols(1)  #.overlay(['l','t']).layout('Orb').cols(1)
WARNING:param.HeatMapPlot20462: Due to internal constraints, when aspect and width/height is set, the bokeh backend uses those values as frame_width/frame_height instead. This ensures the aspect is respected, but means that the plot might be slightly larger than anticipated. Set the frame_width/frame_height explicitly to suppress this warning.
Out[46]:
In [47]:
# SUPER SLOW WITH LARGER DATASETS - must be a way to improve on this? Seems a bit better with subsets and if NaNs dropped properly, also if high-dim vars are overlaid rather than Holomapped

# hvPlotters.opts.defaults(hvPlotters.opts.Curve(height=700))
# lSel = [2,4,6]
# hvDS = hvPlotters.hv.Dataset(xrAF.unstack('BLM'))
# hvDS.select(l=lSel).to(hvPlotters.hv.Curve, kdims=Etype).overlay(['l','m'])   
# hvDS.to(hvPlotters.hv.Curve, kdims=Etype).overlay(['l','m'])


# # Try with subsets to speed up plotters
# hvDS = hvPlotters.hv.Dataset(xrAF.unstack().dropna(dim = 't', how = 'all').drop('m').fillna(0))
# # hvDS.to(hvPlotters.hv.Curve, kdims=Etype).overlay(['l','m'])

# # Full set of line plots
# hvDS.to(hvPlotters.hv.Curve, kdims=Etype).overlay(['l'])
In [ ]:
 

Energy selected slices to try to match experimental data

Plots need some work still!

In [48]:
# Try to match expt....

Estates = {'X': (11.9,12.6),
           'A': (7.5,8.7),
           'B': (7.5,8.7),
           'C': (5.5,6.2)}
In [61]:
# For different energy slices need to subselect & restack? May be a better way... Probably with hv.transforms?
hvObjsXS = {}
hvObjs = {}
for k,v in Estates.items():
    print(k,v)
#     hvObjs[k] = hvDS.select(l=[2,4,6], Orb=k, Eke=v).to(hvPlotters.hv.Curve, kdims='t')  #.overlay(['l'])   # Not sure how to drop Edim here...
#     hvObjs[k] = xrAF.unstack().sel(l=[2,4,6], Orb=k, Eke=slice(v[0],v[1])).sum('Eke').unstack().drop('m').real.fillna(0).hvplot.line(x='t')        #.dropna(dim = 't', how = 'all').drop('m').hvplot.line(x='t')
#     hvObjs[k] = xrAF.unstack().sel(l=[2,4], Orb=k, Eke=slice(v[0],v[1])).sum('Eke').unstack().drop('m').real.fillna(0).hvplot.line(x='t')        #.dropna(dim = 't', how = 'all').drop('m').hvplot.line(x='t')
    hvObjs[k] = xrAF.unstack().sel(l=[2,4,6,8], m=0, Orb=k, Eke=slice(v[0],v[1])).sum('Eke').drop('m').real.hvplot.line(x='t').overlay('l')    # FOR no threshold case also need m=0 for contiguous plots!
#     hvObjs[k] = xrAF.unstack().sel(l=[2,4], Orb=k, Eke=slice(v[0],v[1])).sum('Eke').unstack().drop('m').pipe(np.abs).fillna(0).hvplot.line(x='t')        #.dropna(dim = 't', how = 'all').drop('m').hvplot.line(x='t')
    hvObjsXS[k] = XS.sel(Orb=k, Eke=slice(v[0],v[1])).sum('Eke').fillna(0).hvplot.line(x='t')

# Stack - should be able to use HoloMap, but not working so far
# for k,v in Estates.items():
    
X (11.9, 12.6)
A (7.5, 8.7)
B (7.5, 8.7)
C (5.5, 6.2)
In [62]:
# Cross-sections for E slices in Orb order
# (hvObjsXS['X'] + hvObjsXS['A'] + hvObjsXS['B'] + hvObjsXS['C']).opts(shared_axes=False).cols(1)   #.opts(opts.Cuver(axiswise=True))

(hvObjsXS['X'] + hvObjsXS['A'] + hvObjsXS['B'] + hvObjsXS['C']).cols(1)

# layout = []
# for k in Estates.keys():
#     layout.append(hvObjsXS[k].overlay('l'))
Out[62]:
In [63]:
# BLM for E slices in Orb order
(hvObjs['X'] + hvObjs['A'] + hvObjs['B'] + hvObjs['C']).cols(1)  #.opts(shared_axes=False).cols(1)
Out[63]:
In [52]:
# hvPlotters.hv.HoloMap(hvObjs, kdims=['l'], group='Orb')  #.relabel(group='Phases', label='Sines', depth=1)
In [53]:
# hvDS.select(l=[2,4,6], Orb=k, Eke=v)

Versions

Code

In [54]:
# print(data.jobInfo['ePolyScat'][0])
# print('Run: ' + jobInfo['Starting'][0].split('at')[1])
In [55]:
import scooby
scooby.Report(additional=['epsproc', 'xarray', 'jupyter', 'holoviews', 'pandas'])
Out[55]:
Fri Jul 22 14:38:59 2022 EDT
OS: Linux | CPU(s): 64 | Machine: x86_64 | Architecture: 64bit | Environment: Jupyter
Python: 3.7.10 (default, Feb 26 2021, 18:47:35) [GCC 7.3.0]
epsproc: 1.3.2-dev | xarray: 0.19.0 | jupyter: Version unknown | holoviews: 1.14.2 | pandas: 1.2.4
numpy: 1.20.1 | scipy: 1.6.1 | IPython: 7.21.0 | matplotlib: 3.3.4 | scooby: 0.5.6
In [56]:
# Check current Git commit for local ePSproc version
!git -C {Path(ep.__file__).parent} branch
!git -C {Path(ep.__file__).parent} log --format="%H" -n 1
* dev
  master
dc91f32d5461d784b222f939aceeb8e0c694a533
In [57]:
# Check current remote commits
!git ls-remote --heads https://github.com/phockett/ePSproc
dc91f32d5461d784b222f939aceeb8e0c694a533	refs/heads/dev
54b929025381f1c2cbf371180fa870f247e7f627	refs/heads/master
69cd89ce5bc0ad6d465a4bd8df6fba15d3fd1aee	refs/heads/numba-tests
ea30878c842f09d525fbf39fa269fa2302a13b57	refs/heads/revert-9-master

Notes

20/07/22

Updating from OCS_orbs8-11_AFBLMs_VM-ADMs_140122-JAKE_tidy-replot-280122.ipynb with LF results and updated ADMs.

Note LF all-orb plotting from http://jake/jupyter/user/paul/doc/tree/ePS/OCS/epsman2021/OCS_orb11_proc_131221-JAKE.ipynb - this was not previously uploaded!

28/01/22

Adding additional plot types & energy slices to match expt.

14/01/22

Quick attempt at AF-BLMs for OCS, based on guessed/reasonable ADMs (action starts below, after the setup stage). Status: basically working; there may be some remaining bugs with ADMs, cos^2 and renormalisation, but trends with alignment should be correct.

For methods: https://epsproc.readthedocs.io/en/dev/demos/ePSproc_class_demo_161020.html

14/01/22 PH

Rerunning with some additional test ADMs from Varun Makhija.

07/01/22 PH

  • Fixed/tested cos^2 values. (Maybe - now matching Reid 2018, but may not be correct in all cases?)
  • Updated plotting with HV for BLMs (in dev stage).
  • Checked XS values too - may have some -ve cases/phase issues to sort? (Cf. general AF code testing notes elsewhere.)

Versions

13/12/21 PH

STATUS: basics in place, need to trim code and fix a few things still.

Revisiting AF cases with (hopefully) better match to experimental alignment.

  • Add in cos^2(theta) metric.
  • Ballpark ADMs assuming Gaussian case.

TODO:

FIXED:

  • Note 'it' and 'labels' handling changed in recent versions of ePSproc, and degenerate dim handling is now correct!

09/06/21 PH

Quick look at initial ePS results for orb 11 (HOMO), 2.5eV step size.

TODO:

  • Currently AF code fails for Xarray = 0.17, issue with phaseCons setting to coord, sigh.
  • AF normalisation not correct here, due to doubly-degenerate state? Summing over 'it' not working correctly either, needs a debug (or missing some settings...?).
  • lmPlot() slice and cmapping to fix, in cases of missing/dropped dims, and for clims if possible.

For methods: https://epsproc.readthedocs.io/en/dev/demos/ePSproc_class_demo_161020.html

In [ ]: