mirror of https://github.com/deadc0de6/dotdrop.git synced 2026-03-23 00:15:08 +00:00

Merge branch 'parsing-refactoring'

This commit is contained in:
deadc0de6
2019-06-15 17:50:09 +02:00
60 changed files with 4087 additions and 1639 deletions


@@ -1,6 +1,5 @@
language: python
python:
- "3.4"
- "3.5"
- "3.6"
- "3.7"


@@ -15,8 +15,10 @@ Dotdrop's code base is located in the [dotdrop directory](/dotdrop).
Here's an overview of the different files and their role:
* **action.py**: represent the actions and transformations
* **cfg_yaml.py**: the lower level config parser
* **cfg_aggregator.py**: the higher level config parser
* **comparator.py**: the class handling the comparison for `compare`
* **config.py**: the config file (*config.yaml*) parser
* **dictparser.py**: abstract class for parsing dictionaries
* **dotdrop.py**: the entry point and where the different cli commands are executed
* **dotfile.py**: represent a dotfile
* **installer.py**: the class handling the installation of dotfiles for `install`
@@ -24,10 +26,54 @@ Here's an overview of the different files and their role:
* **linktypes.py**: enum for the three types of linking (none, symlink, children)
* **logger.py**: the custom logger
* **options.py**: the class embedding all the different options across dotdrop
* **profile.py**: represent a profile
* **settings.py**: represent the config settings
* **templategen.py**: the jinja2 templating class
* **updater.py**: the class handling the update of dotfiles for `update`
* **utils.py**: some useful methods
## Config parsing
The configuration file (yaml) is parsed in two layers:
* the lower layer in `cfg_yaml.py`
* the higher layer in `cfg_aggregator.py`
Only the higher layer is accessible to other classes of dotdrop.
The lower layer only takes care of basic types and
does the following:
* normalize all config entries
* resolve paths (dotfiles src, dotpath, etc)
* refactor actions to a common format
* import any data from external files (configs, variables, etc)
* apply variable substitutions
* complete any data if needed (add the "profile" variable, etc)
* execute interpreted variables through the shell
* write new entries (dotfile, profile) into the dictionary and save it to a file
* fix any deprecated entries (link_by_default, etc)
* clear empty entries
In the end it makes sure the dictionary (or parts of it) accessed
by the higher layer is clean and normalized.
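For instance, the "refactor actions to a common format" step maps every action entry to a `(kind, command)` tuple. A minimal sketch of the idea (simplified from `_norm_actions` in `cfg_yaml.py`; the action keys used here are only illustrative):

```python
def normalize_actions(actions):
    """refactor action entries to the common format
    {key: (pre|post, command)}, defaulting to post
    (simplified from _norm_actions in cfg_yaml.py)"""
    new = {}
    for key, value in actions.items():
        if key in ('pre', 'post'):
            # nested form: {pre: {key: command}}
            for akey, command in value.items():
                new[akey] = (key, command)
        else:
            # flat entries default to post actions
            new[key] = ('post', value)
    return new

print(normalize_actions({
    'pre': {'mkconfig': 'mkdir -p ~/.config'},
    'notify': 'echo installed',
}))
# → {'mkconfig': ('pre', 'mkdir -p ~/.config'), 'notify': ('post', 'echo installed')}
```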
The higher layer will transform the dictionary parsed by the lower layer
into objects (profiles, dotfiles, actions, etc).
The higher layer has no notion of inclusion (included profiles, for example) or
file importing (import actions, etc) or even interpreted variables
(it only sees variables that have already been interpreted).
It does the following:
* transform dictionaries into objects
* patch lists of keys with their corresponding objects (for example a dotfile's actions)
* provide getters for every other class of dotdrop needing to access elements
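The patching step above replaces, on each container object, a list of keys with the corresponding objects. A minimal sketch of the idea (the `Dotfile`/`Profile` stand-ins here are simplified, not dotdrop's real constructors; see `_patch_keys_to_objs` in `cfg_aggregator.py` for the real implementation):

```python
class Dotfile:
    """minimal stand-in for dotdrop's Dotfile object"""
    def __init__(self, key, dst, src):
        self.key, self.dst, self.src = key, dst, src

class Profile:
    """minimal stand-in for dotdrop's Profile object"""
    def __init__(self, key, dotfiles):
        self.key = key
        self.dotfiles = dotfiles  # initially a list of dotfile keys

def patch_keys_to_objs(containers, attr, get_by_key):
    """replace the key list stored under 'attr' on each
    container with the objects returned by 'get_by_key'"""
    for c in containers:
        keys = getattr(c, attr) or []
        setattr(c, attr, [get_by_key(k) for k in keys])

dotfiles = [Dotfile('f_vimrc', '~/.vimrc', 'vimrc')]
profiles = [Profile('home', ['f_vimrc'])]
by_key = {d.key: d for d in dotfiles}
patch_keys_to_objs(profiles, 'dotfiles', by_key.get)
print(profiles[0].dotfiles[0].dst)
# → ~/.vimrc
```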
Note that any change to the yaml dictionary (adding a new profile or a new dotfile for
example) won't be *seen* by the higher layer until the config is reloaded. Consider the
`dirty` flag as a sign that the file needs to be written and that its representation in
higher levels is not accurate anymore.
# Testing
Dotdrop is tested with the use of the [tests.sh](/tests.sh) script.


@@ -38,6 +38,7 @@ _dotdrop ()
'import'
'compare'
'update'
'remove'
'listfiles'
'detail'
'list'
@@ -59,6 +60,9 @@ _dotdrop ()
update)
_dotdrop-update
;;
remove)
_dotdrop-remove
;;
listfiles)
_dotdrop-listfiles
;;
@@ -96,6 +100,8 @@ _dotdrop-install ()
'(--dry)--dry' \
'(-D)-D' \
'(--showdiff)--showdiff' \
'(-a)-a' \
'(--force-actions)--force-actions' \
'(-c=-)-c=-' \
'(--cfg=-)--cfg=-' \
'(-p=-)-p=-' \
@@ -191,6 +197,35 @@ _dotdrop-update ()
fi
}
_dotdrop-remove ()
{
local context state state_descr line
typeset -A opt_args
if [[ $words[$CURRENT] == -* ]] ; then
_arguments -C \
':command:->command' \
'(-V)-V' \
'(--verbose)--verbose' \
'(-b)-b' \
'(--no-banner)--no-banner' \
'(-f)-f' \
'(--force)--force' \
'(-d)-d' \
'(--dry)--dry' \
'(-k)-k' \
'(--key)--key' \
'(-c=-)-c=-' \
'(--cfg=-)--cfg=-' \
'(-p=-)-p=-' \
'(--profile=-)--profile=-' \
else
myargs=('<path>')
_message_next_arg
fi
}
_dotdrop-listfiles ()
{
local context state state_descr line


@@ -38,6 +38,7 @@ _dotdrop.sh ()
'import'
'compare'
'update'
'remove'
'listfiles'
'detail'
'list'
@@ -59,6 +60,9 @@ _dotdrop.sh ()
update)
_dotdrop.sh-update
;;
remove)
_dotdrop.sh-remove
;;
listfiles)
_dotdrop.sh-listfiles
;;
@@ -96,6 +100,8 @@ _dotdrop.sh-install ()
'(--dry)--dry' \
'(-D)-D' \
'(--showdiff)--showdiff' \
'(-a)-a' \
'(--force-actions)--force-actions' \
'(-c=-)-c=-' \
'(--cfg=-)--cfg=-' \
'(-p=-)-p=-' \
@@ -191,6 +197,35 @@ _dotdrop.sh-update ()
fi
}
_dotdrop.sh-remove ()
{
local context state state_descr line
typeset -A opt_args
if [[ $words[$CURRENT] == -* ]] ; then
_arguments -C \
':command:->command' \
'(-V)-V' \
'(--verbose)--verbose' \
'(-b)-b' \
'(--no-banner)--no-banner' \
'(-f)-f' \
'(--force)--force' \
'(-d)-d' \
'(--dry)--dry' \
'(-k)-k' \
'(--key)--key' \
'(-c=-)-c=-' \
'(--cfg=-)--cfg=-' \
'(-p=-)-p=-' \
'(--profile=-)--profile=-' \
else
myargs=('<path>')
_message_next_arg
fi
}
_dotdrop.sh-listfiles ()
{
local context state state_descr line

completion/dotdrop-completion.bash Executable file → Normal file

@@ -5,7 +5,7 @@ _dotdrop()
cur="${COMP_WORDS[COMP_CWORD]}"
if [ $COMP_CWORD -eq 1 ]; then
COMPREPLY=( $( compgen -W '-h --help -v --version install import compare update listfiles detail list' -- $cur) )
COMPREPLY=( $( compgen -W '-h --help -v --version install import compare update remove listfiles detail list' -- $cur) )
else
case ${COMP_WORDS[1]} in
install)
@@ -19,6 +19,9 @@ _dotdrop()
;;
update)
_dotdrop_update
;;
remove)
_dotdrop_remove
;;
listfiles)
_dotdrop_listfiles
@@ -40,7 +43,7 @@ _dotdrop_install()
cur="${COMP_WORDS[COMP_CWORD]}"
if [ $COMP_CWORD -ge 2 ]; then
COMPREPLY=( $( compgen -fW '-V --verbose -b --no-banner -t --temp -f --force -n --nodiff -d --dry -D --showdiff -c= --cfg= -p= --profile= ' -- $cur) )
COMPREPLY=( $( compgen -fW '-V --verbose -b --no-banner -t --temp -f --force -n --nodiff -d --dry -D --showdiff -a --force-actions -c= --cfg= -p= --profile= ' -- $cur) )
fi
}
@@ -74,6 +77,16 @@ _dotdrop_update()
fi
}
_dotdrop_remove()
{
local cur
cur="${COMP_WORDS[COMP_CWORD]}"
if [ $COMP_CWORD -ge 2 ]; then
COMPREPLY=( $( compgen -fW '-V --verbose -b --no-banner -f --force -d --dry -k --key -c= --cfg= -p= --profile= ' -- $cur) )
fi
}
_dotdrop_listfiles()
{
local cur


@@ -5,7 +5,7 @@ _dotdropsh()
cur="${COMP_WORDS[COMP_CWORD]}"
if [ $COMP_CWORD -eq 1 ]; then
COMPREPLY=( $( compgen -W '-h --help -v --version install import compare update listfiles detail list' -- $cur) )
COMPREPLY=( $( compgen -W '-h --help -v --version install import compare update remove listfiles detail list' -- $cur) )
else
case ${COMP_WORDS[1]} in
install)
@@ -19,6 +19,9 @@ _dotdropsh()
;;
update)
_dotdropsh_update
;;
remove)
_dotdropsh_remove
;;
listfiles)
_dotdropsh_listfiles
@@ -40,7 +43,7 @@ _dotdropsh_install()
cur="${COMP_WORDS[COMP_CWORD]}"
if [ $COMP_CWORD -ge 2 ]; then
COMPREPLY=( $( compgen -fW '-V --verbose -b --no-banner -t --temp -f --force -n --nodiff -d --dry -D --showdiff -c= --cfg= -p= --profile= ' -- $cur) )
COMPREPLY=( $( compgen -fW '-V --verbose -b --no-banner -t --temp -f --force -n --nodiff -d --dry -D --showdiff -a --force-actions -c= --cfg= -p= --profile= ' -- $cur) )
fi
}
@@ -74,6 +77,16 @@ _dotdropsh_update()
fi
}
_dotdropsh_remove()
{
local cur
cur="${COMP_WORDS[COMP_CWORD]}"
if [ $COMP_CWORD -ge 2 ]; then
COMPREPLY=( $( compgen -fW '-V --verbose -b --no-banner -f --force -d --dry -k --key -c= --cfg= -p= --profile= ' -- $cur) )
fi
}
_dotdropsh_listfiles()
{
local cur


@@ -1,11 +1,11 @@
config:
backup: true
banner: true
create: true
dotpath: dotfiles
banner: true
longkey: false
keepdot: false
link_import_default: nolink
link_dotfile_default: nolink
link_on_import: nolink
longkey: false
dotfiles:
profiles:


@@ -10,10 +10,10 @@ import subprocess
import os
# local imports
from dotdrop.logger import Logger
from dotdrop.dictparser import DictParser
class Cmd:
class Cmd(DictParser):
eq_ignore = ('log',)
def __init__(self, key, action):
@@ -23,7 +23,10 @@ class Cmd:
"""
self.key = key
self.action = action
self.log = Logger()
@classmethod
def _adjust_yaml_keys(cls, value):
return {'action': value}
def __str__(self):
return 'key:{} -> \"{}\"'.format(self.key, self.action)
@@ -50,20 +53,35 @@ class Cmd:
class Action(Cmd):
def __init__(self, key, kind, action, *args):
pre = 'pre'
post = 'post'
def __init__(self, key, kind, action):
"""constructor
@key: action key
@kind: type of action (pre or post)
@action: action string
@args: action arguments
"""
super(Action, self).__init__(key, action)
self.kind = kind
self.args = args
self.args = []
@classmethod
def parse(cls, key, value):
"""parse key value into object"""
v = {}
v['kind'], v['action'] = value
return cls(key=key, **v)
def copy(self, args):
"""return a copy of this object with arguments"""
action = Action(self.key, self.kind, self.action)
action.args = args
return action
def __str__(self):
out = '{}: \"{}\" with args: {}'
return out.format(self.key, self.action, self.args)
out = '{}: \"{}\" ({})'
return out.format(self.key, self.action, self.kind)
def __repr__(self):
return 'action({})'.format(self.__str__())
@@ -74,6 +92,7 @@ class Action(Cmd):
action = self.action
if templater:
action = templater.generate_string(self.action)
cmd = action
try:
cmd = action.format(*self.args)
except IndexError:
@@ -94,9 +113,11 @@ class Action(Cmd):
class Transform(Cmd):
def transform(self, arg0, arg1):
"""execute transformation with {0} and {1}
where {0} is the file to transform and
{1} is the result file"""
"""
execute transformation with {0} and {1}
where {0} is the file to transform
and {1} is the result file
"""
ret = 1
cmd = self.action.format(arg0, arg1)
if os.path.exists(arg1):

dotdrop/cfg_aggregator.py Normal file

@@ -0,0 +1,352 @@
"""
author: deadc0de6 (https://github.com/deadc0de6)
Copyright (c) 2019, deadc0de6
handle higher level of the config file
"""
import os
import shlex
# local imports
from dotdrop.cfg_yaml import CfgYaml
from dotdrop.dotfile import Dotfile
from dotdrop.settings import Settings
from dotdrop.profile import Profile
from dotdrop.action import Action, Transform
from dotdrop.logger import Logger
from dotdrop.utils import strip_home
TILD = '~'
class CfgAggregator:
file_prefix = 'f'
dir_prefix = 'd'
key_sep = '_'
def __init__(self, path, profile=None, debug=False):
"""
high level config parser
@path: path to the config file
@profile: selected profile
@debug: debug flag
"""
self.path = path
self.profile = profile
self.debug = debug
self.log = Logger()
self._load()
def _load(self):
"""load lower level config"""
self.cfgyaml = CfgYaml(self.path,
self.profile,
debug=self.debug)
# settings
self.settings = Settings.parse(None, self.cfgyaml.settings)
if self.debug:
self.log.dbg('settings: {}'.format(self.settings))
# dotfiles
self.dotfiles = Dotfile.parse_dict(self.cfgyaml.dotfiles)
if self.debug:
self.log.dbg('dotfiles: {}'.format(self.dotfiles))
# profiles
self.profiles = Profile.parse_dict(self.cfgyaml.profiles)
if self.debug:
self.log.dbg('profiles: {}'.format(self.profiles))
# actions
self.actions = Action.parse_dict(self.cfgyaml.actions)
if self.debug:
self.log.dbg('actions: {}'.format(self.actions))
# trans_r
self.trans_r = Transform.parse_dict(self.cfgyaml.trans_r)
if self.debug:
self.log.dbg('trans_r: {}'.format(self.trans_r))
# trans_w
self.trans_w = Transform.parse_dict(self.cfgyaml.trans_w)
if self.debug:
self.log.dbg('trans_w: {}'.format(self.trans_w))
# variables
self.variables = self.cfgyaml.variables
if self.debug:
self.log.dbg('variables: {}'.format(self.variables))
# patch dotfiles in profiles
self._patch_keys_to_objs(self.profiles,
"dotfiles", self.get_dotfile)
# patch action in dotfiles actions
self._patch_keys_to_objs(self.dotfiles,
"actions", self._get_action_w_args)
# patch action in profiles actions
self._patch_keys_to_objs(self.profiles,
"actions", self._get_action_w_args)
# patch actions in settings default_actions
self._patch_keys_to_objs([self.settings],
"default_actions", self._get_action_w_args)
if self.debug:
msg = 'default actions: {}'.format(self.settings.default_actions)
self.log.dbg(msg)
# patch trans_w/trans_r in dotfiles
self._patch_keys_to_objs(self.dotfiles,
"trans_r", self._get_trans_r, islist=False)
self._patch_keys_to_objs(self.dotfiles,
"trans_w", self._get_trans_w, islist=False)
def _patch_keys_to_objs(self, containers, keys, get_by_key, islist=True):
"""
map for each key in the attribute 'keys' in 'containers'
the returned object from the method 'get_by_key'
"""
if not containers:
return
if self.debug:
self.log.dbg('patching {} ...'.format(keys))
for c in containers:
objects = []
okeys = getattr(c, keys)
if not okeys:
continue
if not islist:
okeys = [okeys]
for k in okeys:
o = get_by_key(k)
if not o:
err = 'bad {} key for \"{}\": {}'.format(keys, c, k)
self.log.err(err)
raise Exception(err)
objects.append(o)
if not islist:
objects = objects[0]
if self.debug:
self.log.dbg('patching {}.{} with {}'.format(c, keys, objects))
setattr(c, keys, objects)
def del_dotfile(self, dotfile):
"""remove this dotfile from the config"""
return self.cfgyaml.del_dotfile(dotfile.key)
def del_dotfile_from_profile(self, dotfile, profile):
"""remove this dotfile from this profile"""
return self.cfgyaml.del_dotfile_from_profile(dotfile.key, profile.key)
def new(self, src, dst, link, profile_key):
"""
import a new dotfile
@src: path in dotpath
@dst: path in FS
@link: LinkType
@profile_key: to which profile
"""
dst = self.path_to_dotfile_dst(dst)
dotfile = self.get_dotfile_by_dst(dst)
if not dotfile:
# get a new dotfile with a unique key
key = self._get_new_dotfile_key(dst)
if self.debug:
self.log.dbg('new dotfile key: {}'.format(key))
# add the dotfile
self.cfgyaml.add_dotfile(key, src, dst, link)
dotfile = Dotfile(key, dst, src)
key = dotfile.key
ret = self.cfgyaml.add_dotfile_to_profile(key, profile_key)
if self.debug:
self.log.dbg('new dotfile {} to profile {}'.format(key,
profile_key))
# reload
self.cfgyaml.save()
if self.debug:
self.log.dbg('RELOADING')
self._load()
return ret
def _get_new_dotfile_key(self, dst):
"""return a new unique dotfile key"""
path = os.path.expanduser(dst)
existing_keys = [x.key for x in self.dotfiles]
if self.settings.longkey:
return self._get_long_key(path, existing_keys)
return self._get_short_key(path, existing_keys)
def _norm_key_elem(self, elem):
"""normalize path element for sanity"""
elem = elem.lstrip('.')
elem = elem.replace(' ', '-')
return elem.lower()
def _split_path_for_key(self, path):
"""return a list of path elements, excluding the home path"""
p = strip_home(path)
dirs = []
while True:
p, f = os.path.split(p)
dirs.append(f)
if not p or not f:
break
dirs.reverse()
# remove empty entries
dirs = filter(None, dirs)
# normalize entries
return list(map(self._norm_key_elem, dirs))
def _get_long_key(self, path, keys):
"""
return a unique long key representing the
absolute path of path
"""
dirs = self._split_path_for_key(path)
prefix = self.dir_prefix if os.path.isdir(path) else self.file_prefix
key = self.key_sep.join([prefix] + dirs)
return self._uniq_key(key, keys)
def _get_short_key(self, path, keys):
"""
return a unique key where path
is known not to be an already existing dotfile
"""
dirs = self._split_path_for_key(path)
dirs.reverse()
prefix = self.dir_prefix if os.path.isdir(path) else self.file_prefix
entries = []
for d in dirs:
entries.insert(0, d)
key = self.key_sep.join([prefix] + entries)
if key not in keys:
return key
return self._uniq_key(key, keys)
def _uniq_key(self, key, keys):
"""unique dotfile key"""
newkey = key
cnt = 1
while newkey in keys:
# append a counter until the key is unique
newkey = self.key_sep.join([key, str(cnt)])
cnt += 1
return newkey
def path_to_dotfile_dst(self, path):
"""normalize the path to match dotfile dst"""
path = os.path.expanduser(path)
path = os.path.expandvars(path)
path = os.path.abspath(path)
home = os.path.expanduser(TILD) + os.sep
# normalize the path
if path.startswith(home):
path = path[len(home):]
path = os.path.join(TILD, path)
return path
def get_dotfile_by_dst(self, dst):
"""get a dotfile by dst"""
try:
return next(d for d in self.dotfiles if d.dst == dst)
except StopIteration:
return None
def save(self):
"""save the config"""
return self.cfgyaml.save()
def dump(self):
"""dump the config dictionary"""
return self.cfgyaml.dump()
def get_settings(self):
"""return settings as a dict"""
return self.settings.serialize()[Settings.key_yaml]
def get_variables(self):
"""return variables"""
return self.variables
def get_profiles(self):
"""return profiles"""
return self.profiles
def get_profile(self, key):
"""return profile by key"""
try:
return next(x for x in self.profiles if x.key == key)
except StopIteration:
return None
def get_profiles_by_dotfile_key(self, key):
"""return all profiles having this dotfile"""
res = []
for p in self.profiles:
keys = [d.key for d in p.dotfiles]
if key in keys:
res.append(p)
return res
def get_dotfiles(self, profile=None):
"""return dotfiles dict for this profile key"""
if not profile:
return self.dotfiles
try:
pro = self.get_profile(profile)
if not pro:
return []
return pro.dotfiles
except StopIteration:
return []
def get_dotfile(self, key):
"""return dotfile by key"""
try:
return next(x for x in self.dotfiles if x.key == key)
except StopIteration:
return None
def _get_action(self, key):
"""return action by key"""
try:
return next(x for x in self.actions if x.key == key)
except StopIteration:
return None
def _get_action_w_args(self, key):
"""return action by key with the arguments"""
fields = shlex.split(key)
if len(fields) > 1:
# we have args
key, *args = fields
if self.debug:
self.log.dbg('action with params: {} and {}'.format(key, args))
action = self._get_action(key).copy(args)
else:
action = self._get_action(key)
return action
def _get_trans_r(self, key):
"""return the trans_r with this key"""
try:
return next(x for x in self.trans_r if x.key == key)
except StopIteration:
return None
def _get_trans_w(self, key):
"""return the trans_w with this key"""
try:
return next(x for x in self.trans_w if x.key == key)
except StopIteration:
return None

dotdrop/cfg_yaml.py Normal file

@@ -0,0 +1,822 @@
"""
author: deadc0de6 (https://github.com/deadc0de6)
Copyright (c) 2019, deadc0de6
handle lower level of the config file
"""
import os
from ruamel.yaml import YAML as yaml
import glob
from copy import deepcopy
# local imports
from dotdrop.settings import Settings
from dotdrop.logger import Logger
from dotdrop.templategen import Templategen
from dotdrop.linktypes import LinkTypes
from dotdrop.utils import shell, uniq_list
from dotdrop.exceptions import YamlException
class CfgYaml:
# global entries
key_settings = 'config'
key_dotfiles = 'dotfiles'
key_profiles = 'profiles'
key_actions = 'actions'
old_key_trans_r = 'trans'
key_trans_r = 'trans_read'
key_trans_w = 'trans_write'
key_variables = 'variables'
key_dvariables = 'dynvariables'
action_pre = 'pre'
action_post = 'post'
# profiles/dotfiles entries
key_dotfile_src = 'src'
key_dotfile_dst = 'dst'
key_dotfile_link = 'link'
key_dotfile_actions = 'actions'
key_dotfile_link_children = 'link_children'
# profile
key_profile_dotfiles = 'dotfiles'
key_profile_include = 'include'
key_profile_variables = 'variables'
key_profile_dvariables = 'dynvariables'
key_profile_actions = 'actions'
key_all = 'ALL'
# import entries
key_import_actions = 'import_actions'
key_import_configs = 'import_configs'
key_import_variables = 'import_variables'
key_import_profile_dfs = 'import'
# settings
key_settings_dotpath = 'dotpath'
key_settings_workdir = 'workdir'
key_settings_link_dotfile_default = 'link_dotfile_default'
key_imp_link = 'link_on_import'
# link values
lnk_nolink = LinkTypes.NOLINK.name.lower()
lnk_link = LinkTypes.LINK.name.lower()
lnk_children = LinkTypes.LINK_CHILDREN.name.lower()
def __init__(self, path, profile=None, debug=False):
"""
config parser
@path: config file path
@profile: the selected profile
@debug: debug flag
"""
self.path = os.path.abspath(path)
self.profile = profile
self.debug = debug
self.log = Logger()
self.dirty = False
self.yaml_dict = self._load_yaml(self.path)
self._fix_deprecated(self.yaml_dict)
self._parse_main_yaml(self.yaml_dict)
if self.debug:
self.log.dbg('before normalization: {}'.format(self.yaml_dict))
# resolve variables
allvars = self._merge_and_apply_variables()
self.variables.update(allvars)
# process imported configs
self._import_configs()
# process other imports
self._resolve_imports()
# process diverse options
self._resolve_rest()
# patch dotfiles paths
self._resolve_dotfile_paths()
if self.debug:
self.log.dbg('after normalization: {}'.format(self.yaml_dict))
def _parse_main_yaml(self, dic):
"""parse the different blocks"""
self.ori_settings = self._get_entry(dic, self.key_settings)
self.settings = Settings(None).serialize().get(self.key_settings)
self.settings.update(self.ori_settings)
# resolve settings paths
p = self._resolve_path(self.settings[self.key_settings_dotpath])
self.settings[self.key_settings_dotpath] = p
p = self._resolve_path(self.settings[self.key_settings_workdir])
self.settings[self.key_settings_workdir] = p
if self.debug:
self.log.dbg('settings: {}'.format(self.settings))
# dotfiles
self.ori_dotfiles = self._get_entry(dic, self.key_dotfiles)
self.dotfiles = deepcopy(self.ori_dotfiles)
keys = list(self.dotfiles.keys())
if len(keys) != len(set(keys)):
dups = [x for x in keys if keys.count(x) > 1]
err = 'duplicate dotfile keys found: {}'.format(dups)
raise YamlException(err)
self.dotfiles = self._norm_dotfiles(self.dotfiles)
if self.debug:
self.log.dbg('dotfiles: {}'.format(self.dotfiles))
# profiles
self.ori_profiles = self._get_entry(dic, self.key_profiles)
self.profiles = deepcopy(self.ori_profiles)
if self.debug:
self.log.dbg('profiles: {}'.format(self.profiles))
# actions
self.ori_actions = self._get_entry(dic, self.key_actions,
mandatory=False)
self.actions = deepcopy(self.ori_actions)
self.actions = self._norm_actions(self.actions)
if self.debug:
self.log.dbg('actions: {}'.format(self.actions))
# trans_r
key = self.key_trans_r
if self.old_key_trans_r in dic:
self.log.warn('\"trans\" is deprecated, please use \"trans_read\"')
dic[self.key_trans_r] = dic[self.old_key_trans_r]
del dic[self.old_key_trans_r]
self.ori_trans_r = self._get_entry(dic, key, mandatory=False)
self.trans_r = deepcopy(self.ori_trans_r)
if self.debug:
self.log.dbg('trans_r: {}'.format(self.trans_r))
# trans_w
self.ori_trans_w = self._get_entry(dic, self.key_trans_w,
mandatory=False)
self.trans_w = deepcopy(self.ori_trans_w)
if self.debug:
self.log.dbg('trans_w: {}'.format(self.trans_w))
# variables
self.ori_variables = self._get_entry(dic,
self.key_variables,
mandatory=False)
self.variables = deepcopy(self.ori_variables)
if self.debug:
self.log.dbg('variables: {}'.format(self.variables))
# dynvariables
self.ori_dvariables = self._get_entry(dic,
self.key_dvariables,
mandatory=False)
self.dvariables = deepcopy(self.ori_dvariables)
if self.debug:
self.log.dbg('dvariables: {}'.format(self.dvariables))
def _resolve_dotfile_paths(self):
"""resolve dotfile paths"""
for dotfile in self.dotfiles.values():
src = dotfile[self.key_dotfile_src]
src = os.path.join(self.settings[self.key_settings_dotpath], src)
dotfile[self.key_dotfile_src] = self._resolve_path(src)
dst = dotfile[self.key_dotfile_dst]
dotfile[self.key_dotfile_dst] = self._resolve_path(dst)
def _merge_and_apply_variables(self):
"""
resolve all variables across the config
apply them to any needed entries
and return the full list of variables
"""
# first construct the list of variables
var = self._get_variables_dict(self.profile, seen=[self.profile])
dvar = self._get_dvariables_dict(self.profile, seen=[self.profile])
# recursive resolve variables
allvars = var.copy()
allvars.update(dvar)
if self.debug:
self.log.dbg('all variables: {}'.format(allvars))
t = Templategen(variables=allvars)
for k in allvars.keys():
val = allvars[k]
while Templategen.var_is_template(val):
val = t.generate_string(val)
allvars[k] = val
t.update_variables(allvars)
# exec dynvariables
for k in dvar.keys():
ret, out = shell(allvars[k])
if not ret:
err = 'command \"{}\" failed: {}'.format(allvars[k], out)
self.log.err(err)
raise YamlException(err)
allvars[k] = out
if self.debug:
self.log.dbg('variables:')
for k, v in allvars.items():
self.log.dbg('\t\"{}\": {}'.format(k, v))
if self.debug:
self.log.dbg('resolve all uses of variables in config')
# now resolve blocks
t = Templategen(variables=allvars)
# dotfiles entries
for k, v in self.dotfiles.items():
# src
src = v.get(self.key_dotfile_src)
v[self.key_dotfile_src] = t.generate_string(src)
# dst
dst = v.get(self.key_dotfile_dst)
v[self.key_dotfile_dst] = t.generate_string(dst)
# actions
new = []
for a in v.get(self.key_dotfile_actions, []):
new.append(t.generate_string(a))
if new:
if self.debug:
self.log.dbg('resolved: {}'.format(new))
v[self.key_dotfile_actions] = new
# external actions paths
new = []
for p in self.settings.get(self.key_import_actions, []):
new.append(t.generate_string(p))
if new:
if self.debug:
self.log.dbg('resolved: {}'.format(new))
self.settings[self.key_import_actions] = new
# external config paths
new = []
for p in self.settings.get(self.key_import_configs, []):
new.append(t.generate_string(p))
if new:
if self.debug:
self.log.dbg('resolved: {}'.format(new))
self.settings[self.key_import_configs] = new
# external variables paths
new = []
for p in self.settings.get(self.key_import_variables, []):
new.append(t.generate_string(p))
if new:
if self.debug:
self.log.dbg('resolved: {}'.format(new))
self.settings[self.key_import_variables] = new
# external profiles dotfiles
for k, v in self.profiles.items():
new = []
for p in v.get(self.key_import_profile_dfs, []):
new.append(t.generate_string(p))
if new:
if self.debug:
self.log.dbg('resolved: {}'.format(new))
v[self.key_import_profile_dfs] = new
return allvars
def _norm_actions(self, actions):
"""
ensure each action is explicitly either pre or post
action entry of the form {action_key: (pre|post, action)}
"""
if not actions:
return actions
new = {}
for k, v in actions.items():
if k == self.action_pre or k == self.action_post:
for key, action in v.items():
new[key] = (k, action)
else:
new[k] = (self.action_post, v)
return new
def _norm_dotfiles(self, dotfiles):
"""normalize dotfiles entries"""
if not dotfiles:
return dotfiles
new = {}
for k, v in dotfiles.items():
# add 'src' key if not present
if self.key_dotfile_src not in v:
v[self.key_dotfile_src] = k
new[k] = v
else:
new[k] = v
# fix deprecated trans key
if self.old_key_trans_r in v:
msg = '\"trans\" is deprecated, please use \"trans_read\"'
self.log.warn(msg)
v[self.key_trans_r] = v[self.old_key_trans_r]
del v[self.old_key_trans_r]
new[k] = v
return new
def _get_variables_dict(self, profile, seen, sub=False):
"""return enriched variables"""
variables = {}
if not sub:
# add profile variable
if profile:
variables['profile'] = profile
# add some more variables
p = self.settings.get(self.key_settings_dotpath)
p = self._resolve_path(p)
variables['_dotdrop_dotpath'] = p
variables['_dotdrop_cfgpath'] = self._resolve_path(self.path)
p = self.settings.get(self.key_settings_workdir)
p = self._resolve_path(p)
variables['_dotdrop_workdir'] = p
# variables
variables.update(self.variables)
if not profile or profile not in self.profiles.keys():
return variables
# profile entry
pentry = self.profiles.get(profile)
# inherit profile variables
for inherited_profile in pentry.get(self.key_profile_include, []):
if inherited_profile == profile or inherited_profile in seen:
raise YamlException('\"include\" loop')
seen.append(inherited_profile)
new = self._get_variables_dict(inherited_profile, seen, sub=True)
variables.update(new)
# overwrite with profile variables
for k, v in pentry.get(self.key_profile_variables, {}).items():
variables[k] = v
return variables
def _get_dvariables_dict(self, profile, seen, sub=False):
"""return dynvariables"""
variables = {}
# dynvariables
variables.update(self.dvariables)
if not profile or profile not in self.profiles.keys():
return variables
# profile entry
pentry = self.profiles.get(profile)
# inherit profile dynvariables
for inherited_profile in pentry.get(self.key_profile_include, []):
if inherited_profile == profile or inherited_profile in seen:
raise YamlException('\"include\" loop')
seen.append(inherited_profile)
new = self._get_dvariables_dict(inherited_profile, seen, sub=True)
variables.update(new)
# overwrite with profile dynvariables
for k, v in pentry.get(self.key_profile_dvariables, {}).items():
variables[k] = v
return variables
def _is_glob(self, path):
"""quick test if path is a glob"""
return '*' in path or '?' in path
def _glob_paths(self, paths):
"""glob a list of paths"""
if not isinstance(paths, list):
paths = [paths]
res = []
for p in paths:
if not self._is_glob(p):
res.append(p)
continue
p = os.path.expanduser(p)
new = glob.glob(p)
if not new:
raise YamlException('bad path: {}'.format(p))
res.extend(new)
return res
def _import_variables(self, paths):
"""import external variables from paths"""
if not paths:
return
paths = self._glob_paths(paths)
for p in paths:
path = self._resolve_path(p)
if self.debug:
self.log.dbg('import variables from {}'.format(path))
self.variables = self._import_sub(path, self.key_variables,
self.variables,
mandatory=False)
self.dvariables = self._import_sub(path, self.key_dvariables,
self.dvariables,
mandatory=False)
def _import_actions(self, paths):
"""import external actions from paths"""
if not paths:
return
paths = self._glob_paths(paths)
for p in paths:
path = self._resolve_path(p)
if self.debug:
self.log.dbg('import actions from {}'.format(path))
self.actions = self._import_sub(path, self.key_actions,
self.actions, mandatory=False,
patch_func=self._norm_actions)
def _resolve_imports(self):
"""handle all the imports"""
# settings -> import_variables
imp = self.settings.get(self.key_import_variables, None)
self._import_variables(imp)
# settings -> import_actions
imp = self.settings.get(self.key_import_actions, None)
self._import_actions(imp)
# profiles -> import
for k, v in self.profiles.items():
imp = v.get(self.key_import_profile_dfs, None)
if not imp:
continue
if self.debug:
self.log.dbg('import dotfiles for profile {}'.format(k))
paths = self._glob_paths(imp)
for p in paths:
current = v.get(self.key_dotfiles, [])
path = self._resolve_path(p)
current = self._import_sub(path, self.key_dotfiles,
current, mandatory=False)
v[self.key_dotfiles] = current
def _import_configs(self):
"""import configs from external file"""
# settings -> import_configs
imp = self.settings.get(self.key_import_configs, None)
if not imp:
return
paths = self._glob_paths(imp)
for path in paths:
path = self._resolve_path(path)
if self.debug:
self.log.dbg('import config from {}'.format(path))
sub = CfgYaml(path, debug=self.debug)
# settings is ignored
self.dotfiles = self._merge_dict(self.dotfiles, sub.dotfiles)
self.profiles = self._merge_dict(self.profiles, sub.profiles)
self.actions = self._merge_dict(self.actions, sub.actions)
self.trans_r = self._merge_dict(self.trans_r, sub.trans_r)
self.trans_w = self._merge_dict(self.trans_w, sub.trans_w)
self.variables = self._merge_dict(self.variables,
sub.variables)
self.dvariables = self._merge_dict(self.dvariables,
sub.dvariables)
def _resolve_rest(self):
"""resolve some other parts of the config"""
# profile -> ALL
for k, v in self.profiles.items():
dfs = v.get(self.key_profile_dotfiles, None)
if not dfs:
continue
if self.key_all in dfs:
if self.debug:
self.log.dbg('add ALL to profile {}'.format(k))
v[self.key_profile_dotfiles] = list(self.dotfiles.keys())
# profiles -> include other profile
        for k in self.profiles:
            self._rec_resolve_profile_include(k)
def _rec_resolve_profile_include(self, profile):
"""
        recursively resolve the includes of other profiles:
* dotfiles
* actions
"""
this_profile = self.profiles[profile]
# include
dotfiles = this_profile.get(self.key_profile_dotfiles, [])
actions = this_profile.get(self.key_profile_actions, [])
includes = this_profile.get(self.key_profile_include, None)
if not includes:
# nothing to include
return dotfiles, actions
if self.debug:
self.log.dbg('{} includes: {}'.format(profile, ','.join(includes)))
self.log.dbg('{} dotfiles before include: {}'.format(profile,
dotfiles))
self.log.dbg('{} actions before include: {}'.format(profile,
actions))
seen = []
for i in uniq_list(includes):
# ensure no include loop occurs
if i in seen:
                msg = 'include loop detected in profile \"{}\"'.format(profile)
                raise YamlException(msg)
seen.append(i)
            # ensure the included profile exists
if i not in self.profiles.keys():
self.log.warn('include unknown profile: {}'.format(i))
continue
# recursive resolve
o_dfs, o_actions = self._rec_resolve_profile_include(i)
# merge dotfile keys
dotfiles.extend(o_dfs)
this_profile[self.key_profile_dotfiles] = uniq_list(dotfiles)
# merge actions keys
actions.extend(o_actions)
this_profile[self.key_profile_actions] = uniq_list(actions)
dotfiles = this_profile.get(self.key_profile_dotfiles, [])
actions = this_profile.get(self.key_profile_actions, [])
if self.debug:
self.log.dbg('{} dotfiles after include: {}'.format(profile,
dotfiles))
self.log.dbg('{} actions after include: {}'.format(profile,
actions))
# since dotfiles and actions are resolved here
# and variables have been already done at the beginning
# of the parsing, we can clear these include
self.profiles[profile][self.key_profile_include] = None
return dotfiles, actions
def _resolve_path(self, path):
"""resolve a path either absolute or relative to config path"""
path = os.path.expanduser(path)
        if not os.path.isabs(path):
            d = os.path.dirname(self.path)
            path = os.path.join(d, path)
        return os.path.normpath(path)
def _import_sub(self, path, key, current,
mandatory=False, patch_func=None):
"""
import the block "key" from "path"
and merge it with "current"
patch_func is applied before merge if defined
"""
if self.debug:
self.log.dbg('import \"{}\" from \"{}\"'.format(key, path))
self.log.dbg('current: {}'.format(current))
extdict = self._load_yaml(path)
new = self._get_entry(extdict, key, mandatory=mandatory)
if patch_func:
new = patch_func(new)
if not new:
self.log.warn('no \"{}\" imported from \"{}\"'.format(key, path))
            # nothing imported, keep the current entries
            return current
if self.debug:
self.log.dbg('found: {}'.format(new))
if isinstance(current, dict) and isinstance(new, dict):
# imported entries get more priority than current
current = self._merge_dict(new, current)
elif isinstance(current, list) and isinstance(new, list):
current = current + new
else:
raise YamlException('invalid import {} from {}'.format(key, path))
if self.debug:
self.log.dbg('new \"{}\": {}'.format(key, current))
return current
def _merge_dict(self, high, low):
"""merge low into high"""
return {**low, **high}
def _get_entry(self, dic, key, mandatory=True):
"""return entry from yaml dictionary"""
if key not in dic:
if mandatory:
raise YamlException('invalid config: no {} found'.format(key))
dic[key] = {}
return dic[key]
if mandatory and not dic[key]:
            # ensure the value is not None
dic[key] = {}
return dic[key]
def _load_yaml(self, path):
"""load a yaml file to a dict"""
content = {}
if not os.path.exists(path):
raise YamlException('config path not found: {}'.format(path))
try:
content = self._yaml_load(path)
except Exception as e:
self.log.err(e)
raise YamlException('invalid config: {}'.format(path))
return content
def _new_profile(self, key):
"""add a new profile if it doesn't exist"""
if key not in self.profiles.keys():
# update yaml_dict
self.yaml_dict[self.key_profiles][key] = {
self.key_profile_dotfiles: []
}
if self.debug:
self.log.dbg('adding new profile: {}'.format(key))
self.dirty = True
def add_dotfile_to_profile(self, dotfile_key, profile_key):
"""add an existing dotfile key to a profile_key"""
self._new_profile(profile_key)
profile = self.yaml_dict[self.key_profiles][profile_key]
if dotfile_key not in profile[self.key_profile_dotfiles]:
profile[self.key_profile_dotfiles].append(dotfile_key)
if self.debug:
msg = 'add \"{}\" to profile \"{}\"'.format(dotfile_key,
profile_key)
self.log.dbg(msg)
self.dirty = True
return self.dirty
def add_dotfile(self, key, src, dst, link):
"""add a new dotfile"""
if key in self.dotfiles.keys():
return False
if self.debug:
self.log.dbg('adding new dotfile: {}'.format(key))
df_dict = {
self.key_dotfile_src: src,
self.key_dotfile_dst: dst,
}
dfl = self.settings[self.key_settings_link_dotfile_default]
if str(link) != dfl:
df_dict[self.key_dotfile_link] = str(link)
self.yaml_dict[self.key_dotfiles][key] = df_dict
self.dirty = True
def del_dotfile(self, key):
"""remove this dotfile from config"""
if key not in self.yaml_dict[self.key_dotfiles]:
self.log.err('key not in dotfiles: {}'.format(key))
return False
if self.debug:
self.log.dbg('remove dotfile: {}'.format(key))
del self.yaml_dict[self.key_dotfiles][key]
if self.debug:
dfs = self.yaml_dict[self.key_dotfiles]
self.log.dbg('new dotfiles: {}'.format(dfs))
self.dirty = True
return True
def del_dotfile_from_profile(self, df_key, pro_key):
"""remove this dotfile from that profile"""
if df_key not in self.dotfiles.keys():
self.log.err('key not in dotfiles: {}'.format(df_key))
return False
if pro_key not in self.profiles.keys():
self.log.err('key not in profile: {}'.format(pro_key))
return False
# get the profile dictionary
profile = self.yaml_dict[self.key_profiles][pro_key]
if df_key not in profile[self.key_profile_dotfiles]:
return True
if self.debug:
dfs = profile[self.key_profile_dotfiles]
self.log.dbg('{} profile dotfiles: {}'.format(pro_key, dfs))
self.log.dbg('remove {} from profile {}'.format(df_key, pro_key))
profile[self.key_profile_dotfiles].remove(df_key)
if self.debug:
dfs = profile[self.key_profile_dotfiles]
self.log.dbg('{} profile dotfiles: {}'.format(pro_key, dfs))
self.dirty = True
return True
def _fix_deprecated(self, yamldict):
"""fix deprecated entries"""
self._fix_deprecated_link_by_default(yamldict)
self._fix_deprecated_dotfile_link(yamldict)
def _fix_deprecated_link_by_default(self, yamldict):
"""fix deprecated link_by_default"""
key = 'link_by_default'
newkey = self.key_imp_link
if self.key_settings not in yamldict:
return
if not yamldict[self.key_settings]:
return
config = yamldict[self.key_settings]
if key not in config:
return
if config[key]:
config[newkey] = self.lnk_link
else:
config[newkey] = self.lnk_nolink
del config[key]
self.log.warn('deprecated \"link_by_default\"')
self.dirty = True
def _fix_deprecated_dotfile_link(self, yamldict):
"""fix deprecated link in dotfiles"""
if self.key_dotfiles not in yamldict:
return
if not yamldict[self.key_dotfiles]:
return
for k, dotfile in yamldict[self.key_dotfiles].items():
new = self.lnk_nolink
if self.key_dotfile_link in dotfile and \
type(dotfile[self.key_dotfile_link]) is bool:
# patch link: <bool>
cur = dotfile[self.key_dotfile_link]
new = self.lnk_nolink
if cur:
new = self.lnk_link
dotfile[self.key_dotfile_link] = new
self.dirty = True
self.log.warn('deprecated \"link\" value')
elif self.key_dotfile_link_children in dotfile and \
type(dotfile[self.key_dotfile_link_children]) is bool:
# patch link_children: <bool>
cur = dotfile[self.key_dotfile_link_children]
new = self.lnk_nolink
if cur:
new = self.lnk_children
del dotfile[self.key_dotfile_link_children]
dotfile[self.key_dotfile_link] = new
self.dirty = True
self.log.warn('deprecated \"link_children\" value')
def _clear_none(self, dic):
"""recursively delete all none/empty values in a dictionary."""
new = {}
for k, v in dic.items():
newv = v
if isinstance(v, dict):
newv = self._clear_none(v)
if not newv:
# no empty dict
continue
if newv is None:
# no None value
continue
if isinstance(newv, list) and not newv:
# no empty list
continue
new[k] = newv
return new
def save(self):
"""save this instance and return True if saved"""
if not self.dirty:
return False
content = self._clear_none(self.dump())
# make sure we have the base entries
if self.key_settings not in content:
content[self.key_settings] = None
if self.key_dotfiles not in content:
content[self.key_dotfiles] = None
if self.key_profiles not in content:
content[self.key_profiles] = None
# save to file
if self.debug:
self.log.dbg('saving to {}'.format(self.path))
try:
self._yaml_dump(content, self.path)
except Exception as e:
self.log.err(e)
raise YamlException('error saving config: {}'.format(self.path))
self.dirty = False
return True
def dump(self):
"""dump the config dictionary"""
return self.yaml_dict
def _yaml_load(self, path):
"""load from yaml"""
with open(path, 'r') as f:
y = yaml()
y.typ = 'rt'
content = y.load(f)
return content
def _yaml_dump(self, content, path):
"""dump to yaml"""
        with open(path, 'w') as f:
y = yaml()
y.default_flow_style = False
y.indent = 2
y.typ = 'rt'
y.dump(content, f)
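As an aside, the merge precedence used by `_merge_dict` above (entries of the first, "high" dictionary win over the second, which is how imported entries take priority over current ones) can be reproduced standalone:

```python
def merge_dict(high, low):
    """merge low into high: keys present in high take precedence"""
    return {**low, **high}


# imported entries (high) override current ones (low)
merged = merge_dict({'editor': 'vim'}, {'editor': 'nano', 'shell': 'zsh'})
print(merged)  # -> {'editor': 'vim', 'shell': 'zsh'}
```

Note that `{**low, **high}` unpacks `low` first, so any key also present in `high` is overwritten by the `high` value.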


@@ -10,7 +10,7 @@ import filecmp
# local imports
from dotdrop.logger import Logger
-import dotdrop.utils as utils
+from dotdrop.utils import must_ignore, uniq_list, diff
class Comparator:
@@ -43,7 +43,7 @@ class Comparator:
"""compare a file"""
if self.debug:
self.log.dbg('compare file {} with {}'.format(left, right))
-        if utils.must_ignore([left, right], ignore, debug=self.debug):
+        if must_ignore([left, right], ignore, debug=self.debug):
if self.debug:
self.log.dbg('ignoring diff {} and {}'.format(left, right))
return ''
@@ -55,7 +55,7 @@ class Comparator:
self.log.dbg('compare directory {} with {}'.format(left, right))
if not os.path.exists(right):
return ''
-        if utils.must_ignore([left, right], ignore, debug=self.debug):
+        if must_ignore([left, right], ignore, debug=self.debug):
if self.debug:
self.log.dbg('ignoring diff {} and {}'.format(left, right))
return ''
@@ -68,15 +68,15 @@ class Comparator:
# handle files only in deployed dir
for i in comp.left_only:
-            if utils.must_ignore([os.path.join(left, i)],
-                                 ignore, debug=self.debug):
+            if must_ignore([os.path.join(left, i)],
+                           ignore, debug=self.debug):
continue
ret.append('=> \"{}\" does not exist on local\n'.format(i))
# handle files only in dotpath dir
for i in comp.right_only:
-            if utils.must_ignore([os.path.join(right, i)],
-                                 ignore, debug=self.debug):
+            if must_ignore([os.path.join(right, i)],
+                           ignore, debug=self.debug):
continue
ret.append('=> \"{}\" does not exist in dotdrop\n'.format(i))
@@ -85,8 +85,8 @@ class Comparator:
for i in funny:
lfile = os.path.join(left, i)
rfile = os.path.join(right, i)
-            if utils.must_ignore([lfile, rfile],
-                                 ignore, debug=self.debug):
+            if must_ignore([lfile, rfile],
+                           ignore, debug=self.debug):
continue
short = os.path.basename(lfile)
# file vs dir
@@ -95,12 +95,12 @@ class Comparator:
# content is different
funny = comp.diff_files
funny.extend(comp.funny_files)
-        funny = list(set(funny))
+        funny = uniq_list(funny)
for i in funny:
lfile = os.path.join(left, i)
rfile = os.path.join(right, i)
-            if utils.must_ignore([lfile, rfile],
-                                 ignore, debug=self.debug):
+            if must_ignore([lfile, rfile],
+                           ignore, debug=self.debug):
continue
diff = self._diff(lfile, rfile, header=True)
ret.append(diff)
@@ -115,9 +115,9 @@ class Comparator:
def _diff(self, left, right, header=False):
"""diff using the unix tool diff"""
-        diff = utils.diff(left, right, raw=False,
-                          opts=self.diffopts, debug=self.debug)
+        out = diff(left, right, raw=False,
+                   opts=self.diffopts, debug=self.debug)
if header:
lshort = os.path.basename(left)
-            diff = '=> diff \"{}\":\n{}'.format(lshort, diff)
-        return diff
+            out = '=> diff \"{}\":\n{}'.format(lshort, out)
+        return out

File diff suppressed because it is too large

dotdrop/dictparser.py Normal file

@@ -0,0 +1,38 @@
"""
author: deadc0de6 (https://github.com/deadc0de6)
Copyright (c) 2019, deadc0de6
dictionary parser abstract class
"""
from dotdrop.logger import Logger
class DictParser:
log = Logger()
@classmethod
def _adjust_yaml_keys(cls, value):
"""adjust value for object 'cls'"""
return value
@classmethod
def parse(cls, key, value):
"""parse (key,value) and construct object 'cls'"""
tmp = value
try:
tmp = value.copy()
except AttributeError:
pass
newv = cls._adjust_yaml_keys(tmp)
if not key:
return cls(**newv)
return cls(key=key, **newv)
@classmethod
def parse_dict(cls, items):
"""parse a dictionary and construct object 'cls'"""
if not items:
return []
return [cls.parse(k, v) for k, v in items.items()]
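To illustrate the contract of the new `DictParser` base class, here is a self-contained sketch with a hypothetical `Item` subclass (`Item` and its `destination` key are illustrative, not part of dotdrop; the `Logger` attribute is omitted for brevity):

```python
class DictParser:
    @classmethod
    def _adjust_yaml_keys(cls, value):
        """hook: subclasses may rename yaml keys to constructor args"""
        return value

    @classmethod
    def parse(cls, key, value):
        """parse (key,value) and construct object 'cls'"""
        tmp = value
        try:
            tmp = value.copy()
        except AttributeError:
            pass
        newv = cls._adjust_yaml_keys(tmp)
        if not key:
            return cls(**newv)
        return cls(key=key, **newv)

    @classmethod
    def parse_dict(cls, items):
        """parse a dictionary and construct one object per entry"""
        if not items:
            return []
        return [cls.parse(k, v) for k, v in items.items()]


class Item(DictParser):
    """hypothetical subclass mapping the yaml key 'destination' to 'dst'"""
    def __init__(self, key, src, dst=None):
        self.key, self.src, self.dst = key, src, dst

    @classmethod
    def _adjust_yaml_keys(cls, value):
        value['dst'] = value.pop('destination', None)
        return value


items = Item.parse_dict({'f_vimrc': {'src': 'vimrc', 'destination': '~/.vimrc'}})
```

The `value.copy()` guarded by `AttributeError` lets `parse` accept non-dict values unchanged while protecting the caller's dictionary from the key renaming done in `_adjust_yaml_keys`.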


@@ -15,9 +15,10 @@ from dotdrop.templategen import Templategen
from dotdrop.installer import Installer
from dotdrop.updater import Updater
from dotdrop.comparator import Comparator
from dotdrop.config import Cfg
from dotdrop.utils import get_tmpdir, remove, strip_home, run
from dotdrop.utils import get_tmpdir, remove, strip_home, \
run, uniq_list, patch_ignores
from dotdrop.linktypes import LinkTypes
from dotdrop.exceptions import YamlException
LOG = Logger()
TRANS_SUFFIX = 'trans'
@@ -27,7 +28,7 @@ TRANS_SUFFIX = 'trans'
###########################################################
def action_executor(o, dotfile, actions, defactions, templater, post=False):
def action_executor(o, actions, defactions, templater, post=False):
"""closure for action execution"""
def execute():
"""
@@ -70,9 +71,14 @@ def action_executor(o, dotfile, actions, defactions, templater, post=False):
def cmd_install(o):
"""install dotfiles for this profile"""
dotfiles = o.dotfiles
prof = o.conf.get_profile(o.profile)
pro_pre_actions = prof.get_pre_actions()
pro_post_actions = prof.get_post_actions()
if o.install_keys:
# filtered dotfiles to install
dotfiles = [d for d in dotfiles if d.key in set(o.install_keys)]
uniq = uniq_list(o.install_keys)
dotfiles = [d for d in dotfiles if d.key in uniq]
if not dotfiles:
msg = 'no dotfile to install for this profile (\"{}\")'
LOG.warn(msg.format(o.profile))
@@ -92,20 +98,27 @@ def cmd_install(o):
backup_suffix=o.install_backup_suffix)
installed = 0
tvars = t.add_tmp_vars()
# execute profile pre-action
if o.debug:
LOG.dbg('execute profile pre actions')
ret, err = action_executor(o, pro_pre_actions, [], t, post=False)()
if not ret:
return False
# install each dotfile
for dotfile in dotfiles:
# add dotfile variables
t.restore_vars(tvars)
newvars = dotfile.get_vars()
newvars = dotfile.get_dotfile_variables()
t.add_tmp_vars(newvars=newvars)
preactions = []
if not o.install_temporary and dotfile.actions \
and Cfg.key_actions_pre in dotfile.actions:
for action in dotfile.actions[Cfg.key_actions_pre]:
preactions.append(action)
defactions = o.install_default_actions[Cfg.key_actions_pre]
pre_actions_exec = action_executor(o, dotfile, preactions,
defactions, t, post=False)
if not o.install_temporary:
preactions.extend(dotfile.get_pre_actions())
defactions = o.install_default_actions_pre
pre_actions_exec = action_executor(o, preactions, defactions,
t, post=False)
if o.debug:
LOG.dbg('installing {}'.format(dotfile))
@@ -132,16 +145,39 @@ def cmd_install(o):
if os.path.exists(tmp):
remove(tmp)
if r:
if not o.install_temporary and \
Cfg.key_actions_post in dotfile.actions:
defactions = o.install_default_actions[Cfg.key_actions_post]
postactions = dotfile.actions[Cfg.key_actions_post]
post_actions_exec = action_executor(o, dotfile, postactions,
defactions, t, post=True)
# dotfile was installed
if not o.install_temporary:
defactions = o.install_default_actions_post
postactions = dotfile.get_post_actions()
post_actions_exec = action_executor(o, postactions, defactions,
t, post=True)
post_actions_exec()
installed += 1
elif not r and err:
LOG.err('installing \"{}\" failed: {}'.format(dotfile.key, err))
elif not r:
# dotfile was NOT installed
if o.install_force_action:
# pre-actions
if o.debug:
LOG.dbg('force pre action execution ...')
pre_actions_exec()
# post-actions
LOG.dbg('force post action execution ...')
postactions = dotfile.get_post_actions()
post_actions_exec = action_executor(o, postactions, defactions,
t, post=True)
post_actions_exec()
if err:
LOG.err('installing \"{}\" failed: {}'.format(dotfile.key,
err))
# execute profile post-action
if installed > 0 or o.install_force_action:
if o.debug:
LOG.dbg('execute profile post actions')
ret, err = action_executor(o, pro_post_actions, [], t, post=False)()
if not ret:
return False
if o.install_temporary:
LOG.log('\ninstalled to tmp \"{}\".'.format(tmpdir))
LOG.log('\n{} dotfile(s) installed.'.format(installed))
@@ -185,6 +221,8 @@ def cmd_compare(o, tmp):
tmpsrc = None
if dotfile.trans_r:
# apply transformation
if o.debug:
LOG.dbg('applying transformation before comparing')
tmpsrc = apply_trans(o.dotpath, dotfile, debug=o.debug)
if not tmpsrc:
# could not apply trans
@@ -209,6 +247,7 @@ def cmd_compare(o, tmp):
same = False
continue
ignores = list(set(o.compare_ignore + dotfile.cmpignore))
ignores = patch_ignores(ignores, dotfile.dst, debug=o.debug)
diff = comp.compare(insttmp, dotfile.dst, ignore=ignores)
if tmpsrc:
# clean tmp transformed dotfile if any
@@ -253,7 +292,10 @@ def cmd_update(o):
if o.debug:
LOG.dbg('dotfile to update: {}'.format(paths))
updater = Updater(o.dotpath, o.dotfiles, o.variables,
updater = Updater(o.dotpath, o.variables,
o.conf.get_dotfile,
o.conf.get_dotfile_by_dst,
o.conf.path_to_dotfile_dst,
dry=o.dry, safe=o.safe, debug=o.debug,
ignore=ignore, showpatch=showpatch)
if not iskey:
@@ -329,8 +371,7 @@ def cmd_importer(o):
LOG.err('importing \"{}\" failed!'.format(path))
ret = False
continue
retconf, dotfile = o.conf.new(src, dst, o.profile,
linktype, debug=o.debug)
retconf = o.conf.new(src, dst, linktype, o.profile)
if retconf:
LOG.sub('\"{}\" imported'.format(path))
cnt += 1
@@ -355,7 +396,7 @@ def cmd_list_profiles(o):
def cmd_list_files(o):
"""list all dotfiles for a specific profile"""
if o.profile not in o.profiles:
if o.profile not in [p.key for p in o.profiles]:
LOG.warn('unknown profile \"{}\"'.format(o.profile))
return
what = 'Dotfile(s)'
@@ -375,26 +416,95 @@ def cmd_list_files(o):
def cmd_detail(o):
"""list details on all files for all dotfile entries"""
if o.profile not in o.profiles:
if o.profile not in [p.key for p in o.profiles]:
LOG.warn('unknown profile \"{}\"'.format(o.profile))
return
dotfiles = o.dotfiles
if o.detail_keys:
# filtered dotfiles to install
dotfiles = [d for d in dotfiles if d.key in set(o.details_keys)]
uniq = uniq_list(o.details_keys)
dotfiles = [d for d in dotfiles if d.key in uniq]
LOG.emph('dotfiles details for profile \"{}\":\n'.format(o.profile))
for d in dotfiles:
_detail(o.dotpath, d)
LOG.log('')
def cmd_remove(o):
"""remove dotfile from dotpath and from config"""
paths = o.remove_path
iskey = o.remove_iskey
if not paths:
LOG.log('no dotfile to remove')
return False
if o.debug:
LOG.dbg('dotfile(s) to remove: {}'.format(','.join(paths)))
removed = []
for key in paths:
if not iskey:
# by path
dotfile = o.conf.get_dotfile_by_dst(key)
if not dotfile:
LOG.warn('{} ignored, does not exist'.format(key))
continue
k = dotfile.key
else:
# by key
dotfile = o.conf.get_dotfile(key)
if not dotfile:
LOG.warn('{} ignored, does not exist'.format(key))
continue
k = key
if o.debug:
LOG.dbg('removing {}'.format(key))
# make sure is part of the profile
if dotfile.key not in [d.key for d in o.dotfiles]:
LOG.warn('{} ignored, not associated to this profile'.format(key))
continue
profiles = o.conf.get_profiles_by_dotfile_key(k)
pkeys = ','.join([p.key for p in profiles])
if o.dry:
LOG.dry('would remove {} from {}'.format(dotfile, pkeys))
continue
msg = 'Remove \"{}\" from all these profiles: {}'.format(k, pkeys)
if o.safe and not LOG.ask(msg):
return False
if o.debug:
LOG.dbg('remove dotfile: {}'.format(dotfile))
for profile in profiles:
if not o.conf.del_dotfile_from_profile(dotfile, profile):
return False
if not o.conf.del_dotfile(dotfile):
return False
# remove dotfile from dotpath
dtpath = os.path.join(o.dotpath, dotfile.src)
remove(dtpath)
removed.append(dotfile.key)
if o.dry:
LOG.dry('new config file would be:')
LOG.raw(o.conf.dump())
else:
o.conf.save()
if removed:
LOG.log('\ndotfile(s) removed: {}'.format(','.join(removed)))
else:
LOG.log('\nno dotfile removed')
return True
###########################################################
# helpers
###########################################################
def _detail(dotpath, dotfile):
"""print details on all files under a dotfile entry"""
"""display details on all files under a dotfile entry"""
LOG.log('{} (dst: \"{}\", link: {})'.format(dotfile.key, dotfile.dst,
dotfile.link.name.lower()))
path = os.path.join(dotpath, os.path.expanduser(dotfile.src))
@@ -404,7 +514,7 @@ def _detail(dotpath, dotfile):
template = 'yes'
LOG.sub('{} (template:{})'.format(path, template))
else:
for root, dir, files in os.walk(path):
for root, _, files in os.walk(path):
for f in files:
p = os.path.join(root, f)
template = 'no'
@@ -429,8 +539,10 @@ def _select(selections, dotfiles):
def apply_trans(dotpath, dotfile, debug=False):
"""apply the read transformation to the dotfile
return None if fails and new source if succeed"""
"""
apply the read transformation to the dotfile
return None if fails and new source if succeed
"""
src = dotfile.src
new_src = '{}.{}'.format(src, TRANS_SUFFIX)
trans = dotfile.trans_r
@@ -456,10 +568,13 @@ def main():
"""entry point"""
try:
o = Options()
except ValueError as e:
LOG.err('Config error: {}'.format(str(e)))
except YamlException as e:
LOG.err('config file error: {}'.format(str(e)))
return False
if o.debug:
LOG.dbg('\n\n')
ret = True
try:
@@ -508,13 +623,18 @@ def main():
LOG.dbg('running cmd: detail')
cmd_detail(o)
elif o.cmd_remove:
# remove dotfile
if o.debug:
LOG.dbg('running cmd: remove')
cmd_remove(o)
except KeyboardInterrupt:
LOG.err('interrupted')
ret = False
if ret and o.conf.is_modified():
if ret and o.conf.save():
LOG.log('config file updated')
o.conf.save()
return ret


@@ -6,15 +6,23 @@ represents a dotfile in dotdrop
"""
from dotdrop.linktypes import LinkTypes
from dotdrop.dictparser import DictParser
from dotdrop.action import Action
class Dotfile:
class Dotfile(DictParser):
"""Represent a dotfile."""
# dotfile keys
key_noempty = 'ignoreempty'
key_trans_r = 'trans_read'
key_trans_w = 'trans_write'
def __init__(self, key, dst, src,
actions={}, trans_r=None, trans_w=None,
actions=[], trans_r=None, trans_w=None,
link=LinkTypes.NOLINK, cmpignore=[],
noempty=False, upignore=[]):
"""constructor
"""
constructor
@key: dotfile key
@dst: dotfile dst (in user's home usually)
@src: dotfile src (in dotpath)
@@ -26,39 +34,75 @@ class Dotfile:
@noempty: ignore empty template if True
@upignore: patterns to ignore when updating
"""
self.key = key
self.dst = dst
self.src = src
self.link = link
# ensure link of right type
if type(link) != LinkTypes:
raise Exception('bad value for link: {}'.format(link))
self.actions = actions
self.cmpignore = cmpignore
self.dst = dst
self.key = key
self.link = LinkTypes.get(link)
self.noempty = noempty
self.src = src
self.trans_r = trans_r
self.trans_w = trans_w
self.cmpignore = cmpignore
self.noempty = noempty
self.upignore = upignore
def get_vars(self):
"""return this dotfile templating vars"""
_vars = {}
_vars['_dotfile_abs_src'] = self.src
_vars['_dotfile_abs_dst'] = self.dst
_vars['_dotfile_key'] = self.key
_vars['_dotfile_link'] = self.link.name.lower()
if link != LinkTypes.NOLINK and \
(
(trans_r and len(trans_r) > 0)
or
(trans_w and len(trans_w) > 0)
):
msg = '[{}] transformations disabled'.format(key)
msg += ' because dotfile is linked'
self.log.warn(msg)
trans_r = []
trans_w = []
return _vars
def get_dotfile_variables(self):
"""return this dotfile specific variables"""
return {
'_dotfile_abs_src': self.src,
'_dotfile_abs_dst': self.dst,
'_dotfile_key': self.key,
'_dotfile_link': str(self.link),
}
def __str__(self):
msg = 'key:\"{}\", src:\"{}\", dst:\"{}\", link:\"{}\"'
return msg.format(self.key, self.src, self.dst, self.link.name.lower())
def get_pre_actions(self):
"""return all 'pre' actions"""
return [a for a in self.actions if a.kind == Action.pre]
def __repr__(self):
return 'dotfile({})'.format(self.__str__())
def get_post_actions(self):
"""return all 'post' actions"""
return [a for a in self.actions if a.kind == Action.post]
def get_trans_r(self):
"""return trans_r object"""
return self.trans_r
def get_trans_w(self):
"""return trans_w object"""
return self.trans_w
@classmethod
def _adjust_yaml_keys(cls, value):
"""patch dict"""
value['noempty'] = value.get(cls.key_noempty, False)
value['trans_r'] = value.get(cls.key_trans_r)
value['trans_w'] = value.get(cls.key_trans_w)
# remove old entries
value.pop(cls.key_noempty, None)
value.pop(cls.key_trans_r, None)
value.pop(cls.key_trans_w, None)
return value
def __eq__(self, other):
return self.__dict__ == other.__dict__
def __hash__(self):
return hash(self.dst) ^ hash(self.src) ^ hash(self.key)
def __str__(self):
msg = 'key:\"{}\", src:\"{}\", dst:\"{}\", link:\"{}\"'
return msg.format(self.key, self.src, self.dst, str(self.link))
def __repr__(self):
return 'dotfile({!s})'.format(self)

dotdrop/exceptions.py Normal file

@@ -0,0 +1,11 @@
"""
author: deadc0de6 (https://github.com/deadc0de6)
Copyright (c) 2019, deadc0de6
diverse exceptions
"""
class YamlException(Exception):
"""exception in CfgYaml"""
pass


@@ -211,11 +211,17 @@ class Installer:
overwrite = not self.safe
if os.path.lexists(dst):
if os.path.realpath(dst) == os.path.realpath(src):
-                err = 'ignoring "{}", link exists'.format(dst)
-                return False, err
+                msg = 'ignoring "{}", link already exists'.format(dst)
+                if self.debug:
+                    self.log.dbg(msg)
+                return True, None
if self.dry:
self.log.dry('would remove {} and link to {}'.format(dst, src))
return True, None
+            if self.showdiff:
+                with open(src, 'rb') as f:
+                    content = f.read()
+                self._diff_before_write(src, dst, content)
msg = 'Remove "{}" for link creation?'.format(dst)
if self.safe and not self.log.ask(msg):
err = 'ignoring "{}", link was not created'.format(dst)


@@ -5,3 +5,15 @@ class LinkTypes(IntEnum):
NOLINK = 0
LINK = 1
LINK_CHILDREN = 2
+    @classmethod
+    def get(cls, key, default=None):
+        try:
+            return key if isinstance(key, cls) else cls[key.upper()]
+        except KeyError:
+            if default:
+                return default
+            raise ValueError('bad {} value: "{}"'.format(cls.__name__, key))
+
+    def __str__(self):
+        return self.name.lower()
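The behaviour of the added `LinkTypes.get` helper can be checked in isolation (a reproduction of the code above, not an extension of it):

```python
from enum import IntEnum


class LinkTypes(IntEnum):
    NOLINK = 0
    LINK = 1
    LINK_CHILDREN = 2

    @classmethod
    def get(cls, key, default=None):
        """accept an enum member or a (case-insensitive) member name"""
        try:
            return key if isinstance(key, cls) else cls[key.upper()]
        except KeyError:
            if default:
                return default
            raise ValueError('bad {} value: "{}"'.format(cls.__name__, key))

    def __str__(self):
        return self.name.lower()
```

One subtlety worth noting: the `if default:` truthiness check means a falsy default such as `LinkTypes.NOLINK` (value 0) is ignored and a `ValueError` is raised anyway; an `if default is not None:` check would treat it as a valid fallback.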


@@ -16,8 +16,10 @@ class Logger:
YELLOW = '\033[93m'
BLUE = '\033[94m'
MAGENTA = '\033[95m'
+    LMAGENTA = '\033[35m'
RESET = '\033[0m'
EMPH = '\033[33m'
+    BOLD = '\033[1m'
def __init__(self):
pass
@@ -40,7 +42,8 @@ class Logger:
def err(self, string, end='\n'):
cs = self._color(self.RED)
ce = self._color(self.RESET)
-        sys.stderr.write('{}[ERR] {} {}{}'.format(cs, string, end, ce))
+        msg = '{} {}'.format(string, end)
+        sys.stderr.write('{}[ERR] {}{}'.format(cs, msg, ce))
def warn(self, string, end='\n'):
cs = self._color(self.YELLOW)
@@ -53,8 +56,10 @@ class Logger:
func = inspect.stack()[1][3]
cs = self._color(self.MAGENTA)
ce = self._color(self.RESET)
-        line = '{}[DEBUG][{}.{}] {}{}\n'
-        sys.stderr.write(line.format(cs, mod, func, string, ce))
+        cl = self._color(self.LMAGENTA)
+        bl = self._color(self.BOLD)
+        line = '{}{}[DEBUG][{}.{}]{}{} {}{}\n'
+        sys.stderr.write(line.format(bl, cl, mod, func, ce, cs, string, ce))
def dry(self, string, end='\n'):
cs = self._color(self.GREEN)


@@ -14,7 +14,8 @@ from docopt import docopt
from dotdrop.version import __version__ as VERSION
from dotdrop.linktypes import LinkTypes
from dotdrop.logger import Logger
from dotdrop.config import Cfg
from dotdrop.cfg_aggregator import CfgAggregator as Cfg
from dotdrop.action import Action
ENV_PROFILE = 'DOTDROP_PROFILE'
ENV_CONFIG = 'DOTDROP_CONFIG'
@@ -49,15 +50,17 @@ USAGE = """
{}
Usage:
dotdrop install [-VbtfndD] [-c <path>] [-p <profile>] [<key>...]
dotdrop import [-Vbd] [-c <path>] [-p <profile>] [-l <link>] <path>...
dotdrop compare [-Vb] [-c <path>] [-p <profile>]
[-o <opts>] [-C <file>...] [-i <pattern>...]
dotdrop update [-VbfdkP] [-c <path>] [-p <profile>]
[-i <pattern>...] [<path>...]
dotdrop listfiles [-VbT] [-c <path>] [-p <profile>]
dotdrop detail [-Vb] [-c <path>] [-p <profile>] [<key>...]
dotdrop list [-Vb] [-c <path>]
dotdrop install [-VbtfndDa] [-c <path>] [-p <profile>] [<key>...]
dotdrop import [-Vbd] [-c <path>] [-p <profile>]
[-l <link>] <path>...
dotdrop compare [-Vb] [-c <path>] [-p <profile>]
[-o <opts>] [-C <file>...] [-i <pattern>...]
dotdrop update [-VbfdkP] [-c <path>] [-p <profile>]
[-i <pattern>...] [<path>...]
dotdrop remove [-Vbfdk] [-c <path>] [-p <profile>] [<path>...]
dotdrop listfiles [-VbT] [-c <path>] [-p <profile>]
dotdrop detail [-Vb] [-c <path>] [-p <profile>] [<key>...]
dotdrop list [-Vb] [-c <path>]
dotdrop --help
dotdrop --version
@@ -74,6 +77,7 @@ Options:
-D --showdiff Show a diff before overwriting.
-P --show-patch Provide a one-liner to manually patch template.
-f --force Do not ask user confirmation for anything.
-a --force-actions Execute all actions even if no dotfile is installed.
-k --key Treat <path> as a dotfile key.
-V --verbose Be verbose.
-d --dry Dry run.
@@ -107,24 +111,24 @@ class Options(AttrMonitor):
if not args:
self.args = docopt(USAGE, version=VERSION)
self.log = Logger()
self.debug = self.args['--verbose']
if not self.debug and ENV_DEBUG in os.environ:
self.debug = True
self.debug = self.args['--verbose'] or ENV_DEBUG in os.environ
if ENV_NODEBUG in os.environ:
# force disabling debugs
self.debug = False
self.profile = self.args['--profile']
self.confpath = self._get_config_path()
if self.debug:
self.log.dbg('version: {}'.format(VERSION))
self.log.dbg('config file: {}'.format(self.confpath))
self._read_config(self.profile)
self._read_config()
self._apply_args()
self._fill_attr()
if ENV_NOBANNER not in os.environ \
and self.banner \
and not self.args['--no-banner']:
self._header()
self._print_attr()
self._debug_attr()
# start monitoring for bad attribute
self._set_attr_err = True
@@ -167,25 +171,18 @@ class Options(AttrMonitor):
return None
def _find_cfg(self, paths):
"""try to find the config in the paths list"""
for path in paths:
if os.path.exists(path):
return path
return None
def _header(self):
"""print the header"""
"""display the header"""
self.log.log(BANNER)
self.log.log('')
def _read_config(self, profile=None):
def _read_config(self):
"""read the config file"""
self.conf = Cfg(self.confpath, profile=profile, debug=self.debug)
self.conf = Cfg(self.confpath, self.profile, debug=self.debug)
# transform the config settings to self attribute
for k, v in self.conf.get_settings().items():
if self.debug:
self.log.dbg('setting: {}={}'.format(k, v))
self.log.dbg('new setting: {}={}'.format(k, v))
setattr(self, k, v)
def _apply_args(self):
@@ -198,6 +195,7 @@ class Options(AttrMonitor):
self.cmd_import = self.args['import']
self.cmd_update = self.args['update']
self.cmd_detail = self.args['detail']
self.cmd_remove = self.args['remove']
# adapt attributes based on arguments
self.dry = self.args['--dry']
@@ -212,18 +210,20 @@ class Options(AttrMonitor):
self.log.err('bad option for --link: {}'.format(link))
sys.exit(USAGE)
self.import_link = OPT_LINK[link]
if self.debug:
self.log.dbg('link_import value: {}'.format(self.import_link))
# "listfiles" specifics
self.listfiles_templateonly = self.args['--template']
# "install" specifics
self.install_force_action = self.args['--force-actions']
self.install_temporary = self.args['--temp']
self.install_keys = self.args['<key>']
self.install_diff = not self.args['--nodiff']
self.install_showdiff = self.showdiff or self.args['--showdiff']
self.install_backup_suffix = BACKUP_SUFFIX
self.install_default_actions = self.default_actions
self.install_default_actions_pre = [a for a in self.default_actions
if a.kind == Action.pre]
self.install_default_actions_post = [a for a in self.default_actions
if a.kind == Action.post]
# "compare" specifics
self.compare_dopts = self.args['--dopts']
self.compare_focus = self.args['--file']
@@ -239,30 +239,31 @@ class Options(AttrMonitor):
self.update_showpatch = self.args['--show-patch']
# "detail" specifics
self.detail_keys = self.args['<key>']
# "remove" specifics
self.remove_path = self.args['<path>']
self.remove_iskey = self.args['--key']
def _fill_attr(self):
"""create attributes from conf"""
# variables
self.variables = self.conf.get_variables(self.profile,
debug=self.debug).copy()
self.variables = self.conf.get_variables()
# the dotfiles
self.dotfiles = self.conf.eval_dotfiles(self.profile, self.variables,
debug=self.debug).copy()
self.dotfiles = self.conf.get_dotfiles(self.profile)
# the profiles
self.profiles = self.conf.get_profiles()
def _print_attr(self):
"""print all of this class attributes"""
def _debug_attr(self):
"""debug display all of this class attributes"""
if not self.debug:
return
self.log.dbg('options:')
self.log.dbg('CLI options:')
for att in dir(self):
if att.startswith('_'):
continue
val = getattr(self, att)
if callable(val):
continue
self.log.dbg('- {}: \"{}\"'.format(att, val))
self.log.dbg('- {}: {}'.format(att, val))
def _attr_set(self, attr):
"""error when a nonexistent attr is set"""

dotdrop/profile.py Normal file

@@ -0,0 +1,62 @@
"""
author: deadc0de6 (https://github.com/deadc0de6)
Copyright (c) 2019, deadc0de6
represent a profile in dotdrop
"""
from dotdrop.dictparser import DictParser
from dotdrop.action import Action
class Profile(DictParser):
# profile keys
key_include = 'include'
key_import = 'import'
def __init__(self, key, actions=[], dotfiles=[],
variables=[], dynvariables=[]):
"""
constructor
@key: profile key
@actions: list of action keys
@dotfiles: list of dotfile keys
@variables: list of variable keys
@dynvariables: list of interpreted variable keys
"""
self.key = key
self.actions = actions
self.dotfiles = dotfiles
self.variables = variables
self.dynvariables = dynvariables
def get_pre_actions(self):
"""return all 'pre' actions"""
return [a for a in self.actions if a.kind == Action.pre]
def get_post_actions(self):
"""return all 'post' actions"""
return [a for a in self.actions if a.kind == Action.post]
@classmethod
def _adjust_yaml_keys(cls, value):
"""patch dict"""
value.pop(cls.key_import, None)
value.pop(cls.key_include, None)
return value
def __eq__(self, other):
return self.__dict__ == other.__dict__
def __hash__(self):
return (hash(self.key) ^
hash(tuple(self.dotfiles)) ^
hash(tuple(self.included_profiles)))
def __str__(self):
msg = 'key:"{}"'
return msg.format(self.key)
def __repr__(self):
return 'profile({!s})'.format(self)
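The new `Profile.get_pre_actions`/`get_post_actions` simply filter the action list by kind. A minimal runnable sketch of that filtering, using a simplified stand-in `Action` (the real class in `dotdrop/action.py` defines its own `pre`/`post` constants):

```python
from collections import namedtuple

# simplified stand-in for dotdrop's Action; the 'pre'/'post' string
# kinds here are an assumption for illustration
Action = namedtuple('Action', ['key', 'kind'])
PRE, POST = 'pre', 'post'

def get_pre_actions(actions):
    """return only the actions that run before installation"""
    return [a for a in actions if a.kind == PRE]

def get_post_actions(actions):
    """return only the actions that run after installation"""
    return [a for a in actions if a.kind == POST]

actions = [Action('mkdir', PRE), Action('chmod', POST), Action('backup', PRE)]
print([a.key for a in get_pre_actions(actions)])   # ['mkdir', 'backup']
print([a.key for a in get_post_actions(actions)])  # ['chmod']
```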

dotdrop/settings.py Normal file

@@ -0,0 +1,91 @@
"""
author: deadc0de6 (https://github.com/deadc0de6)
Copyright (c) 2019, deadc0de6
settings block
"""
# local imports
from dotdrop.linktypes import LinkTypes
from dotdrop.dictparser import DictParser
class Settings(DictParser):
# key in yaml file
key_yaml = 'config'
# settings item keys
key_backup = 'backup'
key_banner = 'banner'
key_cmpignore = 'cmpignore'
key_create = 'create'
key_default_actions = 'default_actions'
key_dotpath = 'dotpath'
key_ignoreempty = 'ignoreempty'
key_keepdot = 'keepdot'
key_longkey = 'longkey'
key_link_dotfile_default = 'link_dotfile_default'
key_link_on_import = 'link_on_import'
key_showdiff = 'showdiff'
key_upignore = 'upignore'
key_workdir = 'workdir'
# import keys
key_import_actions = 'import_actions'
key_import_configs = 'import_configs'
key_import_variables = 'import_variables'
def __init__(self, backup=True, banner=True, cmpignore=[],
create=True, default_actions=[], dotpath='dotfiles',
ignoreempty=True, import_actions=[], import_configs=[],
import_variables=[], keepdot=False,
link_dotfile_default=LinkTypes.NOLINK,
link_on_import=LinkTypes.NOLINK, longkey=False,
showdiff=False, upignore=[], workdir='~/.config/dotdrop'):
self.backup = backup
self.banner = banner
self.create = create
self.cmpignore = cmpignore
self.default_actions = default_actions
self.dotpath = dotpath
self.ignoreempty = ignoreempty
self.import_actions = import_actions
self.import_configs = import_configs
self.import_variables = import_variables
self.keepdot = keepdot
self.longkey = longkey
self.showdiff = showdiff
self.upignore = upignore
self.workdir = workdir
self.link_dotfile_default = LinkTypes.get(link_dotfile_default)
self.link_on_import = LinkTypes.get(link_on_import)
def _serialize_seq(self, name, dic):
"""serialize attribute 'name' into 'dic'"""
seq = getattr(self, name)
dic[name] = seq
def serialize(self):
"""Return key-value pair representation of the settings"""
# Tedious, but less error-prone than introspection
dic = {
self.key_backup: self.backup,
self.key_banner: self.banner,
self.key_create: self.create,
self.key_dotpath: self.dotpath,
self.key_ignoreempty: self.ignoreempty,
self.key_keepdot: self.keepdot,
self.key_link_dotfile_default: str(self.link_dotfile_default),
self.key_link_on_import: str(self.link_on_import),
self.key_longkey: self.longkey,
self.key_showdiff: self.showdiff,
self.key_workdir: self.workdir,
}
self._serialize_seq(self.key_cmpignore, dic)
self._serialize_seq(self.key_default_actions, dic)
self._serialize_seq(self.key_import_actions, dic)
self._serialize_seq(self.key_import_configs, dic)
self._serialize_seq(self.key_import_variables, dic)
self._serialize_seq(self.key_upignore, dic)
return {self.key_yaml: dic}
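The `serialize` method maps explicit key constants back to attributes and nests everything under the `config` key. A stripped-down sketch of that pattern (attribute set reduced to two keys for brevity):

```python
class MiniSettings:
    """tiny sketch of the Settings.serialize() pattern: explicit key
    constants mapped to attributes, nested under the 'config' key"""
    key_yaml = 'config'
    key_banner = 'banner'
    key_dotpath = 'dotpath'

    def __init__(self, banner=True, dotpath='dotfiles'):
        self.banner = banner
        self.dotpath = dotpath

    def serialize(self):
        # explicit mapping: tedious, but less error-prone than introspection
        return {self.key_yaml: {
            self.key_banner: self.banner,
            self.key_dotpath: self.dotpath,
        }}

print(MiniSettings(banner=False).serialize())
# {'config': {'banner': False, 'dotpath': 'dotfiles'}}
```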

@@ -52,6 +52,8 @@ class Templategen:
self.env.globals['exists_in_path'] = jhelpers.exists_in_path
self.env.globals['basename'] = jhelpers.basename
self.env.globals['dirname'] = jhelpers.dirname
if self.debug:
self.log.dbg('template additional variables: {}'.format(variables))
def generate(self, src):
"""render template from path"""

@@ -12,7 +12,8 @@ import filecmp
# local imports
from dotdrop.logger import Logger
from dotdrop.templategen import Templategen
import dotdrop.utils as utils
from dotdrop.utils import patch_ignores, remove, get_unique_tmp_name, \
write_to_tmpfile, must_ignore
TILD = '~'
@@ -20,12 +21,17 @@ TILD = '~'
class Updater:
def __init__(self, dotpath, dotfiles, variables, dry=False, safe=True,
def __init__(self, dotpath, variables,
dotfile_key_getter, dotfile_dst_getter,
dotfile_path_normalizer,
dry=False, safe=True,
debug=False, ignore=[], showpatch=False):
"""constructor
@dotpath: path where dotfiles are stored
@dotfiles: dotfiles for this profile
@variables: dictionary of variables for the templates
@dotfile_key_getter: func to get a dotfile by key
@dotfile_dst_getter: func to get a dotfile by dst
@dotfile_path_normalizer: func to normalize dotfile dst
@dry: simulate
@safe: ask for overwrite if True
@debug: enable debug
@@ -33,8 +39,10 @@ class Updater:
@showpatch: show patch if dotfile to update is a template
"""
self.dotpath = dotpath
self.dotfiles = dotfiles
self.variables = variables
self.dotfile_key_getter = dotfile_key_getter
self.dotfile_dst_getter = dotfile_dst_getter
self.dotfile_path_normalizer = dotfile_path_normalizer
self.dry = dry
self.safe = safe
self.debug = debug
@@ -48,8 +56,7 @@ class Updater:
if not os.path.lexists(path):
self.log.err('\"{}\" does not exist!'.format(path))
return False
path = self._normalize(path)
dotfile = self._get_dotfile_by_path(path)
dotfile = self.dotfile_dst_getter(path)
if not dotfile:
return False
if self.debug:
@@ -58,19 +65,20 @@ class Updater:
def update_key(self, key):
"""update the dotfile referenced by key"""
dotfile = self._get_dotfile_by_key(key)
dotfile = self.dotfile_key_getter(key)
if not dotfile:
return False
if self.debug:
self.log.dbg('updating {} from key \"{}\"'.format(dotfile, key))
path = self._normalize(dotfile.dst)
path = self.dotfile_path_normalizer(dotfile.dst)
return self._update(path, dotfile)
def _update(self, path, dotfile):
"""update dotfile from file pointed by path"""
ret = False
new_path = None
self.ignores = list(set(self.ignore + dotfile.upignore))
ignores = list(set(self.ignore + dotfile.upignore))
self.ignores = patch_ignores(ignores, dotfile.dst, debug=self.debug)
if self.debug:
self.log.dbg('ignore pattern(s): {}'.format(self.ignores))
@@ -81,74 +89,35 @@ class Updater:
if self._ignore([path, dtpath]):
self.log.sub('\"{}\" ignored'.format(dotfile.key))
return True
if dotfile.trans_w:
# apply write transformation if any
new_path = self._apply_trans_w(path, dotfile)
if not new_path:
return False
path = new_path
if os.path.isdir(path):
ret = self._handle_dir(path, dtpath)
# apply write transformation if any
new_path = self._apply_trans_w(path, dotfile)
if not new_path:
return False
if os.path.isdir(new_path):
ret = self._handle_dir(new_path, dtpath)
else:
ret = self._handle_file(path, dtpath)
ret = self._handle_file(new_path, dtpath)
# clean temporary files
if new_path and os.path.exists(new_path):
utils.remove(new_path)
if new_path != path and os.path.exists(new_path):
remove(new_path)
return ret
def _apply_trans_w(self, path, dotfile):
"""apply write transformation to dotfile"""
trans = dotfile.trans_w
trans = dotfile.get_trans_w()
if not trans:
return path
if self.debug:
self.log.dbg('executing write transformation {}'.format(trans))
tmp = utils.get_unique_tmp_name()
tmp = get_unique_tmp_name()
if not trans.transform(path, tmp):
msg = 'transformation \"{}\" failed for {}'
self.log.err(msg.format(trans.key, dotfile.key))
if os.path.exists(tmp):
utils.remove(tmp)
remove(tmp)
return None
return tmp
def _normalize(self, path):
"""normalize the path to match dotfile"""
path = os.path.expanduser(path)
path = os.path.expandvars(path)
path = os.path.abspath(path)
home = os.path.expanduser(TILD) + os.sep
# normalize the path
if path.startswith(home):
path = path[len(home):]
path = os.path.join(TILD, path)
return path
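The removed `_normalize` rewrites absolute paths under the home directory into their `~`-prefixed form so they match dotfile `dst` entries. A standalone sketch, with the home directory passed explicitly instead of taken from `os.path.expanduser` (so the behavior is deterministic):

```python
import os

def normalize(path, home):
    """rewrite an absolute path below 'home' into its '~' form
    (sketch of Updater._normalize with the home dir made explicit)"""
    path = os.path.abspath(os.path.expandvars(path))
    prefix = home.rstrip(os.sep) + os.sep
    if path.startswith(prefix):
        path = os.path.join('~', path[len(prefix):])
    return path

print(normalize('/home/alice/.config/nvim', '/home/alice'))  # ~/.config/nvim
```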
def _get_dotfile_by_key(self, key):
"""get the dotfile matching this key"""
dotfiles = self.dotfiles
subs = [d for d in dotfiles if d.key == key]
if not subs:
self.log.err('key \"{}\" not found!'.format(key))
return None
if len(subs) > 1:
found = ','.join([d.src for d in dotfiles])
self.log.err('multiple dotfiles found: {}'.format(found))
return None
return subs[0]
def _get_dotfile_by_path(self, path):
"""get the dotfile matching this path"""
dotfiles = self.dotfiles
subs = [d for d in dotfiles if d.dst == path]
if not subs:
self.log.err('\"{}\" is not managed!'.format(path))
return None
if len(subs) > 1:
found = ','.join([d.src for d in dotfiles])
self.log.err('multiple dotfiles found: {}'.format(found))
return None
return subs[0]
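The net effect of this hunk: the `Updater` no longer owns the dotfile list and its lookup helpers; it receives lookup callables (`dotfile_key_getter`, `dotfile_dst_getter`) from the config layer instead. A minimal sketch of that dependency-injection pattern, with hypothetical dotfile data:

```python
# dotfiles indexed by the higher-level config layer (hypothetical data)
DOTFILES = {'f_vimrc': {'key': 'f_vimrc', 'dst': '~/.vimrc'}}

def get_by_key(key):
    """lookup callable handed to the updater (dotfile_key_getter)"""
    return DOTFILES.get(key)

def get_by_dst(dst):
    """lookup callable handed to the updater (dotfile_dst_getter)"""
    for df in DOTFILES.values():
        if df['dst'] == dst:
            return df
    return None

class MiniUpdater:
    """the updater keeps only the injected getters, not the dotfiles"""
    def __init__(self, dotfile_key_getter, dotfile_dst_getter):
        self.dotfile_key_getter = dotfile_key_getter
        self.dotfile_dst_getter = dotfile_dst_getter

    def update_key(self, key):
        return self.dotfile_key_getter(key)

up = MiniUpdater(get_by_key, get_by_dst)
print(up.update_key('f_vimrc')['dst'])  # ~/.vimrc
```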
def _is_template(self, path):
if not Templategen.is_template(path):
if self.debug:
@@ -160,7 +129,7 @@ class Updater:
def _show_patch(self, fpath, tpath):
"""provide a way to manually patch the template"""
content = self._resolve_template(tpath)
tmp = utils.write_to_tmpfile(content)
tmp = write_to_tmpfile(content)
cmds = ['diff', '-u', tmp, fpath, '|', 'patch', tpath]
self.log.warn('try patching with: \"{}\"'.format(' '.join(cmds)))
return False
@@ -263,7 +232,7 @@ class Updater:
self.log.dbg('rm -r {}'.format(old))
if not self._confirm_rm_r(old):
continue
utils.remove(old)
remove(old)
self.log.sub('\"{}\" dir removed'.format(old))
# handle files diff
@@ -315,7 +284,7 @@ class Updater:
continue
if self.debug:
self.log.dbg('rm {}'.format(new))
utils.remove(new)
remove(new)
self.log.sub('\"{}\" removed'.format(new))
# Recursively descend into common subdirectories.
@@ -340,7 +309,7 @@ class Updater:
return True
def _ignore(self, paths):
if utils.must_ignore(paths, self.ignores, debug=self.debug):
if must_ignore(paths, self.ignores, debug=self.debug):
if self.debug:
self.log.dbg('ignoring update for {}'.format(paths))
return True

@@ -17,6 +17,7 @@ from shutil import rmtree
from dotdrop.logger import Logger
LOG = Logger()
STAR = '*'
def run(cmd, raw=True, debug=False, checkerr=False):
@@ -48,8 +49,12 @@ def write_to_tmpfile(content):
def shell(cmd):
"""run a command in the shell (expects a string)"""
return subprocess.getoutput(cmd)
"""
run a command in the shell (expects a string)
returns True|False, output
"""
ret, out = subprocess.getstatusoutput(cmd)
return ret == 0, out
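The new `shell` contract returns a success flag alongside the output instead of the output alone. A sketch of the changed helper, using the same `subprocess.getstatusoutput` call:

```python
import subprocess

def shell(cmd):
    """run a command in the shell, returning (success, output) --
    the new contract introduced by this change"""
    ret, out = subprocess.getstatusoutput(cmd)
    return ret == 0, out

ok, out = shell('echo hello')
print(ok, out)  # True hello
ok, _ = shell('false')
print(ok)  # False
```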
def diff(src, dst, raw=True, opts='', debug=False):
@@ -66,7 +71,7 @@ def get_tmpdir():
def get_tmpfile():
"""create a temporary file"""
(fd, path) = tempfile.mkstemp(prefix='dotdrop-')
(_, path) = tempfile.mkstemp(prefix='dotdrop-')
return path
@@ -123,6 +128,8 @@ def must_ignore(paths, ignores, debug=False):
"""return true if any paths in list matches any ignore patterns"""
if not ignores:
return False
if debug:
LOG.dbg('must ignore? {} against {}'.format(paths, ignores))
for p in paths:
for i in ignores:
if fnmatch.fnmatch(p, i):
@@ -130,3 +137,35 @@ def must_ignore(paths, ignores, debug=False):
LOG.dbg('ignore \"{}\" match: {}'.format(i, p))
return True
return False
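`must_ignore` checks every path against every ignore pattern with `fnmatch` globbing. A condensed sketch of that check (the debug logging is dropped):

```python
import fnmatch

def must_ignore(paths, ignores):
    """True if any path matches any ignore pattern (fnmatch globbing)"""
    if not ignores:
        return False
    return any(fnmatch.fnmatch(p, i) for p in paths for i in ignores)

print(must_ignore(['~/.zsh/plugins/ignore-1.zsh'], ['*/plugins/ignore-*.zsh']))  # True
print(must_ignore(['~/.zshrc'], ['*/plugins/*']))  # False
```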
def uniq_list(a_list):
"""unique elements of a list while preserving order"""
new = []
for a in a_list:
if a not in new:
new.append(a)
return new
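`uniq_list` keeps only the first occurrence of each element while preserving order. An equivalent variant using a seen-set, which turns the O(n²) list-membership test into O(n):

```python
def uniq_list(a_list):
    """unique elements of a list while preserving first-seen order"""
    seen = set()
    out = []
    for a in a_list:
        if a not in seen:
            seen.add(a)
            out.append(a)
    return out

print(uniq_list(['pre', 'post', 'pre', 'naked']))  # ['pre', 'post', 'naked']
```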
def patch_ignores(ignores, prefix, debug=False):
"""allow relative ignore pattern"""
new = []
if debug:
LOG.dbg('ignores before patching: {}'.format(ignores))
for ignore in ignores:
if os.path.isabs(ignore):
# is absolute
new.append(ignore)
continue
if STAR in ignore:
if ignore.startswith(STAR) or ignore.startswith(os.sep):
# is glob
new.append(ignore)
continue
# anchor relative ignore pattern under prefix
path = os.path.join(prefix, ignore)
new.append(path)
if debug:
LOG.dbg('ignores after patching: {}'.format(new))
return new
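`patch_ignores` anchors relative ignore patterns under the dotfile destination while leaving absolute paths and leading-glob patterns untouched. A standalone sketch of the same logic (debug logging dropped):

```python
import os

STAR = '*'

def patch_ignores(ignores, prefix):
    """anchor relative ignore patterns below 'prefix'; leave absolute
    paths and patterns starting with a glob or separator untouched"""
    new = []
    for ignore in ignores:
        if os.path.isabs(ignore):
            # absolute path: keep as-is
            new.append(ignore)
            continue
        if STAR in ignore and (ignore.startswith(STAR)
                               or ignore.startswith(os.sep)):
            # leading glob: keep as-is
            new.append(ignore)
            continue
        # relative pattern: anchor under the prefix
        new.append(os.path.join(prefix, ignore))
    return new

print(patch_ignores(['b', '*.log', '/tmp/x'], '~/.config'))
# ['~/.config/b', '*.log', '/tmp/x']
```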

@@ -9,7 +9,7 @@ arch=('any')
url="https://github.com/deadc0de6/dotdrop"
license=('GPL')
groups=()
depends=('python' 'python-setuptools' 'python-jinja' 'python-docopt' 'python-pyaml')
depends=('python' 'python-setuptools' 'python-jinja' 'python-docopt' 'python-ruamel-yaml')
makedepends=('git')
provides=(dotdrop)
conflicts=(dotdrop)

@@ -10,7 +10,7 @@ pkgbase = dotdrop
depends = python-setuptools
depends = python-jinja
depends = python-docopt
depends = python-pyaml
depends = python-ruamel-yaml
source = git+https://github.com/deadc0de6/dotdrop.git#tag=v0.28.0
md5sums = SKIP

@@ -8,7 +8,7 @@ arch=('any')
url="https://github.com/deadc0de6/dotdrop"
license=('GPL')
groups=()
depends=('python' 'python-setuptools' 'python-jinja' 'python-docopt' 'python-pyaml')
depends=('python' 'python-setuptools' 'python-jinja' 'python-docopt' 'python-ruamel-yaml')
makedepends=('git')
source=("git+https://github.com/deadc0de6/dotdrop.git#tag=v${pkgver}")
md5sums=('SKIP')

@@ -1,3 +1,3 @@
Jinja2; python_version >= '3.0'
docopt; python_version >= '3.0'
PyYAML; python_version >= '3.0'
Jinja2; python_version > '3.4'
docopt; python_version > '3.4'
ruamel.yaml; python_version > '3.4'

@@ -13,7 +13,7 @@ usage example:
from docopt import docopt
import sys
import os
import yaml
from ruamel.yaml import YAML as yaml
USAGE = """
change-link.py
@@ -42,7 +42,7 @@ def main():
ignores = args['--ignore']
with open(path, 'r') as f:
content = yaml.load(f)
content = yaml(typ='safe').load(f)
for k, v in content[key].items():
if k in ignores:
continue

@@ -32,15 +32,15 @@ setup(
python_requires=REQUIRES_PYTHON,
classifiers=[
'Development Status :: 5 - Production/Stable',
'Programming Language :: Python :: 3.4',
'Programming Language :: Python :: 3.5',
'Programming Language :: Python :: 3.6',
'Programming Language :: Python :: 3.7',
'License :: OSI Approved :: GNU General Public License v3 (GPLv3)',
],
keywords='dotfiles jinja2',
packages=find_packages(exclude=['tests*']),
install_requires=['docopt', 'Jinja2', 'PyYAML'],
install_requires=['docopt', 'Jinja2', 'ruamel.yaml'],
extras_require={
'dev': ['check-manifest'],

@@ -60,8 +60,10 @@ cat > ${cfg} << _EOF
actions:
pre:
preaction: echo 'pre' > ${tmpa}/pre
preaction2: echo 'pre2' > ${tmpa}/pre2
post:
postaction: echo 'post' > ${tmpa}/post
postaction2: echo 'post2' > ${tmpa}/post2
nakedaction: echo 'naked' > ${tmpa}/naked
config:
backup: true
@@ -75,6 +77,8 @@ dotfiles:
- preaction
- postaction
- nakedaction
- preaction2
- postaction2
profiles:
p1:
dotfiles:
@@ -86,7 +90,7 @@ _EOF
echo "test" > ${tmps}/dotfiles/abc
# install
cd ${ddpath} | ${bin} install -f -c ${cfg} -p p1
cd ${ddpath} | ${bin} install -f -c ${cfg} -p p1 -V
# checks
[ ! -e ${tmpa}/pre ] && exit 1
@@ -95,6 +99,10 @@ grep pre ${tmpa}/pre >/dev/null
grep post ${tmpa}/post >/dev/null
[ ! -e ${tmpa}/naked ] && exit 1
grep naked ${tmpa}/naked >/dev/null
[ ! -e ${tmpa}/pre2 ] && exit 1
grep pre2 ${tmpa}/pre2 >/dev/null
[ ! -e ${tmpa}/post2 ] && exit 1
grep post2 ${tmpa}/post2 >/dev/null
## CLEANING
rm -rf ${tmps} ${tmpd} ${tmpa}

@@ -0,0 +1,188 @@
#!/usr/bin/env bash
# author: deadc0de6 (https://github.com/deadc0de6)
# Copyright (c) 2019, deadc0de6
#
# test compare with relative ignore patterns
# returns 1 in case of error
#
# exit on first error
#set -e
# all this crap to get current path
rl="readlink -f"
if ! ${rl} "${0}" >/dev/null 2>&1; then
rl="realpath"
if ! hash ${rl}; then
echo "\"${rl}\" not found !" && exit 1
fi
fi
cur=$(dirname "$(${rl} "${0}")")
#hash dotdrop >/dev/null 2>&1
#[ "$?" != "0" ] && echo "install dotdrop to run tests" && exit 1
#echo "called with ${1}"
# dotdrop path can be passed as an argument
ddpath="${cur}/../"
[ "${1}" != "" ] && ddpath="${1}"
[ ! -d ${ddpath} ] && echo "ddpath \"${ddpath}\" is not a directory" && exit 1
export PYTHONPATH="${ddpath}:${PYTHONPATH}"
bin="python3 -m dotdrop.dotdrop"
echo "dotdrop path: ${ddpath}"
echo "pythonpath: ${PYTHONPATH}"
# get the helpers
source ${cur}/helpers
echo -e "\e[96m\e[1m==> RUNNING $(basename $BASH_SOURCE) <==\e[0m"
################################################################
# this is the test
################################################################
# dotdrop directory
basedir=`mktemp -d --suffix='-dotdrop-tests'`
echo "[+] dotdrop dir: ${basedir}"
echo "[+] dotpath dir: ${basedir}/dotfiles"
# the dotfile to be imported
tmpd=`mktemp -d --suffix='-dotdrop-tests'`
# some files
mkdir -p ${tmpd}/{program,config,vscode}
touch ${tmpd}/program/a
touch ${tmpd}/config/a
touch ${tmpd}/vscode/extensions.txt
touch ${tmpd}/vscode/keybindings.json
# create the config file
cfg="${basedir}/config.yaml"
create_conf ${cfg} # sets token
# import
echo "[+] import"
cd ${ddpath} | ${bin} import -c ${cfg} ${tmpd}/program
cd ${ddpath} | ${bin} import -c ${cfg} ${tmpd}/config
cd ${ddpath} | ${bin} import -c ${cfg} ${tmpd}/vscode
# add files on filesystem
echo "[+] add files"
touch ${tmpd}/program/b
touch ${tmpd}/config/b
# expects diff
echo "[+] comparing normal - diffs expected"
set +e
cd ${ddpath} | ${bin} compare -c ${cfg} --verbose
ret="$?"
echo ${ret}
[ "${ret}" = "0" ] && exit 1
set -e
# expects no diff
patt="b"
echo "[+] comparing with ignore (pattern: ${patt}) - no diff expected"
set +e
cd ${ddpath} | ${bin} compare -c ${cfg} --verbose --ignore=${patt}
[ "$?" != "0" ] && exit 1
set -e
# adding ignore in dotfile
cfg2="${basedir}/config2.yaml"
sed '/d_config:/a \ \ \ \ cmpignore:\n\ \ \ \ - "b"' ${cfg} > ${cfg2}
#cat ${cfg2}
# expects one diff
echo "[+] comparing with ignore in dotfile - diff expected"
set +e
cd ${ddpath} | ${bin} compare -c ${cfg2} --verbose
[ "$?" = "0" ] && exit 1
set -e
# adding ignore in dotfile
cfg2="${basedir}/config2.yaml"
sed '/d_config:/a \ \ \ \ cmpignore:\n\ \ \ \ - "b"' ${cfg} > ${cfg2}
sed -i '/d_program:/a \ \ \ \ cmpignore:\n\ \ \ \ - "b"' ${cfg2}
#cat ${cfg2}
# expects no diff
echo "[+] comparing with ignore in dotfile - no diff expected"
set +e
cd ${ddpath} | ${bin} compare -c ${cfg2} --verbose
[ "$?" != "0" ] && exit 1
set -e
# update files
echo touched > ${tmpd}/vscode/extensions.txt
echo touched > ${tmpd}/vscode/keybindings.json
# expect two diffs
echo "[+] comparing - diff expected"
set +e
cd ${ddpath} | ${bin} compare -c ${cfg} --verbose -C ${tmpd}/vscode
[ "$?" = "0" ] && exit 1
set -e
# expects no diff
echo "[+] comparing with ignore in dotfile - no diff expected"
sed '/d_vscode:/a \ \ \ \ cmpignore:\n\ \ \ \ - "extensions.txt"\n\ \ \ \ - "keybindings.json"' ${cfg} > ${cfg2}
set +e
cd ${ddpath} | ${bin} compare -c ${cfg2} --verbose -C ${tmpd}/vscode
[ "$?" != "0" ] && exit 1
set -e
####################
# test for #149
####################
mkdir -p ${tmpd}/.zsh
touch ${tmpd}/.zsh/somefile
mkdir -p ${tmpd}/.zsh/plugins
touch ${tmpd}/.zsh/plugins/someplugin
echo "[+] import .zsh"
cd ${ddpath} | ${bin} import -c ${cfg} ${tmpd}/.zsh
# no diff expected
echo "[+] comparing .zsh"
cd ${ddpath} | ${bin} compare -c ${cfg} --verbose -C ${tmpd}/.zsh --ignore=${patt}
[ "$?" != "0" ] && exit 1
# add some files
touch ${tmpd}/.zsh/plugins/ignore-1.zsh
touch ${tmpd}/.zsh/plugins/ignore-2.zsh
# expects diff
echo "[+] comparing .zsh with new files"
set +e
cd ${ddpath} | ${bin} compare -c ${cfg} --verbose -C ${tmpd}/.zsh
ret="$?"
echo ${ret}
[ "${ret}" = "0" ] && exit 1
set -e
# expects no diff
patt="plugins/ignore-*.zsh"
echo "[+] comparing with ignore (pattern: ${patt}) - no diff expected"
set +e
cd ${ddpath} | ${bin} compare -c ${cfg} --verbose -C ${tmpd}/.zsh --ignore=${patt}
[ "$?" != "0" ] && exit 1
set -e
# expects no diff
echo "[+] comparing with ignore in dotfile - no diff expected"
sed '/d_zsh:/a \ \ \ \ cmpignore:\n\ \ \ \ - "plugins/ignore-*.zsh"' ${cfg} > ${cfg2}
set +e
cd ${ddpath} | ${bin} compare -c ${cfg2} --verbose -C ${tmpd}/.zsh
[ "$?" != "0" ] && exit 1
set -e
## CLEANING
rm -rf ${basedir} ${tmpd}
echo "OK"
exit 0

@@ -93,6 +93,7 @@ create_conf ${cfg} # sets token
echo "[+] import"
cd ${ddpath} | ${bin} import -c ${cfg} ${tmpd}/dir1
cd ${ddpath} | ${bin} import -c ${cfg} ${tmpd}/uniquefile
cat ${cfg}
# let's see the dotpath
#tree ${basedir}/dotfiles

@@ -79,9 +79,9 @@ echo "cfgpath: {{@@ _dotdrop_cfgpath @@}}" >> ${tmps}/dotfiles/abc
echo "workdir: {{@@ _dotdrop_workdir @@}}" >> ${tmps}/dotfiles/abc
# install
cd ${ddpath} | ${bin} install -f -c ${cfg} -p p1
cd ${ddpath} | ${bin} install -f -c ${cfg} -p p1 -V
#cat ${tmpd}/abc
cat ${tmpd}/abc
grep "^dotpath: ${tmps}/dotfiles$" ${tmpd}/abc >/dev/null
grep "^cfgpath: ${tmps}/config.yaml$" ${tmpd}/abc >/dev/null

tests-ng/dotfile-no-src.sh Executable file

@@ -0,0 +1,94 @@
#!/usr/bin/env bash
# author: deadc0de6 (https://github.com/deadc0de6)
# Copyright (c) 2019, deadc0de6
#
# test dotfiles with no 'src'
# returns 1 in case of error
#
# exit on first error
set -e
#set -v
# all this crap to get current path
rl="readlink -f"
if ! ${rl} "${0}" >/dev/null 2>&1; then
rl="realpath"
if ! hash ${rl}; then
echo "\"${rl}\" not found !" && exit 1
fi
fi
cur=$(dirname "$(${rl} "${0}")")
#hash dotdrop >/dev/null 2>&1
#[ "$?" != "0" ] && echo "install dotdrop to run tests" && exit 1
#echo "called with ${1}"
# dotdrop path can be passed as an argument
ddpath="${cur}/../"
[ "${1}" != "" ] && ddpath="${1}"
[ ! -d ${ddpath} ] && echo "ddpath \"${ddpath}\" is not a directory" && exit 1
export PYTHONPATH="${ddpath}:${PYTHONPATH}"
bin="python3 -m dotdrop.dotdrop"
echo "dotdrop path: ${ddpath}"
echo "pythonpath: ${PYTHONPATH}"
# get the helpers
source ${cur}/helpers
echo -e "\e[96m\e[1m==> RUNNING $(basename $BASH_SOURCE) <==\e[0m"
################################################################
# this is the test
################################################################
# the dotfile source
tmps=`mktemp -d --suffix='-dotdrop-tests'`
mkdir -p ${tmps}/dotfiles
echo "dotfiles source (dotpath): ${tmps}"
# the dotfile destination
tmpd=`mktemp -d --suffix='-dotdrop-tests'`
echo "dotfiles destination: ${tmpd}"
# create the config file
cfg="${tmps}/config.yaml"
cat > ${cfg} << _EOF
config:
backup: true
create: true
dotpath: dotfiles
dotfiles:
abc:
dst: ${tmpd}/abc
profiles:
p1:
dotfiles:
- ALL
_EOF
#cat ${cfg}
# create the dotfiles
echo "abc" > ${tmps}/dotfiles/abc
###########################
# test install and compare
###########################
# install
cd ${ddpath} | ${bin} install -f -c ${cfg} -p p1 -b -V
[ "$?" != "0" ] && exit 1
# checks
[ ! -e ${tmpd}/abc ] && exit 1
grep 'abc' ${tmpd}/abc
## CLEANING
rm -rf ${tmps} ${tmpd}
echo "OK"
exit 0

@@ -81,7 +81,7 @@ cd ${ddpath} | ${bin} install -f -c ${cfg} -p p1 -V
# checks
[ ! -e ${tmpd}/abc ] && echo 'dotfile not installed' && exit 1
#cat ${tmpd}/abc
cat ${tmpd}/abc
grep "src:${tmps}/dotfiles/abc" ${tmpd}/abc >/dev/null
grep "dst:${tmpd}/abc" ${tmpd}/abc >/dev/null
grep "key:f_abc" ${tmpd}/abc >/dev/null

@@ -96,7 +96,7 @@ _EOF
echo "test" > ${tmps}/dotfiles/abc
# install
cd ${ddpath} | ${bin} install -f -c ${cfg} -p p1
cd ${ddpath} | ${bin} install -f -c ${cfg} -p p1 -V
# checks
[ ! -e ${tmpa}/pre ] && exit 1

tests-ng/force-actions.sh Executable file

@@ -0,0 +1,123 @@
#!/usr/bin/env bash
# author: deadc0de6 (https://github.com/deadc0de6)
# Copyright (c) 2019, deadc0de6
#
# force actions
# returns 1 in case of error
#
# exit on first error
set -e
# all this crap to get current path
rl="readlink -f"
if ! ${rl} "${0}" >/dev/null 2>&1; then
rl="realpath"
if ! hash ${rl}; then
echo "\"${rl}\" not found !" && exit 1
fi
fi
cur=$(dirname "$(${rl} "${0}")")
#hash dotdrop >/dev/null 2>&1
#[ "$?" != "0" ] && echo "install dotdrop to run tests" && exit 1
#echo "called with ${1}"
# dotdrop path can be passed as an argument
ddpath="${cur}/../"
[ "${1}" != "" ] && ddpath="${1}"
[ ! -d ${ddpath} ] && echo "ddpath \"${ddpath}\" is not a directory" && exit 1
export PYTHONPATH="${ddpath}:${PYTHONPATH}"
bin="python3 -m dotdrop.dotdrop"
echo "dotdrop path: ${ddpath}"
echo "pythonpath: ${PYTHONPATH}"
# get the helpers
source ${cur}/helpers
echo -e "\e[96m\e[1m==> RUNNING $(basename $BASH_SOURCE) <==\e[0m"
################################################################
# this is the test
################################################################
# the action temp
tmpa=`mktemp -d --suffix='-dotdrop-tests'`
# the dotfile source
tmps=`mktemp -d --suffix='-dotdrop-tests'`
mkdir -p ${tmps}/dotfiles
# the dotfile destination
tmpd=`mktemp -d --suffix='-dotdrop-tests'`
# create the config file
cfg="${tmps}/config.yaml"
cat > ${cfg} << _EOF
actions:
pre:
preaction: echo 'pre' > ${tmpa}/pre
preaction2: echo 'pre2' > ${tmpa}/pre2
post:
postaction: echo 'post' > ${tmpa}/post
postaction2: echo 'post2' > ${tmpa}/post2
nakedaction: echo 'naked' > ${tmpa}/naked
config:
backup: true
create: true
dotpath: dotfiles
dotfiles:
f_abc:
dst: ${tmpd}/abc
src: abc
actions:
- preaction
- postaction
- nakedaction
- preaction2
- postaction2
profiles:
p1:
dotfiles:
- f_abc
_EOF
#cat ${cfg}
# create the dotfile
echo "test" > ${tmps}/dotfiles/abc
# deploy the dotfile
cp ${tmps}/dotfiles/abc ${tmpd}/abc
# install
cd ${ddpath} | ${bin} install -f -c ${cfg} -p p1 -V
# checks
[ -e ${tmpa}/pre ] && exit 1
[ -e ${tmpa}/post ] && exit 1
[ -e ${tmpa}/naked ] && exit 1
[ -e ${tmpa}/pre2 ] && exit 1
[ -e ${tmpa}/post2 ] && exit 1
# install and force
cd ${ddpath} | ${bin} install -f -a -c ${cfg} -p p1 -V
# checks
[ ! -e ${tmpa}/pre ] && exit 1
grep pre ${tmpa}/pre >/dev/null
[ ! -e ${tmpa}/post ] && exit 1
grep post ${tmpa}/post >/dev/null
[ ! -e ${tmpa}/naked ] && exit 1
grep naked ${tmpa}/naked >/dev/null
[ ! -e ${tmpa}/pre2 ] && exit 1
grep pre2 ${tmpa}/pre2 >/dev/null
[ ! -e ${tmpa}/post2 ] && exit 1
grep post2 ${tmpa}/post2 >/dev/null
## CLEANING
rm -rf ${tmps} ${tmpd} ${tmpa}
echo "OK"
exit 0

tests-ng/globs.sh Executable file

@@ -0,0 +1,113 @@
#!/usr/bin/env bash
# author: deadc0de6 (https://github.com/deadc0de6)
# Copyright (c) 2019, deadc0de6
#
# ensure imports allow globs
# - import_actions
# - import_configs
# - import_variables
# - profile import
#
# exit on first error
set -e
# all this crap to get current path
rl="readlink -f"
if ! ${rl} "${0}" >/dev/null 2>&1; then
rl="realpath"
if ! hash ${rl}; then
echo "\"${rl}\" not found !" && exit 1
fi
fi
cur=$(dirname "$(${rl} "${0}")")
#hash dotdrop >/dev/null 2>&1
#[ "$?" != "0" ] && echo "install dotdrop to run tests" && exit 1
#echo "called with ${1}"
# dotdrop path can be passed as an argument
ddpath="${cur}/../"
[ "${1}" != "" ] && ddpath="${1}"
[ ! -d ${ddpath} ] && echo "ddpath \"${ddpath}\" is not a directory" && exit 1
export PYTHONPATH="${ddpath}:${PYTHONPATH}"
bin="python3 -m dotdrop.dotdrop"
echo "dotdrop path: ${ddpath}"
echo "pythonpath: ${PYTHONPATH}"
# get the helpers
source ${cur}/helpers
echo -e "\e[96m\e[1m==> RUNNING $(basename $BASH_SOURCE) <==\e[0m"
################################################################
# this is the test
################################################################
# the dotfile source
tmps=`mktemp -d --suffix='-dotdrop-tests'`
mkdir -p ${tmps}/dotfiles
# the dotfile destination
tmpd=`mktemp -d --suffix='-dotdrop-tests'`
# temporary
tmpa=`mktemp -d --suffix='-dotdrop-tests'`
###########
# test globs in import_actions
###########
# create the action files
actionsd="${tmps}/actions"
mkdir -p ${actionsd}
cat > ${actionsd}/action1.yaml << _EOF
actions:
fromaction1: echo "fromaction1" > ${tmpa}/fromaction1
_EOF
cat > ${actionsd}/action2.yaml << _EOF
actions:
fromaction2: echo "fromaction2" > ${tmpa}/fromaction2
_EOF
cfg="${tmps}/config.yaml"
cat > ${cfg} << _EOF
config:
backup: true
create: true
dotpath: dotfiles
import_actions:
- ${actionsd}/*
dotfiles:
f_abc:
dst: ${tmpd}/abc
src: abc
actions:
- fromaction1
- fromaction2
profiles:
p1:
dotfiles:
- f_abc
_EOF
# create the source
mkdir -p ${tmps}/dotfiles/
echo "abc" > ${tmps}/dotfiles/abc
# install
cd ${ddpath} | ${bin} install -c ${cfg} -p p1 -V
# checks
[ ! -e ${tmpd}/abc ] && echo "dotfile not installed" && exit 1
[ ! -e ${tmpa}/fromaction1 ] && echo "action1 not executed" && exit 1
grep fromaction1 ${tmpa}/fromaction1
[ ! -e ${tmpa}/fromaction2 ] && echo "action2 not executed" && exit 1
grep fromaction2 ${tmpa}/fromaction2
## CLEANING
rm -rf ${tmps} ${tmpd} ${tmpa}
echo "OK"
exit 0

tests-ng/import-configs.sh Executable file

@@ -0,0 +1,130 @@
#!/usr/bin/env bash
# author: deadc0de6 (https://github.com/deadc0de6)
# Copyright (c) 2019, deadc0de6
#
# import config testing
#
# exit on first error
set -e
# all this crap to get current path
rl="readlink -f"
if ! ${rl} "${0}" >/dev/null 2>&1; then
rl="realpath"
if ! hash ${rl}; then
echo "\"${rl}\" not found !" && exit 1
fi
fi
cur=$(dirname "$(${rl} "${0}")")
#hash dotdrop >/dev/null 2>&1
#[ "$?" != "0" ] && echo "install dotdrop to run tests" && exit 1
#echo "called with ${1}"
# dotdrop path can be passed as an argument
ddpath="${cur}/../"
[ "${1}" != "" ] && ddpath="${1}"
[ ! -d ${ddpath} ] && echo "ddpath \"${ddpath}\" is not a directory" && exit 1
export PYTHONPATH="${ddpath}:${PYTHONPATH}"
bin="python3 -m dotdrop.dotdrop"
echo "dotdrop path: ${ddpath}"
echo "pythonpath: ${PYTHONPATH}"
# get the helpers
source ${cur}/helpers
echo -e "\e[96m\e[1m==> RUNNING $(basename $BASH_SOURCE) <==\e[0m"
################################################################
# this is the test
################################################################
# the dotfile source
tmps=`mktemp -d --suffix='-dotdrop-tests'`
mkdir -p ${tmps}/dotfiles
# the dotfile destination
tmpd=`mktemp -d --suffix='-dotdrop-tests'`
# create the config file
cfg1="${tmps}/config1.yaml"
cfg2="${tmps}/config2.yaml"
cat > ${cfg1} << _EOF
config:
backup: true
create: true
dotpath: dotfiles
import_configs:
- ${cfg2}
dotfiles:
f_abc:
dst: ${tmpd}/abc
src: abc
f_zzz:
dst: ${tmpd}/zzz
src: zzz
f_sub:
dst: ${tmpd}/sub
src: sub
profiles:
p0:
include:
- p2
p1:
dotfiles:
- f_abc
p3:
dotfiles:
- f_zzz
pup:
include:
- psubsub
_EOF
cat > ${cfg2} << _EOF
config:
backup: true
create: true
dotpath: dotfiles
dotfiles:
f_def:
dst: ${tmpd}/def
src: def
f_ghi:
dst: ${tmpd}/ghi
src: ghi
profiles:
p2:
dotfiles:
- f_def
psubsub:
dotfiles:
- f_sub
_EOF
# create the source
mkdir -p ${tmps}/dotfiles/
echo "abc" > ${tmps}/dotfiles/abc
echo "def" > ${tmps}/dotfiles/def
echo "ghi" > ${tmps}/dotfiles/ghi
echo "zzz" > ${tmps}/dotfiles/zzz
echo "sub" > ${tmps}/dotfiles/sub
# install
cd ${ddpath} | ${bin} listfiles -c ${cfg1} -p p0 -V | grep f_def
cd ${ddpath} | ${bin} listfiles -c ${cfg1} -p p1 -V | grep f_abc
cd ${ddpath} | ${bin} listfiles -c ${cfg1} -p p2 -V | grep f_def
cd ${ddpath} | ${bin} listfiles -c ${cfg1} -p p3 -V | grep f_zzz
cd ${ddpath} | ${bin} listfiles -c ${cfg1} -p pup -V | grep f_sub
cd ${ddpath} | ${bin} listfiles -c ${cfg1} -p psubsub -V | grep f_sub
## CLEANING
rm -rf ${tmps} ${tmpd}
echo "OK"
exit 0

View File

@@ -0,0 +1,127 @@
#!/usr/bin/env bash
# author: deadc0de6 (https://github.com/deadc0de6)
# Copyright (c) 2017, deadc0de6
#
# test the use of the keyword "import" in profiles
# returns 1 in case of error
#
# exit on first error
set -e
# all this crap to get current path
rl="readlink -f"
if ! ${rl} "${0}" >/dev/null 2>&1; then
rl="realpath"
if ! hash ${rl}; then
echo "\"${rl}\" not found !" && exit 1
fi
fi
cur=$(dirname "$(${rl} "${0}")")
#hash dotdrop >/dev/null 2>&1
#[ "$?" != "0" ] && echo "install dotdrop to run tests" && exit 1
#echo "called with ${1}"
# dotdrop path can be passed as argument
ddpath="${cur}/../"
[ "${1}" != "" ] && ddpath="${1}"
[ ! -d ${ddpath} ] && echo "ddpath \"${ddpath}\" is not a directory" && exit 1
export PYTHONPATH="${ddpath}:${PYTHONPATH}"
bin="python3 -m dotdrop.dotdrop"
echo "dotdrop path: ${ddpath}"
echo "pythonpath: ${PYTHONPATH}"
# get the helpers
source ${cur}/helpers
echo -e "\e[96m\e[1m==> RUNNING $(basename $BASH_SOURCE) <==\e[0m"
################################################################
# this is the test
################################################################
# the dotfile source
tmps=`mktemp -d --suffix='-dotdrop-tests'`
mkdir -p ${tmps}/dotfiles
# the dotfile destination
tmpd=`mktemp -d --suffix='-dotdrop-tests'`
extdotfiles="${tmps}/df_p1.yaml"
dynextdotfiles_name="d_uid_dynvar"
dynextdotfiles="${tmps}/ext_${dynextdotfiles_name}"
# create the config file
cfg="${tmps}/config.yaml"
cat > ${cfg} << _EOF
config:
backup: true
create: true
dotpath: dotfiles
dynvariables:
d_uid: "echo ${dynextdotfiles_name}"
dotfiles:
f_abc:
dst: ${tmpd}/abc
src: abc
f_def:
dst: ${tmpd}/def
src: def
f_xyz:
dst: ${tmpd}/xyz
src: xyz
f_dyn:
dst: ${tmpd}/dyn
src: dyn
profiles:
p1:
dotfiles:
- f_abc
import:
- $(basename ${extdotfiles})
- "ext_{{@@ d_uid @@}}"
_EOF
# create the external dotfile file
cat > ${extdotfiles} << _EOF
dotfiles:
- f_def
- f_xyz
_EOF
cat > ${dynextdotfiles} << _EOF
dotfiles:
- f_dyn
_EOF
# create the source
mkdir -p ${tmps}/dotfiles/
echo "abc" > ${tmps}/dotfiles/abc
echo "def" > ${tmps}/dotfiles/def
echo "xyz" > ${tmps}/dotfiles/xyz
echo "dyn" > ${tmps}/dotfiles/dyn
# install
cd ${ddpath} | ${bin} install -f -c ${cfg} -p p1 -V
# checks
[ ! -e ${tmpd}/abc ] && exit 1
[ ! -e ${tmpd}/def ] && exit 1
[ ! -e ${tmpd}/xyz ] && exit 1
[ ! -e ${tmpd}/dyn ] && exit 1
echo 'file found'
grep 'abc' ${tmpd}/abc >/dev/null 2>&1
grep 'def' ${tmpd}/def >/dev/null 2>&1
grep 'xyz' ${tmpd}/xyz >/dev/null 2>&1
grep 'dyn' ${tmpd}/dyn >/dev/null 2>&1
## CLEANING
rm -rf ${tmps} ${tmpd}
echo "OK"
exit 0

View File

@@ -1,9 +1,8 @@
#!/usr/bin/env bash
# author: deadc0de6 (https://github.com/deadc0de6)
# Copyright (c) 2017, deadc0de6
# Copyright (c) 2019, deadc0de6
#
# test the use of the keyword "import" in profiles
# returns 1 in case of error
# test basic import
#
# exit on first error
@@ -50,10 +49,13 @@ tmps=`mktemp -d --suffix='-dotdrop-tests'`
mkdir -p ${tmps}/dotfiles
# the dotfile destination
tmpd=`mktemp -d --suffix='-dotdrop-tests'`
extdotfiles="${tmps}/df_p1.yaml"
#echo "dotfile destination: ${tmpd}"
dynextdotfiles_name="d_uid_dynvar"
dynextdotfiles="${tmps}/ext_${dynextdotfiles_name}"
# create the dotfile
mkdir -p ${tmpd}/adir
echo "adir/file1" > ${tmpd}/adir/file1
echo "adir/file2" > ${tmpd}/adir/file2
echo "file3" > ${tmpd}/file3
# create the config file
cfg="${tmps}/config.yaml"
@@ -63,61 +65,30 @@ config:
backup: true
create: true
dotpath: dotfiles
dynvariables:
d_uid: "echo ${dynextdotfiles_name}"
dotfiles:
f_abc:
dst: ${tmpd}/abc
src: abc
f_def:
dst: ${tmpd}/def
src: def
f_xyz:
dst: ${tmpd}/xyz
src: xyz
f_dyn:
dst: ${tmpd}/dyn
src: dyn
profiles:
p1:
dotfiles:
- f_abc
import:
- $(basename ${extdotfiles})
- "ext_{{@@ d_uid @@}}"
_EOF
#cat ${cfg}
# create the external dotfile file
cat > ${extdotfiles} << _EOF
dotfiles:
- f_def
- f_xyz
_EOF
# import
cd ${ddpath} | ${bin} import -c ${cfg} -p p1 -V ${tmpd}/adir
cd ${ddpath} | ${bin} import -c ${cfg} -p p1 -V ${tmpd}/file3
cat > ${dynextdotfiles} << _EOF
dotfiles:
- f_dyn
_EOF
cat ${cfg}
# create the source
mkdir -p ${tmps}/dotfiles/
echo "abc" > ${tmps}/dotfiles/abc
echo "def" > ${tmps}/dotfiles/def
echo "xyz" > ${tmps}/dotfiles/xyz
echo "dyn" > ${tmps}/dotfiles/dyn
# ensure exists and is not link
[ ! -d ${tmps}/dotfiles/${tmpd}/adir ] && echo "not a directory" && exit 1
[ ! -e ${tmps}/dotfiles/${tmpd}/adir/file1 ] && echo "does not exist" && exit 1
[ ! -e ${tmps}/dotfiles/${tmpd}/adir/file2 ] && echo "does not exist" && exit 1
[ ! -e ${tmps}/dotfiles/${tmpd}/file3 ] && echo "does not exist" && exit 1
# install
cd ${ddpath} | ${bin} install -f -c ${cfg} -p p1 -V
cat ${cfg} | grep ${tmpd}/adir >/dev/null 2>&1
cat ${cfg} | grep ${tmpd}/file3 >/dev/null 2>&1
# checks
[ ! -e ${tmpd}/abc ] && exit 1
[ ! -e ${tmpd}/def ] && exit 1
[ ! -e ${tmpd}/xyz ] && exit 1
[ ! -e ${tmpd}/dyn ] && exit 1
grep 'abc' ${tmpd}/abc >/dev/null 2>&1
grep 'def' ${tmpd}/def >/dev/null 2>&1
grep 'xyz' ${tmpd}/xyz >/dev/null 2>&1
grep 'dyn' ${tmpd}/dyn >/dev/null 2>&1
nb=`cat ${cfg} | grep d_adir | wc -l`
[ "${nb}" != "2" ] && echo 'bad config1' && exit 1
nb=`cat ${cfg} | grep f_file3 | wc -l`
[ "${nb}" != "2" ] && echo 'bad config2' && exit 1
## CLEANING
rm -rf ${tmps} ${tmpd}

tests-ng/include-actions.sh Executable file
View File

@@ -0,0 +1,183 @@
#!/usr/bin/env bash
# author: deadc0de6 (https://github.com/deadc0de6)
# Copyright (c) 2019, deadc0de6
#
# test the use of the keyword "include"
# with action inheritance
# returns 1 in case of error
#
# exit on first error
set -e
# all this crap to get current path
rl="readlink -f"
if ! ${rl} "${0}" >/dev/null 2>&1; then
rl="realpath"
if ! hash ${rl}; then
echo "\"${rl}\" not found !" && exit 1
fi
fi
cur=$(dirname "$(${rl} "${0}")")
#hash dotdrop >/dev/null 2>&1
#[ "$?" != "0" ] && echo "install dotdrop to run tests" && exit 1
#echo "called with ${1}"
# dotdrop path can be passed as argument
ddpath="${cur}/../"
[ "${1}" != "" ] && ddpath="${1}"
[ ! -d ${ddpath} ] && echo "ddpath \"${ddpath}\" is not a directory" && exit 1
export PYTHONPATH="${ddpath}:${PYTHONPATH}"
bin="python3 -m dotdrop.dotdrop"
echo "dotdrop path: ${ddpath}"
echo "pythonpath: ${PYTHONPATH}"
# get the helpers
source ${cur}/helpers
echo -e "\e[96m\e[1m==> RUNNING $(basename $BASH_SOURCE) <==\e[0m"
################################################################
# this is the test
################################################################
# the dotfile source
tmps=`mktemp -d --suffix='-dotdrop-tests'`
mkdir -p ${tmps}/dotfiles
# the dotfile destination
tmpd=`mktemp -d --suffix='-dotdrop-tests'`
# the action temp
tmpa=`mktemp -d --suffix='-dotdrop-tests'`
# create the config file
cfg="${tmps}/config.yaml"
cat > ${cfg} << _EOF
actions:
pre:
preaction: echo 'pre' >> ${tmpa}/pre
preaction2: echo 'pre2' >> ${tmpa}/pre2
post:
postaction: echo 'post' >> ${tmpa}/post
postaction2: echo 'post2' >> ${tmpa}/post2
nakedaction: echo 'naked' >> ${tmpa}/naked
config:
backup: true
create: true
dotpath: dotfiles
dotfiles:
f_abc:
dst: ${tmpd}/abc
src: abc
profiles:
p0:
include:
- p3
p1:
dotfiles:
- f_abc
actions:
- preaction
- postaction
p2:
include:
- p1
actions:
- preaction2
- postaction2
p3:
include:
- p2
actions:
- nakedaction
_EOF
# create the source
mkdir -p ${tmps}/dotfiles/
echo "test" > ${tmps}/dotfiles/abc
# install
echo "PROFILE p2"
cd ${ddpath} | ${bin} install -f -c ${cfg} -p p2 -V
# checks
[ ! -e ${tmpa}/pre ] && echo "pre not found" && exit 1
nb=`wc -l ${tmpa}/pre | awk '{print $1}'`
[ "${nb}" != "1" ] && echo "pre executed multiple times" && exit 1
[ ! -e ${tmpa}/pre2 ] && echo "pre2 not found" && exit 1
nb=`wc -l ${tmpa}/pre2 | awk '{print $1}'`
[ "${nb}" != "1" ] && echo "pre2 executed multiple times" && exit 1
[ ! -e ${tmpa}/post ] && echo "post not found" && exit 1
nb=`wc -l ${tmpa}/post | awk '{print $1}'`
[ "${nb}" != "1" ] && echo "post executed multiple times" && exit 1
[ ! -e ${tmpa}/post2 ] && echo "post2 not found" && exit 1
nb=`wc -l ${tmpa}/post2 | awk '{print $1}'`
[ "${nb}" != "1" ] && echo "post2 executed multiple times" && exit 1
# install
rm -f ${tmpa}/pre ${tmpa}/pre2 ${tmpa}/post ${tmpa}/post2 ${tmpa}/naked
rm -f ${tmpd}/abc
echo "PROFILE p3"
cd ${ddpath} | ${bin} install -f -c ${cfg} -p p3 -V
# checks
[ ! -e ${tmpa}/pre ] && echo "pre not found" && exit 1
nb=`wc -l ${tmpa}/pre | awk '{print $1}'`
[ "${nb}" != "1" ] && echo "pre executed multiple times" && exit 1
[ ! -e ${tmpa}/pre2 ] && echo "pre2 not found" && exit 1
nb=`wc -l ${tmpa}/pre2 | awk '{print $1}'`
[ "${nb}" != "1" ] && echo "pre2 executed multiple times" && exit 1
[ ! -e ${tmpa}/post ] && echo "post not found" && exit 1
nb=`wc -l ${tmpa}/post | awk '{print $1}'`
[ "${nb}" != "1" ] && echo "post executed multiple times" && exit 1
[ ! -e ${tmpa}/post2 ] && echo "post2 not found" && exit 1
nb=`wc -l ${tmpa}/post2 | awk '{print $1}'`
[ "${nb}" != "1" ] && echo "post2 executed multiple times" && exit 1
[ ! -e ${tmpa}/naked ] && echo "naked not found" && exit 1
nb=`wc -l ${tmpa}/naked | awk '{print $1}'`
[ "${nb}" != "1" ] && echo "naked executed multiple times" && exit 1
# install
rm -f ${tmpa}/pre ${tmpa}/pre2 ${tmpa}/post ${tmpa}/post2 ${tmpa}/naked
rm -f ${tmpd}/abc
echo "PROFILE p0"
cd ${ddpath} | ${bin} install -f -c ${cfg} -p p0 -V
# checks
[ ! -e ${tmpa}/pre ] && echo "pre not found" && exit 1
nb=`wc -l ${tmpa}/pre | awk '{print $1}'`
[ "${nb}" != "1" ] && echo "pre executed multiple times" && exit 1
[ ! -e ${tmpa}/pre2 ] && echo "pre2 not found" && exit 1
nb=`wc -l ${tmpa}/pre2 | awk '{print $1}'`
[ "${nb}" != "1" ] && echo "pre2 executed multiple times" && exit 1
[ ! -e ${tmpa}/post ] && echo "post not found" && exit 1
nb=`wc -l ${tmpa}/post | awk '{print $1}'`
[ "${nb}" != "1" ] && echo "post executed multiple times" && exit 1
[ ! -e ${tmpa}/post2 ] && echo "post2 not found" && exit 1
nb=`wc -l ${tmpa}/post2 | awk '{print $1}'`
[ "${nb}" != "1" ] && echo "post2 executed multiple times" && exit 1
[ ! -e ${tmpa}/naked ] && echo "naked not found" && exit 1
nb=`wc -l ${tmpa}/naked | awk '{print $1}'`
[ "${nb}" != "1" ] && echo "naked executed multiple times" && exit 1
## CLEANING
rm -rf ${tmps} ${tmpd} ${tmpa}
echo "OK"
exit 0

tests-ng/include-order.sh Executable file
View File

@@ -0,0 +1,142 @@
#!/usr/bin/env bash
# author: deadc0de6 (https://github.com/deadc0de6)
# Copyright (c) 2019, deadc0de6
#
# test the use of the keyword "include"
# that has to be ordered
# returns 1 in case of error
#
# exit on first error
set -e
# all this crap to get current path
rl="readlink -f"
if ! ${rl} "${0}" >/dev/null 2>&1; then
rl="realpath"
if ! hash ${rl}; then
echo "\"${rl}\" not found !" && exit 1
fi
fi
cur=$(dirname "$(${rl} "${0}")")
#hash dotdrop >/dev/null 2>&1
#[ "$?" != "0" ] && echo "install dotdrop to run tests" && exit 1
#echo "called with ${1}"
# dotdrop path can be passed as argument
ddpath="${cur}/../"
[ "${1}" != "" ] && ddpath="${1}"
[ ! -d ${ddpath} ] && echo "ddpath \"${ddpath}\" is not a directory" && exit 1
export PYTHONPATH="${ddpath}:${PYTHONPATH}"
bin="python3 -m dotdrop.dotdrop"
echo "dotdrop path: ${ddpath}"
echo "pythonpath: ${PYTHONPATH}"
# get the helpers
source ${cur}/helpers
echo -e "\e[96m\e[1m==> RUNNING $(basename $BASH_SOURCE) <==\e[0m"
################################################################
# this is the test
################################################################
# the dotfile source
tmps=`mktemp -d --suffix='-dotdrop-tests'`
mkdir -p ${tmps}/dotfiles
# the dotfile destination
tmpd=`mktemp -d --suffix='-dotdrop-tests'`
# temporary
tmpa=`mktemp -d --suffix='-dotdrop-tests'`
# create the config file
cfg="${tmps}/config.yaml"
cat > ${cfg} << _EOF
config:
backup: true
create: true
dotpath: dotfiles
actions:
pre:
first: 'echo first > ${tmpa}/cookie'
second: 'echo second >> ${tmpa}/cookie'
third: 'echo third >> ${tmpa}/cookie'
dotfiles:
f_first:
dst: ${tmpd}/first
src: first
actions:
- first
f_second:
dst: ${tmpd}/second
src: second
actions:
- second
f_third:
dst: ${tmpd}/third
src: third
actions:
- third
profiles:
p0:
dotfiles:
- f_first
include:
- second
- third
second:
dotfiles:
- f_second
third:
dotfiles:
- f_third
_EOF
# create the source
mkdir -p ${tmps}/dotfiles/
echo "first" > ${tmps}/dotfiles/first
echo "second" > ${tmps}/dotfiles/second
echo "third" > ${tmps}/dotfiles/third
attempts="3"
for ((i=0;i<${attempts};i++)); do
# install
cd ${ddpath} | ${bin} install -f -c ${cfg} -p p0 -V
# checks timestamp
echo "first timestamp: `stat -c %y ${tmpd}/first`"
echo "second timestamp: `stat -c %y ${tmpd}/second`"
echo "third timestamp: `stat -c %y ${tmpd}/third`"
# use epoch seconds plus nanoseconds so the comparison survives minute boundaries
ts_first=`date "+%s%N" -d "$(stat -c %y ${tmpd}/first)"`
ts_second=`date "+%s%N" -d "$(stat -c %y ${tmpd}/second)"`
ts_third=`date "+%s%N" -d "$(stat -c %y ${tmpd}/third)"`
#echo "first ts: ${ts_first}"
#echo "second ts: ${ts_second}"
#echo "third ts: ${ts_third}"
[ "${ts_first}" -ge "${ts_second}" ] && echo "second created before first" && exit 1
[ "${ts_second}" -ge "${ts_third}" ] && echo "third created before second" && exit 1
# check cookie
cat ${tmpa}/cookie
content=`cat ${tmpa}/cookie | xargs`
[ "${content}" != "first second third" ] && echo "bad cookie" && exit 1
# clean
rm ${tmpa}/cookie
rm ${tmpd}/first ${tmpd}/second ${tmpd}/third
done
## CLEANING
rm -rf ${tmps} ${tmpd} ${tmpa}
echo "OK"
exit 0

View File

@@ -64,12 +64,18 @@ dotfiles:
dst: ${tmpd}/abc
src: abc
profiles:
p0:
include:
- p3
p1:
dotfiles:
- f_abc
p2:
include:
- p1
p3:
include:
- p2
_EOF
# create the source
@@ -82,6 +88,14 @@ cd ${ddpath} | ${bin} install -f -c ${cfg} -p p1
# compare
cd ${ddpath} | ${bin} compare -c ${cfg} -p p1
cd ${ddpath} | ${bin} compare -c ${cfg} -p p2
cd ${ddpath} | ${bin} compare -c ${cfg} -p p3
cd ${ddpath} | ${bin} compare -c ${cfg} -p p0
# list
cd ${ddpath} | ${bin} listfiles -c ${cfg} -p p1 | grep f_abc
cd ${ddpath} | ${bin} listfiles -c ${cfg} -p p2 | grep f_abc
cd ${ddpath} | ${bin} listfiles -c ${cfg} -p p3 | grep f_abc
cd ${ddpath} | ${bin} listfiles -c ${cfg} -p p0 | grep f_abc
# count
cnt=`cd ${ddpath} | ${bin} listfiles -c ${cfg} -p p1 -b | grep '^f_' | wc -l`

tests-ng/profile-actions.sh Executable file
View File

@@ -0,0 +1,134 @@
#!/usr/bin/env bash
# author: deadc0de6 (https://github.com/deadc0de6)
# Copyright (c) 2019, deadc0de6
#
# test actions per profile
# returns 1 in case of error
#
# exit on first error
set -e
# all this crap to get current path
rl="readlink -f"
if ! ${rl} "${0}" >/dev/null 2>&1; then
rl="realpath"
if ! hash ${rl}; then
echo "\"${rl}\" not found !" && exit 1
fi
fi
cur=$(dirname "$(${rl} "${0}")")
#hash dotdrop >/dev/null 2>&1
#[ "$?" != "0" ] && echo "install dotdrop to run tests" && exit 1
#echo "called with ${1}"
# dotdrop path can be passed as argument
ddpath="${cur}/../"
[ "${1}" != "" ] && ddpath="${1}"
[ ! -d ${ddpath} ] && echo "ddpath \"${ddpath}\" is not a directory" && exit 1
export PYTHONPATH="${ddpath}:${PYTHONPATH}"
bin="python3 -m dotdrop.dotdrop"
echo "dotdrop path: ${ddpath}"
echo "pythonpath: ${PYTHONPATH}"
# get the helpers
source ${cur}/helpers
echo -e "\e[96m\e[1m==> RUNNING $(basename $BASH_SOURCE) <==\e[0m"
################################################################
# this is the test
################################################################
# the dotfile source
tmps=`mktemp -d --suffix='-dotdrop-tests'`
mkdir -p ${tmps}/dotfiles
# the dotfile destination
tmpd=`mktemp -d --suffix='-dotdrop-tests'`
#echo "dotfile destination: ${tmpd}"
# the action temp
tmpa=`mktemp -d --suffix='-dotdrop-tests'`
# create the config file
cfg="${tmps}/config.yaml"
cat > ${cfg} << _EOF
config:
backup: true
create: true
dotpath: dotfiles
actions:
pre:
preaction: echo 'pre' >> ${tmpa}/pre
preaction2: echo 'pre2' >> ${tmpa}/pre2
post:
postaction: echo 'post' >> ${tmpa}/post
postaction2: echo 'post2' >> ${tmpa}/post2
nakedaction: echo 'naked' >> ${tmpa}/naked
dotfiles:
f_abc:
dst: ${tmpd}/abc
src: abc
f_def:
dst: ${tmpd}/def
src: def
f_ghi:
dst: ${tmpd}/ghi
src: ghi
profiles:
p0:
actions:
- preaction2
- postaction2
- nakedaction
dotfiles:
- f_abc
- f_def
- f_ghi
_EOF
#cat ${cfg}
# create the dotfile
echo "test" > ${tmps}/dotfiles/abc
echo "test" > ${tmps}/dotfiles/def
echo "test" > ${tmps}/dotfiles/ghi
# install
cd ${ddpath} | ${bin} install -f -c ${cfg} -p p0 -V
# check actions executed
[ ! -e ${tmpa}/pre2 ] && echo 'action not executed' && exit 1
[ ! -e ${tmpa}/post2 ] && echo 'action not executed' && exit 1
[ ! -e ${tmpa}/naked ] && echo 'action not executed' && exit 1
grep pre2 ${tmpa}/pre2
nb=`wc -l ${tmpa}/pre2 | awk '{print $1}'`
[ "${nb}" != "1" ] && echo "profile action executed multiple times" && exit 1
grep post2 ${tmpa}/post2
nb=`wc -l ${tmpa}/post2 | awk '{print $1}'`
[ "${nb}" != "1" ] && echo "profile action executed multiple times" && exit 1
grep naked ${tmpa}/naked
nb=`wc -l ${tmpa}/naked | awk '{print $1}'`
[ "${nb}" != "1" ] && echo "profile action executed multiple times" && exit 1
# install again
cd ${ddpath} | ${bin} install -f -c ${cfg} -p p0 -V
# check actions not executed twice
nb=`wc -l ${tmpa}/post2 | awk '{print $1}'`
[ "${nb}" -gt "1" ] && echo "action post2 executed twice" && exit 1
nb=`wc -l ${tmpa}/naked | awk '{print $1}'`
[ "${nb}" -gt "1" ] && echo "action naked executed twice" && exit 1
## CLEANING
rm -rf ${tmps} ${tmpd} ${tmpa}
echo "OK"
exit 0

tests-ng/remove.sh Executable file
View File

@@ -0,0 +1,179 @@
#!/usr/bin/env bash
# author: deadc0de6 (https://github.com/deadc0de6)
# Copyright (c) 2019, deadc0de6
#
# test remove
# returns 1 in case of error
#
# exit on first error
set -e
# all this crap to get current path
rl="readlink -f"
if ! ${rl} "${0}" >/dev/null 2>&1; then
rl="realpath"
if ! hash ${rl}; then
echo "\"${rl}\" not found !" && exit 1
fi
fi
cur=$(dirname "$(${rl} "${0}")")
#hash dotdrop >/dev/null 2>&1
#[ "$?" != "0" ] && echo "install dotdrop to run tests" && exit 1
#echo "called with ${1}"
# dotdrop path can be passed as argument
ddpath="${cur}/../"
[ "${1}" != "" ] && ddpath="${1}"
[ ! -d ${ddpath} ] && echo "ddpath \"${ddpath}\" is not a directory" && exit 1
export PYTHONPATH="${ddpath}:${PYTHONPATH}"
bin="python3 -m dotdrop.dotdrop"
echo "dotdrop path: ${ddpath}"
echo "pythonpath: ${PYTHONPATH}"
# get the helpers
source ${cur}/helpers
echo -e "\e[96m\e[1m==> RUNNING $(basename $BASH_SOURCE) <==\e[0m"
################################################################
# this is the test
################################################################
# dotdrop directory
tmps=`mktemp -d --suffix='-dotdrop-tests'`
mkdir -p ${tmps}/dotfiles
# the dotfile to be imported
tmpd=`mktemp -d --suffix='-dotdrop-tests'`
# create the config file
cfg="${tmps}/config.yaml"
cat > ${cfg} << _EOF
config:
backup: true
create: true
dotpath: dotfiles
dotfiles:
f_abc:
dst: ${tmpd}/abc
src: abc
f_def:
dst: ${tmpd}/def
src: def
f_last:
dst: ${tmpd}/last
src: last
profiles:
p1:
dotfiles:
- f_abc
- f_def
p2:
dotfiles:
- f_def
last:
dotfiles:
- f_last
_EOF
cfgbak="${tmps}/config.yaml.bak"
cp ${cfg} ${cfgbak}
# create the dotfile
echo "abc" > ${tmps}/dotfiles/abc
echo "abc" > ${tmpd}/abc
echo "def" > ${tmps}/dotfiles/def
echo "def" > ${tmpd}/def
# remove with bad profile
cd ${ddpath} | ${bin} remove -f -k -p empty -c ${cfg} f_abc -V
[ ! -e ${tmps}/dotfiles/abc ] && echo "dotfile in dotpath deleted" && exit 1
[ ! -e ${tmpd}/abc ] && echo "source dotfile deleted" && exit 1
[ ! -e ${tmps}/dotfiles/def ] && echo "dotfile in dotpath deleted" && exit 1
[ ! -e ${tmpd}/def ] && echo "source dotfile deleted" && exit 1
# ensure config not altered
diff ${cfg} ${cfgbak}
# remove by key
echo "[+] remove f_abc by key"
cd ${ddpath} | ${bin} remove -p p1 -f -k -c ${cfg} f_abc -V
cat ${cfg}
echo "[+] remove f_def by key"
cd ${ddpath} | ${bin} remove -p p2 -f -k -c ${cfg} f_def -V
cat ${cfg}
# checks
[ -e ${tmps}/dotfiles/abc ] && echo "dotfile in dotpath not deleted" && exit 1
[ ! -e ${tmpd}/abc ] && echo "source dotfile deleted" && exit 1
[ -e ${tmps}/dotfiles/def ] && echo "dotfile in dotpath not deleted" && exit 1
[ ! -e ${tmpd}/def ] && echo "source dotfile deleted" && exit 1
echo "[+] ========="
# create the config file
cfg="${tmps}/config.yaml"
cat > ${cfg} << _EOF
config:
backup: true
create: true
dotpath: dotfiles
dotfiles:
f_abc:
dst: ${tmpd}/abc
src: abc
f_def:
dst: ${tmpd}/def
src: def
f_last:
dst: ${tmpd}/last
src: last
profiles:
p1:
dotfiles:
- f_abc
- f_def
p2:
dotfiles:
- f_def
last:
dotfiles:
- f_last
_EOF
cat ${cfg}
# create the dotfile
echo "abc" > ${tmps}/dotfiles/abc
echo "abc" > ${tmpd}/abc
echo "def" > ${tmps}/dotfiles/def
echo "def" > ${tmpd}/def
# remove by key
echo "[+] remove f_abc by path"
cd ${ddpath} | ${bin} remove -p p1 -f -c ${cfg} ${tmpd}/abc -V
cat ${cfg}
echo "[+] remove f_def by path"
cd ${ddpath} | ${bin} remove -p p2 -f -c ${cfg} ${tmpd}/def -V
cat ${cfg}
# checks
[ -e ${tmps}/dotfiles/abc ] && echo "(2) dotfile in dotpath not deleted" && exit 1
[ ! -e ${tmpd}/abc ] && echo "(2) source dotfile deleted" && exit 1
[ -e ${tmps}/dotfiles/def ] && echo "(2) dotfile in dotpath not deleted" && exit 1
[ ! -e ${tmpd}/def ] && echo "(2) source dotfile deleted" && exit 1
cat ${cfg}
## CLEANING
rm -rf ${tmps} ${tmpd}
echo "OK"
exit 0

tests-ng/symlink.sh Executable file
View File

@@ -0,0 +1,221 @@
#!/usr/bin/env bash
# author: deadc0de6 (https://github.com/deadc0de6)
# Copyright (c) 2019, deadc0de6
#
# test symlinking dotfiles
#
# exit on first error
set -e
# all this crap to get current path
rl="readlink -f"
if ! ${rl} "${0}" >/dev/null 2>&1; then
rl="realpath"
if ! hash ${rl}; then
echo "\"${rl}\" not found !" && exit 1
fi
fi
cur=$(dirname "$(${rl} "${0}")")
#hash dotdrop >/dev/null 2>&1
#[ "$?" != "0" ] && echo "install dotdrop to run tests" && exit 1
#echo "called with ${1}"
# dotdrop path can be passed as argument
ddpath="${cur}/../"
[ "${1}" != "" ] && ddpath="${1}"
[ ! -d ${ddpath} ] && echo "ddpath \"${ddpath}\" is not a directory" && exit 1
export PYTHONPATH="${ddpath}:${PYTHONPATH}"
bin="python3 -m dotdrop.dotdrop"
echo "dotdrop path: ${ddpath}"
echo "pythonpath: ${PYTHONPATH}"
# get the helpers
source ${cur}/helpers
echo -e "\e[96m\e[1m==> RUNNING $(basename $BASH_SOURCE) <==\e[0m"
################################################################
# this is the test
################################################################
# the dotfile source
tmps=`mktemp -d --suffix='-dotdrop-tests'`
mkdir -p ${tmps}/dotfiles
# the dotfile destination
tmpd=`mktemp -d --suffix='-dotdrop-tests'`
#echo "dotfile destination: ${tmpd}"
##################################################
# test symlink directory
##################################################
# create the dotfile
mkdir -p ${tmps}/dotfiles/abc
echo "file1" > ${tmps}/dotfiles/abc/file1
echo "file2" > ${tmps}/dotfiles/abc/file2
# create the config file
cfg="${tmps}/config.yaml"
cat > ${cfg} << _EOF
config:
backup: true
create: true
dotpath: dotfiles
link_dotfile_default: nolink
dotfiles:
d_abc:
dst: ${tmpd}/abc
src: abc
link: link
profiles:
p1:
dotfiles:
- d_abc
_EOF
#cat ${cfg}
# install
cd ${ddpath} | ${bin} install -f -c ${cfg} -p p1 -V
#cat ${cfg}
# ensure exists and is link
[ ! -h ${tmpd}/abc ] && echo "not a symlink" && exit 1
[ ! -e ${tmpd}/abc/file1 ] && echo "does not exist" && exit 1
[ ! -e ${tmpd}/abc/file2 ] && echo "does not exist" && exit 1
##################################################
# test symlink files
##################################################
# clean
rm -rf ${tmps}/dotfiles ${tmpd}/abc
# create the dotfiles
mkdir -p ${tmps}/dotfiles/
echo "abc" > ${tmps}/dotfiles/abc
# create the config file
cfg="${tmps}/config.yaml"
cat > ${cfg} << _EOF
config:
backup: true
create: true
dotpath: dotfiles
link_dotfile_default: nolink
dotfiles:
f_abc:
dst: ${tmpd}/abc
src: abc
link: link
profiles:
p1:
dotfiles:
- f_abc
_EOF
#cat ${cfg}
# install
cd ${ddpath} | ${bin} install -f -c ${cfg} -p p1 -V
#cat ${cfg}
# ensure exists and is link
[ ! -h ${tmpd}/abc ] && echo "not a symlink" && exit 1
##################################################
# test link_children
##################################################
# clean
rm -rf ${tmps}/dotfiles ${tmpd}/abc
# create the dotfile
mkdir -p ${tmps}/dotfiles/abc
echo "file1" > ${tmps}/dotfiles/abc/file1
echo "file2" > ${tmps}/dotfiles/abc/file2
# create the config file
cfg="${tmps}/config.yaml"
cat > ${cfg} << _EOF
config:
backup: true
create: true
dotpath: dotfiles
link_dotfile_default: nolink
dotfiles:
d_abc:
dst: ${tmpd}/abc
src: abc
link: link_children
profiles:
p1:
dotfiles:
- d_abc
_EOF
#cat ${cfg}
# install
cd ${ddpath} | ${bin} install -f -c ${cfg} -p p1 -V
#cat ${cfg}
# ensure exists and is link
[ ! -d ${tmpd}/abc ] && echo "not a directory" && exit 1
[ ! -h ${tmpd}/abc/file1 ] && echo "not a symlink" && exit 1
[ ! -h ${tmpd}/abc/file2 ] && echo "not a symlink" && exit 1
##################################################
# test link_children with templates
##################################################
# clean
rm -rf ${tmps}/dotfiles ${tmpd}/abc
# create the dotfile
mkdir -p ${tmps}/dotfiles/abc
echo "{{@@ profile @@}}" > ${tmps}/dotfiles/abc/file1
echo "file2" > ${tmps}/dotfiles/abc/file2
# create the config file
cfg="${tmps}/config.yaml"
cat > ${cfg} << _EOF
config:
backup: true
create: true
dotpath: dotfiles
link_dotfile_default: nolink
dotfiles:
d_abc:
dst: ${tmpd}/abc
src: abc
link: link_children
profiles:
p1:
dotfiles:
- d_abc
_EOF
#cat ${cfg}
# install
cd ${ddpath} | ${bin} install -f -c ${cfg} -p p1 -V
#cat ${cfg}
# ensure exists and is link
[ ! -d ${tmpd}/abc ] && echo "not a directory" && exit 1
[ ! -h ${tmpd}/abc/file1 ] && echo "not a symlink" && exit 1
[ ! -h ${tmpd}/abc/file2 ] && echo "not a symlink" && exit 1
grep '^p1$' ${tmpd}/abc/file1
## CLEANING
rm -rf ${tmps} ${tmpd}
echo "OK"
exit 0

View File

@@ -164,6 +164,13 @@ set -e
# test update
###########################
# update single file
echo 'update' > ${tmpd}/def
cd ${ddpath} | ${bin} update -f -k -c ${cfg} -p p1 -b -V f_def
[ "$?" != "0" ] && exit 1
[ ! -e ${tmpd}/def ] && echo 'dotfile in FS removed' && exit 1
[ ! -e ${tmps}/dotfiles/def ] && echo 'dotfile in dotpath removed' && exit 1
# update single file
cd ${ddpath} | ${bin} update -f -k -c ${cfg} -p p1 -b -V f_abc
[ "$?" != "0" ] && exit 1

View File

@@ -0,0 +1,107 @@
#!/usr/bin/env bash
# author: deadc0de6 (https://github.com/deadc0de6)
# Copyright (c) 2019, deadc0de6
#
# test ignore update relative pattern
# returns 1 in case of error
#
# exit on first error
#set -e
# all this crap to get current path
rl="readlink -f"
if ! ${rl} "${0}" >/dev/null 2>&1; then
rl="realpath"
if ! hash ${rl}; then
echo "\"${rl}\" not found !" && exit 1
fi
fi
cur=$(dirname "$(${rl} "${0}")")
#hash dotdrop >/dev/null 2>&1
#[ "$?" != "0" ] && echo "install dotdrop to run tests" && exit 1
#echo "called with ${1}"
# dotdrop path can be passed as argument
ddpath="${cur}/../"
[ "${1}" != "" ] && ddpath="${1}"
[ ! -d ${ddpath} ] && echo "ddpath \"${ddpath}\" is not a directory" && exit 1
export PYTHONPATH="${ddpath}:${PYTHONPATH}"
bin="python3 -m dotdrop.dotdrop"
echo "dotdrop path: ${ddpath}"
echo "pythonpath: ${PYTHONPATH}"
# get the helpers
source ${cur}/helpers
echo -e "\e[96m\e[1m==> RUNNING $(basename $BASH_SOURCE) <==\e[0m"
################################################################
# this is the test
################################################################
# dotdrop directory
tmps=`mktemp -d --suffix='-dotdrop-tests'`
dt="${tmps}/dotfiles"
mkdir -p ${dt}
mkdir -p ${dt}/a/{b,c}
echo 'a' > ${dt}/a/b/abfile
echo 'a' > ${dt}/a/c/acfile
# fs dotfiles
tmpd=`mktemp -d --suffix='-dotdrop-tests'`
cp -r ${dt}/a ${tmpd}/
# create the config file
cfg="${tmps}/config.yaml"
cat > ${cfg} << _EOF
config:
backup: false
create: true
dotpath: dotfiles
dotfiles:
f_abc:
dst: ${tmpd}/a
src: a
upignore:
- "cfile"
- "newfile"
- "newdir"
profiles:
p1:
dotfiles:
- f_abc
_EOF
#cat ${cfg}
#tree ${dt}
# edit/add files
echo "[+] edit/add files"
touch ${tmpd}/a/newfile
echo 'b' > ${tmpd}/a/c/acfile
mkdir -p ${tmpd}/a/newdir/b
touch ${tmpd}/a/newdir/b/c
#tree ${tmpd}/a
# update
echo "[+] update"
cd ${ddpath} | ${bin} update -f -c ${cfg} --verbose --profile=p1 --key f_abc
#tree ${dt}
# check files haven't been updated
grep 'b' ${dt}/a/c/acfile >/dev/null
[ -e ${dt}/a/newfile ] && exit 1
## CLEANING
rm -rf ${tmps} ${tmpd}
echo "OK"
exit 0

View File

@@ -1,5 +1,5 @@
pycodestyle; python_version >= '3.0'
nose; python_version >= '3.0'
coverage; python_version >= '3.0'
coveralls; python_version >= '3.0'
pyflakes; python_version >= '3.0'
pycodestyle; python_version > '3.4'
nose; python_version > '3.4'
coverage; python_version > '3.4'
coveralls; python_version > '3.4'
pyflakes; python_version > '3.4'

View File

@@ -8,7 +8,7 @@ set -ev
# PEP8 tests
which pycodestyle 2>/dev/null
[ "$?" != "0" ] && echo "Install pycodestyle" && exit 1
pycodestyle --ignore=W605 dotdrop/
pycodestyle --ignore=W503,W504,W605 dotdrop/
pycodestyle tests/
pycodestyle scripts/
@@ -35,7 +35,18 @@ PYTHONPATH=dotdrop ${nosebin} -s --with-coverage --cover-package=dotdrop
## execute bash script tests
[ "$1" = '--python-only' ] || {
for scr in tests-ng/*.sh; do
${scr}
done
log=`mktemp`
for scr in tests-ng/*.sh; do
${scr} 2>&1 | tee ${log}
set +e
if grep Traceback ${log}; then
echo "crash found in logs"
rm -f ${log}
exit 1
fi
set -e
done
rm -f ${log}
}
echo "All test finished successfully"

View File

@@ -11,9 +11,9 @@ import string
import tempfile
from unittest import TestCase
import yaml
from ruamel.yaml import YAML as yaml
from dotdrop.options import Options, ENV_NODEBUG
from dotdrop.options import Options
from dotdrop.linktypes import LinkTypes
from dotdrop.utils import strip_home
@@ -127,6 +127,7 @@ def _fake_args():
args['--key'] = False
args['--ignore'] = []
args['--show-patch'] = False
args['--force-actions'] = False
# cmds
args['list'] = False
args['listfiles'] = False
@@ -135,6 +136,7 @@ def _fake_args():
args['import'] = False
args['update'] = False
args['detail'] = False
args['remove'] = False
return args
@@ -144,6 +146,7 @@ def load_options(confpath, profile):
args = _fake_args()
args['--cfg'] = confpath
args['--profile'] = profile
args['--verbose'] = True
# and get the options
o = Options(args=args)
o.profile = profile
@@ -153,8 +156,6 @@ def load_options(confpath, profile):
o.import_link = LinkTypes.NOLINK
o.install_showdiff = True
o.debug = True
if ENV_NODEBUG in os.environ:
o.debug = False
o.compare_dopts = ''
o.variables = {}
return o
@@ -171,8 +172,14 @@ def get_dotfile_from_yaml(dic, path):
"""Return the dotfile from the yaml dictionary"""
# path is not the file in dotpath but on the FS
dotfiles = dic['dotfiles']
src = get_path_strip_version(path)
return [d for d in dotfiles.values() if d['src'] == src][0]
# src = get_path_strip_version(path)
home = os.path.expanduser('~')
if path.startswith(home):
path = path.replace(home, '~')
dotfile = [d for d in dotfiles.values() if d['dst'] == path]
if dotfile:
return dotfile[0]
return None
def yaml_dashed_list(items, indent=0):
@@ -214,9 +221,8 @@ def create_yaml_keyval(pairs, parent_dir=None, top_key=None):
if not parent_dir:
parent_dir = get_tempdir()
fd, file_name = tempfile.mkstemp(dir=parent_dir, suffix='.yaml', text=True)
with os.fdopen(fd, 'w') as f:
yaml.safe_dump(pairs, f)
_, file_name = tempfile.mkstemp(dir=parent_dir, suffix='.yaml', text=True)
yaml_dump(pairs, file_name)
return file_name
@@ -227,21 +233,18 @@ def populate_fake_config(config, dotfiles={}, profiles={}, actions={},
is_path = isinstance(config, str)
if is_path:
config_path = config
with open(config_path) as config_file:
config = yaml.safe_load(config_file)
config = yaml_load(config_path)
config['dotfiles'] = dotfiles
config['profiles'] = profiles
config['actions'] = actions
config['trans'] = trans
config['trans_read'] = trans
config['trans_write'] = trans_write
config['variables'] = variables
config['dynvariables'] = dynvariables
if is_path:
with open(config_path, 'w') as config_file:
yaml.safe_dump(config, config_file, default_flow_style=False,
indent=2)
yaml_dump(config, config_path)
def file_in_yaml(yaml_file, path, link=False):
@@ -249,17 +252,36 @@ def file_in_yaml(yaml_file, path, link=False):
strip = get_path_strip_version(path)
if isinstance(yaml_file, str):
with open(yaml_file) as f:
yaml_conf = yaml.safe_load(f)
yaml_conf = yaml_load(yaml_file)
else:
yaml_conf = yaml_file
dotfiles = yaml_conf['dotfiles'].values()
in_src = strip in (x['src'] for x in dotfiles)
in_src = any([x['src'].endswith(strip) for x in dotfiles])
in_dst = path in (os.path.expanduser(x['dst']) for x in dotfiles)
if link:
has_link = get_dotfile_from_yaml(yaml_conf, path)['link']
df = get_dotfile_from_yaml(yaml_conf, path)
has_link = False
if df:
has_link = 'link' in df
else:
return False
return in_src and in_dst and has_link
return in_src and in_dst
def yaml_load(path):
with open(path, 'r') as f:
content = yaml(typ='safe').load(f)
return content
def yaml_dump(content, path):
with open(path, 'w') as f:
y = yaml()
y.default_flow_style = False
y.indent = 2
y.typ = 'safe'
y.dump(content, f)
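The helpers above also reference a `yaml_dashed_list(items, indent=0)` function whose body falls outside this hunk. As a rough illustration only, a minimal stdlib sketch of what such a helper could look like, assuming it simply renders items as an indented YAML sequence (the real implementation in `tests/helpers.py` may differ):

```python
# Hypothetical reconstruction: only the signature of yaml_dashed_list
# appears in this diff; the body below is an assumption for illustration.
def yaml_dashed_list(items, indent=0):
    """Render items as a YAML dashed list, indented by `indent` spaces."""
    pad = ' ' * indent
    return '\n'.join('{}- {}'.format(pad, item) for item in items)

print(yaml_dashed_list(['f_vimrc', 'f_xinitrc'], indent=2))
```

With `indent=2` this yields two lines, `  - f_vimrc` and `  - f_xinitrc`, i.e. the shape the fake-config writer emits for profile dotfiles.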


@@ -29,7 +29,7 @@ class TestCompare(unittest.TestCase):
def compare(self, o, tmp, nbdotfiles):
dotfiles = o.dotfiles
self.assertTrue(len(dotfiles) == nbdotfiles)
t = Templategen(base=o.dotpath, debug=o.debug)
t = Templategen(base=o.dotpath, debug=True)
inst = Installer(create=o.create, backup=o.backup,
dry=o.dry, base=o.dotpath, debug=o.debug)
comp = Comparator()
@@ -109,6 +109,7 @@ class TestCompare(unittest.TestCase):
self.assertTrue(os.path.exists(confpath))
o = load_options(confpath, profile)
o.longkey = True
o.debug = True
dfiles = [d1, d2, d3, d4, d5, d9]
# import the files

View File

@@ -7,7 +7,6 @@ basic unittest for the import function
import unittest
import os
import yaml
from dotdrop.dotdrop import cmd_importer
from dotdrop.dotdrop import cmd_list_profiles
@@ -18,7 +17,8 @@ from dotdrop.linktypes import LinkTypes
from tests.helpers import (clean, create_dir, create_fake_config,
create_random_file, edit_content, file_in_yaml,
get_path_strip_version, get_string, get_tempdir,
load_options, populate_fake_config)
load_options, populate_fake_config,
yaml_load)
class TestImport(unittest.TestCase):
@@ -31,10 +31,7 @@ class TestImport(unittest.TestCase):
def load_yaml(self, path):
"""Load yaml to dict"""
self.assertTrue(os.path.exists(path))
content = ''
with open(path, 'r') as f:
content = yaml.load(f)
return content
return yaml_load(path)
def assert_file(self, path, o, profile):
"""Make sure path has been inserted in conf for profile"""
@@ -45,7 +42,7 @@ class TestImport(unittest.TestCase):
def assert_in_yaml(self, path, dic, link=False):
"""Make sure "path" is in the "dic" representing the yaml file"""
self.assertTrue(file_in_yaml(dic, path, link))
self.assertTrue(file_in_yaml(dic, path, link=link))
def test_import(self):
"""Test the import function"""
@@ -117,7 +114,7 @@ class TestImport(unittest.TestCase):
o = load_options(confpath, profile)
# test dotfiles in config class
self.assertTrue(profile in o.profiles)
self.assertTrue(profile in [p.key for p in o.profiles])
self.assert_file(dotfile1, o, profile)
self.assert_file(dotfile2, o, profile)
self.assert_file(dotfile3, o, profile)
@@ -194,6 +191,7 @@ class TestImport(unittest.TestCase):
edit_content(dotfile1, editcontent)
o.safe = False
o.update_path = [dotfile1]
o.debug = True
cmd_update(o)
c2 = open(indt1, 'r').read()
self.assertTrue(editcontent == c2)
@@ -218,9 +216,10 @@ class TestImport(unittest.TestCase):
self.assertTrue(os.path.exists(dotdrop_home))
self.addCleanup(clean, dotdrop_home)
dotpath_ed = 'imported'
imported = {
'config': {
'dotpath': 'imported',
'dotpath': dotpath_ed,
},
'dotfiles': {},
'profiles': {
@@ -250,9 +249,10 @@ class TestImport(unittest.TestCase):
'dv_log_ed': 'echo 5',
},
}
dotpath_ing = 'importing'
importing = {
'config': {
'dotpath': 'importing',
'dotpath': dotpath_ing,
},
'dotfiles': {},
'profiles': {
@@ -293,7 +293,7 @@ class TestImport(unittest.TestCase):
# create the importing base config file
importing_path = create_fake_config(dotdrop_home,
configname='config.yaml',
import_configs=('config-*.yaml',),
import_configs=['config-2.yaml'],
**importing['config'])
# edit the imported config
@@ -326,8 +326,10 @@ class TestImport(unittest.TestCase):
y = self.load_yaml(imported_path)
# testing dotfiles
self.assertTrue(all(file_in_yaml(y, df) for df in dotfiles_ed))
self.assertFalse(any(file_in_yaml(y, df) for df in dotfiles_ing))
self.assertTrue(all(file_in_yaml(y, df)
for df in dotfiles_ed))
self.assertFalse(any(file_in_yaml(y, df)
for df in dotfiles_ing))
# testing profiles
profiles = y['profiles'].keys()
@@ -347,7 +349,7 @@ class TestImport(unittest.TestCase):
self.assertFalse(any(a.endswith('ing') for a in actions))
# testing transformations
transformations = y['trans'].keys()
transformations = y['trans_read'].keys()
self.assertTrue(all(t.endswith('ed') for t in transformations))
self.assertFalse(any(t.endswith('ing') for t in transformations))
transformations = y['trans_write'].keys()
@@ -355,7 +357,7 @@ class TestImport(unittest.TestCase):
self.assertFalse(any(t.endswith('ing') for t in transformations))
# testing variables
variables = y['variables'].keys()
variables = self._remove_priv_vars(y['variables'].keys())
self.assertTrue(all(v.endswith('ed') for v in variables))
self.assertFalse(any(v.endswith('ing') for v in variables))
dyn_variables = y['dynvariables'].keys()
@@ -366,8 +368,10 @@ class TestImport(unittest.TestCase):
y = self.load_yaml(importing_path)
# testing dotfiles
self.assertTrue(all(file_in_yaml(y, df) for df in dotfiles_ing))
self.assertFalse(any(file_in_yaml(y, df) for df in dotfiles_ed))
self.assertTrue(all(file_in_yaml(y, df)
for df in dotfiles_ing))
self.assertFalse(any(file_in_yaml(y, df)
for df in dotfiles_ed))
# testing profiles
profiles = y['profiles'].keys()
@@ -387,7 +391,7 @@ class TestImport(unittest.TestCase):
self.assertFalse(any(action.endswith('ed') for action in actions))
# testing transformations
transformations = y['trans'].keys()
transformations = y['trans_read'].keys()
self.assertTrue(all(t.endswith('ing') for t in transformations))
self.assertFalse(any(t.endswith('ed') for t in transformations))
transformations = y['trans_write'].keys()
@@ -395,13 +399,19 @@ class TestImport(unittest.TestCase):
self.assertFalse(any(t.endswith('ed') for t in transformations))
# testing variables
variables = y['variables'].keys()
variables = self._remove_priv_vars(y['variables'].keys())
self.assertTrue(all(v.endswith('ing') for v in variables))
self.assertFalse(any(v.endswith('ed') for v in variables))
dyn_variables = y['dynvariables'].keys()
self.assertTrue(all(dv.endswith('ing') for dv in dyn_variables))
self.assertFalse(any(dv.endswith('ed') for dv in dyn_variables))
def _remove_priv_vars(self, variables_keys):
variables = [v for v in variables_keys if not v.startswith('_')]
if 'profile' in variables:
variables.remove('profile')
return variables
def main():
unittest.main()


@@ -9,7 +9,7 @@ import unittest
from unittest.mock import MagicMock, patch
import filecmp
from dotdrop.config import Cfg
from dotdrop.cfg_aggregator import CfgAggregator as Cfg
from tests.helpers import (clean, create_dir, create_fake_config,
create_random_file, get_string, get_tempdir,
load_options, populate_fake_config)
@@ -47,8 +47,8 @@ exec bspwm
for action in actions:
f.write(' {}: {}\n'.format(action.key, action.action))
f.write('trans:\n')
for action in trans:
f.write(' {}: {}\n'.format(action.key, action.action))
for tr in trans:
f.write(' {}: {}\n'.format(tr.key, tr.action))
f.write('config:\n')
f.write(' backup: true\n')
f.write(' create: true\n')
@@ -64,7 +64,8 @@ exec bspwm
for action in d.actions:
f.write(' - {}\n'.format(action.key))
if d.trans_r:
f.write(' trans: {}\n'.format(d.trans_r.key))
for tr in d.trans_r:
f.write(' trans_read: {}\n'.format(tr.key))
f.write('profiles:\n')
f.write(' {}:\n'.format(profile))
f.write(' dotfiles:\n')
@@ -89,7 +90,7 @@ exec bspwm
f1, c1 = create_random_file(tmp)
dst1 = os.path.join(dst, get_string(6))
d1 = Dotfile(get_string(5), dst1, os.path.basename(f1))
# fake a print
# fake a __str__
self.assertTrue(str(d1) != '')
f2, c2 = create_random_file(tmp)
dst2 = os.path.join(dst, get_string(6))
@@ -165,7 +166,7 @@ exec bspwm
tr = Action('testtrans', 'post', cmd)
f9, c9 = create_random_file(tmp, content=trans1)
dst9 = os.path.join(dst, get_string(6))
d9 = Dotfile(get_string(6), dst9, os.path.basename(f9), trans_r=tr)
d9 = Dotfile(get_string(6), dst9, os.path.basename(f9), trans_r=[tr])
# to test template
f10, _ = create_random_file(tmp, content='{{@@ header() @@}}')
@@ -178,7 +179,7 @@ exec bspwm
dotfiles = [d1, d2, d3, d4, d5, d6, d7, d8, d9, d10, ddot]
self.fake_config(confpath, dotfiles,
profile, tmp, [act1], [tr])
conf = Cfg(confpath)
conf = Cfg(confpath, profile, debug=True)
self.assertTrue(conf is not None)
# install them
@@ -305,7 +306,7 @@ exec bspwm
# create the importing base config file
importing_path = create_fake_config(tmp,
configname='config.yaml',
import_configs=('config-*.yaml',),
import_configs=['config-2.yaml'],
**importing['config'])
# edit the imported config
@@ -485,7 +486,6 @@ exec bspwm
# ensure dst is link
self.assertTrue(os.path.islink(dst))
# ensure dst not directly linked to src
# TODO: maybe check that its actually linked to template folder
self.assertNotEqual(os.path.realpath(dst), src)
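The symlink assertion above can be reproduced standalone: `os.path.islink` detects the link and `os.path.realpath` resolves its target. This is a self-contained sketch of the same check, not dotdrop code:

```python
import os
import tempfile

# Build a file and a symlink to it in a scratch directory.
tmp = tempfile.mkdtemp()
src = os.path.join(tmp, 'src')
open(src, 'w').close()
dst = os.path.join(tmp, 'dst')
os.symlink(src, dst)

# dst is a link, and realpath resolves it back to src.
assert os.path.islink(dst)
assert os.path.realpath(dst) == os.path.realpath(src)
```

The test asserts the opposite direction (`assertNotEqual`) because with templated dotfiles the link should point at an installed copy, not directly at the source in the dotpath.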

tests/test_remove.py Normal file

@@ -0,0 +1,131 @@
"""
author: deadc0de6 (https://github.com/deadc0de6)
Copyright (c) 2019, deadc0de6
basic unittest for the remove function
"""
import unittest
import os
# local imports
from dotdrop.dotdrop import cmd_remove
from tests.helpers import (clean, create_dir,
create_random_file, load_options,
get_tempdir, yaml_load, yaml_dump)
class TestRemove(unittest.TestCase):
def load_yaml(self, path):
"""Load yaml to dict"""
self.assertTrue(os.path.exists(path))
return yaml_load(path)
def test_remove(self):
"""test the remove command"""
# dotfiles in dotpath
dotdrop_home = get_tempdir()
self.assertTrue(os.path.exists(dotdrop_home))
self.addCleanup(clean, dotdrop_home)
dotfilespath = os.path.join(dotdrop_home, 'dotfiles')
confpath = os.path.join(dotdrop_home, 'config.yaml')
create_dir(dotfilespath)
df1, _ = create_random_file(dotfilespath)
df2, _ = create_random_file(dotfilespath)
df3, _ = create_random_file(dotfilespath)
configdic = {
'config': {
'dotpath': 'dotfiles',
},
'dotfiles': {
'f_test1': {
'src': df1,
'dst': '/dev/null'
},
'f_test2': {
'src': df2,
'dst': '/dev/null'
},
'f_test3': {
'src': df3,
'dst': '/tmp/some-fake-path'
},
},
'profiles': {
'host1': {
'dotfiles': ['f_test1', 'f_test2', 'f_test3'],
},
'host2': {
'dotfiles': ['f_test1'],
},
'host3': {
'dotfiles': ['f_test2'],
},
},
}
yaml_dump(configdic, confpath)
o = load_options(confpath, 'host1')
o.remove_path = ['f_test1']
o.remove_iskey = True
o.debug = True
o.safe = False
# by key
cmd_remove(o)
# ensure file is deleted
self.assertFalse(os.path.exists(df1))
self.assertTrue(os.path.exists(df2))
self.assertTrue(os.path.exists(df3))
# load dict
y = yaml_load(confpath)
# ensure not present
self.assertTrue('f_test1' not in y['dotfiles'])
self.assertTrue('f_test1' not in y['profiles']['host1']['dotfiles'])
self.assertTrue('host2' not in y['profiles'])
# assert rest is intact
self.assertTrue('f_test2' in y['dotfiles'].keys())
self.assertTrue('f_test3' in y['dotfiles'].keys())
self.assertTrue('f_test2' in y['profiles']['host1']['dotfiles'])
self.assertTrue('f_test3' in y['profiles']['host1']['dotfiles'])
self.assertTrue(y['profiles']['host3']['dotfiles'] == ['f_test2'])
o = load_options(confpath, 'host1')
o.remove_path = ['/tmp/some-fake-path']
o.remove_iskey = False
o.debug = True
o.safe = False
# by path
cmd_remove(o)
# ensure file is deleted
self.assertTrue(os.path.exists(df2))
self.assertFalse(os.path.exists(df3))
# load dict
y = yaml_load(confpath)
# ensure not present
self.assertTrue('f_test3' not in y['dotfiles'])
self.assertTrue('f_test3' not in y['profiles']['host1']['dotfiles'])
# assert rest is intact
self.assertTrue('host1' in y['profiles'].keys())
self.assertFalse('host2' in y['profiles'].keys())
self.assertTrue('host3' in y['profiles'].keys())
self.assertTrue(y['profiles']['host1']['dotfiles'] == ['f_test2'])
self.assertTrue(y['profiles']['host3']['dotfiles'] == ['f_test2'])
def main():
unittest.main()
if __name__ == '__main__':
main()
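The config bookkeeping this test asserts on can be sketched with plain dict operations. `drop_dotfile` below is a hypothetical helper for illustration, not dotdrop's implementation (`cmd_remove` does considerably more, e.g. deleting the file from the dotpath and honoring `safe`):

```python
def drop_dotfile(config, key):
    """Remove a dotfile entry and every profile reference to it.
    Profiles left with no dotfiles disappear, matching what the
    test above expects for host2."""
    config['dotfiles'].pop(key, None)
    for name in list(config['profiles']):
        profile = config['profiles'][name]
        profile['dotfiles'] = [d for d in profile['dotfiles'] if d != key]
        if not profile['dotfiles']:
            del config['profiles'][name]
    return config

cfg = {
    'dotfiles': {'f_test1': {}, 'f_test2': {}},
    'profiles': {
        'host1': {'dotfiles': ['f_test1', 'f_test2']},
        'host2': {'dotfiles': ['f_test1']},
    },
}
drop_dotfile(cfg, 'f_test1')
```

After the call, `f_test1` is gone from `dotfiles`, `host1` references only `f_test2`, and `host2` has been dropped entirely, which is exactly the post-condition the assertions in `test_remove` check against the reloaded yaml.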


@@ -105,6 +105,7 @@ class TestUpdate(unittest.TestCase):
o = load_options(confpath, profile)
o.safe = False
o.update_showpatch = True
o.debug = True
trans = Transform('trans', 'cp -r {0} {1}')
d3tb = os.path.basename(d3t)
for dotfile in o.dotfiles:


@@ -8,14 +8,13 @@ basic unittest for the config parser
import unittest
from unittest.mock import patch
import os
import yaml
from dotdrop.config import Cfg
from dotdrop.cfg_yaml import CfgYaml as Cfg
from dotdrop.options import Options
from dotdrop.linktypes import LinkTypes
from tests.helpers import (SubsetTestCase, _fake_args, clean,
create_fake_config, create_yaml_keyval, get_tempdir,
populate_fake_config)
populate_fake_config, yaml_load, yaml_dump)
class TestConfig(SubsetTestCase):
@@ -38,17 +37,16 @@ class TestConfig(SubsetTestCase):
dotpath=self.CONFIG_DOTPATH,
backup=self.CONFIG_BACKUP,
create=self.CONFIG_CREATE)
conf = Cfg(confpath)
conf = Cfg(confpath, debug=True)
self.assertTrue(conf is not None)
opts = conf.get_settings()
opts = conf.settings
self.assertTrue(opts is not None)
self.assertTrue(opts != {})
self.assertTrue(opts['backup'] == self.CONFIG_BACKUP)
self.assertTrue(opts['create'] == self.CONFIG_CREATE)
dotpath = os.path.join(tmp, self.CONFIG_DOTPATH)
self.assertTrue(opts['dotpath'] == dotpath)
self.assertTrue(conf._is_valid())
dpath = os.path.basename(opts['dotpath'])
self.assertTrue(dpath == self.CONFIG_DOTPATH)
self.assertTrue(conf.dump() != '')
def test_def_link(self):
@@ -68,8 +66,8 @@ class TestConfig(SubsetTestCase):
'link_children')
self._test_link_import_fail('whatever')
@patch('dotdrop.config.open', create=True)
@patch('dotdrop.config.os.path.exists', create=True)
@patch('dotdrop.cfg_yaml.open', create=True)
@patch('dotdrop.cfg_yaml.os.path.exists', create=True)
def _test_link_import(self, cfgstring, expected,
cliargs, mock_exists, mock_open):
data = '''
@@ -95,12 +93,13 @@ profiles:
args['--profile'] = 'p1'
args['--cfg'] = 'mocked'
args['--link'] = cliargs
args['--verbose'] = True
o = Options(args=args)
self.assertTrue(o.import_link == expected)
@patch('dotdrop.config.open', create=True)
@patch('dotdrop.config.os.path.exists', create=True)
@patch('dotdrop.cfg_yaml.open', create=True)
@patch('dotdrop.cfg_yaml.os.path.exists', create=True)
def _test_link_import_fail(self, value, mock_exists, mock_open):
data = '''
config:
@@ -125,7 +124,7 @@ profiles:
args['--profile'] = 'p1'
args['--cfg'] = 'mocked'
with self.assertRaisesRegex(ValueError, 'config is not valid'):
with self.assertRaises(ValueError):
o = Options(args=args)
print(o.import_link)
@@ -142,8 +141,7 @@ profiles:
create=self.CONFIG_CREATE)
# edit the config
with open(confpath, 'r') as f:
content = yaml.load(f)
content = yaml_load(confpath)
# adding dotfiles
df1key = 'f_vimrc'
@@ -162,43 +160,38 @@ profiles:
}
# save the new config
with open(confpath, 'w') as f:
yaml.safe_dump(content, f, default_flow_style=False,
indent=2)
yaml_dump(content, confpath)
# do the tests
conf = Cfg(confpath)
conf = Cfg(confpath, debug=True)
self.assertTrue(conf is not None)
# test profile
profiles = conf.get_profiles()
profiles = conf.profiles
self.assertTrue(pf1key in profiles)
self.assertTrue(pf2key in profiles)
# test dotfiles
dotfiles = conf._get_dotfiles(pf1key)
self.assertTrue(df1key in [x.key for x in dotfiles])
self.assertTrue(df2key in [x.key for x in dotfiles])
dotfiles = conf._get_dotfiles(pf2key)
self.assertTrue(df1key in [x.key for x in dotfiles])
self.assertFalse(df2key in [x.key for x in dotfiles])
dotfiles = conf.profiles[pf1key]['dotfiles']
self.assertTrue(df1key in dotfiles)
self.assertTrue(df2key in dotfiles)
dotfiles = conf.profiles[pf2key]['dotfiles']
self.assertTrue(df1key in dotfiles)
self.assertFalse(df2key in dotfiles)
# test not existing included profile
# edit the config
with open(confpath, 'r') as f:
content = yaml.load(f)
content = yaml_load(confpath)
content['profiles'] = {
pf1key: {'dotfiles': [df2key], 'include': ['host2']},
pf2key: {'dotfiles': [df1key], 'include': ['host3']}
}
# save the new config
with open(confpath, 'w') as f:
yaml.safe_dump(content, f, default_flow_style=False,
indent=2)
yaml_dump(content, confpath)
# do the tests
conf = Cfg(confpath)
conf = Cfg(confpath, debug=True)
self.assertTrue(conf is not None)
def test_import_configs_merge(self):
@@ -227,22 +220,26 @@ profiles:
vars_ing_file = create_yaml_keyval(vars_ing, tmp)
actions_ed = {
'pre': {
'a_pre_action_ed': 'echo pre 22',
},
'post': {
'a_post_action_ed': 'echo post 22',
},
'a_action_ed': 'echo 22',
'actions': {
'pre': {
'a_pre_action_ed': 'echo pre 22',
},
'post': {
'a_post_action_ed': 'echo post 22',
},
'a_action_ed': 'echo 22',
}
}
actions_ing = {
'pre': {
'a_pre_action_ing': 'echo pre aa',
},
'post': {
'a_post_action_ing': 'echo post aa',
},
'a_action_ing': 'echo aa',
'actions': {
'pre': {
'a_pre_action_ing': 'echo pre aa',
},
'post': {
'a_post_action_ing': 'echo post aa',
},
'a_action_ing': 'echo aa',
}
}
actions_ed_file = create_yaml_keyval(actions_ed, tmp)
actions_ing_file = create_yaml_keyval(actions_ing, tmp)
@@ -328,7 +325,9 @@ profiles:
# create the importing base config file
importing_path = create_fake_config(tmp,
configname=self.CONFIG_NAME,
import_configs=('config-*.yaml',),
import_configs=[
self.CONFIG_NAME_2
],
**importing['config'])
# edit the imported config
@@ -346,23 +345,34 @@ profiles:
})
# do the tests
importing_cfg = Cfg(importing_path)
imported_cfg = Cfg(imported_path)
importing_cfg = Cfg(importing_path, debug=True)
imported_cfg = Cfg(imported_path, debug=True)
self.assertIsNotNone(importing_cfg)
self.assertIsNotNone(imported_cfg)
# test profiles
self.assertIsSubset(imported_cfg.lnk_profiles,
importing_cfg.lnk_profiles)
self.assertIsSubset(imported_cfg.profiles,
importing_cfg.profiles)
# test dotfiles
self.assertIsSubset(imported_cfg.dotfiles, importing_cfg.dotfiles)
# test actions
self.assertIsSubset(imported_cfg.actions['pre'],
importing_cfg.actions['pre'])
self.assertIsSubset(imported_cfg.actions['post'],
importing_cfg.actions['post'])
pre_ed = post_ed = pre_ing = post_ing = {}
for k, v in imported_cfg.actions.items():
kind, _ = v
if kind == 'pre':
pre_ed[k] = v
elif kind == 'post':
post_ed[k] = v
for k, v in importing_cfg.actions.items():
kind, _ = v
if kind == 'pre':
pre_ing[k] = v
elif kind == 'post':
post_ing[k] = v
self.assertIsSubset(pre_ed, pre_ing)
self.assertIsSubset(post_ed, post_ing)
# test transactions
self.assertIsSubset(imported_cfg.trans_r, importing_cfg.trans_r)
@@ -371,18 +381,18 @@ profiles:
# test variables
imported_vars = {
k: v
for k, v in imported_cfg.get_variables(None).items()
for k, v in imported_cfg.variables.items()
if not k.startswith('_')
}
importing_vars = {
k: v
for k, v in importing_cfg.get_variables(None).items()
for k, v in importing_cfg.variables.items()
if not k.startswith('_')
}
self.assertIsSubset(imported_vars, importing_vars)
# test prodots
self.assertIsSubset(imported_cfg.prodots, importing_cfg.prodots)
self.assertIsSubset(imported_cfg.profiles, importing_cfg.profiles)
def test_import_configs_override(self):
"""Test import_configs when some config keys overlap."""
@@ -410,22 +420,26 @@ profiles:
vars_ing_file = create_yaml_keyval(vars_ing, tmp)
actions_ed = {
'pre': {
'a_pre_action': 'echo pre 22',
},
'post': {
'a_post_action': 'echo post 22',
},
'a_action': 'echo 22',
'actions': {
'pre': {
'a_pre_action': 'echo pre 22',
},
'post': {
'a_post_action': 'echo post 22',
},
'a_action': 'echo 22',
}
}
actions_ing = {
'pre': {
'a_pre_action': 'echo pre aa',
},
'post': {
'a_post_action': 'echo post aa',
},
'a_action': 'echo aa',
'actions': {
'pre': {
'a_pre_action': 'echo pre aa',
},
'post': {
'a_post_action': 'echo post aa',
},
'a_action': 'echo aa',
}
}
actions_ed_file = create_yaml_keyval(actions_ed, tmp)
actions_ing_file = create_yaml_keyval(actions_ing, tmp)
@@ -536,14 +550,14 @@ profiles:
})
# do the tests
importing_cfg = Cfg(importing_path)
imported_cfg = Cfg(imported_path)
importing_cfg = Cfg(importing_path, debug=True)
imported_cfg = Cfg(imported_path, debug=True)
self.assertIsNotNone(importing_cfg)
self.assertIsNotNone(imported_cfg)
# test profiles
self.assertIsSubset(imported_cfg.lnk_profiles,
importing_cfg.lnk_profiles)
self.assertIsSubset(imported_cfg.profiles,
importing_cfg.profiles)
# test dotfiles
self.assertEqual(importing_cfg.dotfiles['f_vimrc'],
@@ -553,14 +567,9 @@ profiles:
# test actions
self.assertFalse(any(
(imported_cfg.actions['pre'][key]
== importing_cfg.actions['pre'][key])
for key in imported_cfg.actions['pre']
))
self.assertFalse(any(
(imported_cfg.actions['post'][key]
== importing_cfg.actions['post'][key])
for key in imported_cfg.actions['post']
(imported_cfg.actions[key]
== importing_cfg.actions[key])
for key in imported_cfg.actions
))
# test transactions
@@ -574,20 +583,20 @@ profiles:
))
# test variables
imported_vars = imported_cfg.get_variables(None)
imported_vars = imported_cfg.variables
self.assertFalse(any(
imported_vars[k] == v
for k, v in importing_cfg.get_variables(None).items()
for k, v in importing_cfg.variables.items()
if not k.startswith('_')
))
# test prodots
self.assertEqual(imported_cfg.prodots['host1'],
importing_cfg.prodots['host1'])
self.assertNotEqual(imported_cfg.prodots['host2'],
importing_cfg.prodots['host2'])
self.assertTrue(set(imported_cfg.prodots['host1'])
< set(importing_cfg.prodots['host2']))
# test profiles dotfiles
self.assertEqual(imported_cfg.profiles['host1']['dotfiles'],
importing_cfg.profiles['host1']['dotfiles'])
self.assertNotEqual(imported_cfg.profiles['host2']['dotfiles'],
importing_cfg.profiles['host2']['dotfiles'])
self.assertTrue(set(imported_cfg.profiles['host1']['dotfiles'])
< set(importing_cfg.profiles['host2']['dotfiles']))
def main():