
Merge branch 'master' into clear-on-install

deadc0de
2023-10-22 14:46:02 +02:00
committed by GitHub
34 changed files with 1311 additions and 328 deletions


@@ -4,7 +4,7 @@ The **config** entry (mandatory) contains global settings.
Entry | Description | Default
-------- | ------------- | ------------
`backup` | Create a backup of the dotfile in case it differs from the one that will be installed by dotdrop | true
`backup` | Create a backup of the existing destination; see [backup entry](config-config.md#backup-entry) | true
`banner` | Display the banner | true
`check_version` | Check if a new version of dotdrop is available on github | false
`chmod_on_import` | Always add a chmod entry on newly imported dotfiles (see `--preserve-mode`) | false
@@ -212,4 +212,16 @@ profiles:
hostname:
dotfiles:
- f_vimrc
```
```
## backup entry
When set to `true`, existing files that would be replaced
by a dotdrop `install` are backed up with the
extension `.dotdropbak` if their content differs.
Note:
* directories will **not** be backed up, only files
* when using a different `link` value than `nolink` with directories,
the files under the directory will **not** be backed up
(see [Symlinking dotfiles](config-file.md#symlinking-dotfiles)).
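A minimal sketch of a config with backups enabled (the dotfile and profile names are purely illustrative):
```yaml
config:
  backup: true
  create: true
  dotpath: dotfiles
dotfiles:
  f_vimrc:
    src: vimrc
    dst: ~/.vimrc
profiles:
  hostname:
    dotfiles:
    - f_vimrc
```
With such a config, an existing `~/.vimrc` whose content differs from the dotfile being installed is backed up to `~/.vimrc.dotdropbak` before being overwritten.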


@@ -14,11 +14,13 @@ Entry | Description
`ignoreempty` | If true, an empty template will not be deployed (defaults to the value of `ignoreempty`)
`instignore` | List of patterns to ignore when installing (enclose in quotes when using wildcards; see [ignore patterns](config-file.md#ignore-patterns))
`template` | If false, disable templating for this dotfile (defaults to the value of `template_dotfile_default`)
`trans_read` | Transformation key to apply when installing this dotfile (must be defined in the **trans_read** entry below; see [transformations](config-transformations.md))
`trans_write` | Transformation key to apply when updating this dotfile (must be defined in the **trans_write** entry below; see [transformations](config-transformations.md))
`trans_install` | Transformation key to apply when installing this dotfile (must be defined in the **trans_install** entry below; see [transformations](config-transformations.md))
`trans_update` | Transformation key to apply when updating this dotfile (must be defined in the **trans_update** entry below; see [transformations](config-transformations.md))
`upignore` | List of patterns to ignore when updating (enclose in quotes when using wildcards; see [ignore patterns](config-file.md#ignore-patterns))
<s>link_children</s> | Replaced by `link: link_children`
<s>trans</s> | Replaced by `trans_read`
<s>trans</s> | Replaced by `trans_install`
<s>trans_read</s> | Replaced by `trans_install`
<s>trans_write</s> | Replaced by `trans_update`
```yaml
<dotfile-key-name>:
@@ -37,8 +39,8 @@ Entry | Description
- <action-key>
template: (true|false)
chmod: '<file-permissions>'
trans_read: <transformation-key>
trans_write: <transformation-key>
trans_install: <transformation-key>
trans_update: <transformation-key>
```
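For illustration, here is the same dotfile entry written with the deprecated transformation keys listed above and with their replacements (key and transformation names are made up):
```yaml
# before (deprecated keys)
f_abc:
  src: abc
  dst: ~/.abc
  trans_read: decrypt
  trans_write: encrypt
---
# after (updated keys)
f_abc:
  src: abc
  dst: ~/.abc
  trans_install: decrypt
  trans_update: encrypt
```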
## Dotfile actions


@@ -91,17 +91,17 @@ dotfiles:
dst: ~/dir
chmod: 744
f_preserve:
src: preserve
dst: ~/preserve
src: pfile
dst: ~/pfile
chmod: preserve
```
The `chmod` value defines the file permissions in octal notation to apply on dotfiles. If undefined
The `chmod` value defines the file permissions in octal notation to apply to the dotfile. If undefined
new files will get the system default permissions (see `umask`: `777-<umask>` for directories and
`666-<umask>` for files; e.g., with a umask of `022`, that is `755` and `644` respectively).
The special keyword `preserve` ensures that if the dotfile already exists
on the filesystem, it is not altered during `install` and the `chmod` value won't
on the filesystem, its permission is not altered during `install` and the `chmod` config value won't
be changed during `update`.
On `import`, the following rules are applied:


@@ -14,14 +14,14 @@ For examples of transformation uses, see:
There are two types of transformations available:
* **Read transformations**: used to transform dotfiles before they are installed ([config](config-config.md) key `trans_read`)
* **Install transformations**: used to transform dotfiles before they are installed ([config](config-config.md) key `trans_install`)
* Used for commands `install` and `compare`
* They have two mandatory arguments:
* **{0}** will be replaced with the dotfile to process
* **{1}** will be replaced with a temporary file to store the result of the transformation
* This happens **before** the dotfile is templated (see [templating](../template/templating.md))
* **Write transformations**: used to transform files before updating a dotfile ([config](config-config.md) key `trans_write`)
* **Update/Import transformations**: used to transform files before updating/importing a dotfile ([config](config-config.md) key `trans_update`)
* Used for commands `update` and `import`
* They have two mandatory arguments:
* **{0}** will be replaced with the file path to update the dotfile with
@@ -36,13 +36,13 @@ Transformations also support additional positional arguments that must start fro
For example:
```yaml
trans_read:
trans_install:
targ: echo "$(basename {0}); {{@@ _dotfile_key @@}}; {2}; {3}" > {1}
dotfiles:
f_abc:
dst: /tmp/abc
src: abc
trans_read: targ "{{@@ profile @@}}" lastarg
trans_install: targ "{{@@ profile @@}}" lastarg
profiles:
p1:
dotfiles:
@@ -51,21 +51,21 @@ profiles:
will result in `abc; f_abc; p1; lastarg` (here `{2}` receives the templated profile name `p1` and `{3}` the literal `lastarg`).
## trans_read entry
## trans_install entry
The **trans_read** entry (optional) contains a transformations mapping (See [transformations](config-transformations.md)).
The **trans_install** entry (optional) contains a transformations mapping (See [transformations](config-transformations.md)).
```yaml
trans_read:
trans_install:
<trans-key>: <command-to-execute>
```
## trans_write entry
## trans_update entry
The **trans_write** entry (optional) contains a write transformations mapping (See [transformations](config-transformations.md)).
The **trans_update** entry (optional) contains a write transformations mapping (See [transformations](config-transformations.md)).
```yaml
trans_write:
trans_update:
<trans-key>: <command-to-execute>
```
@@ -77,10 +77,10 @@ and [template variables](../template/template-variables.md#template-variables)).
A very dumb example:
```yaml
trans_read:
trans_install:
r_echo_abs_src: echo "{0}: {{@@ _dotfile_abs_src @@}}" > {1}
r_echo_var: echo "{0}: {{@@ r_var @@}}" > {1}
trans_write:
trans_update:
w_echo_key: echo "{0}: {{@@ _dotfile_key @@}}" > {1}
w_echo_var: echo "{0}: {{@@ w_var @@}}" > {1}
variables:
@@ -90,11 +90,11 @@ dotfiles:
f_abc:
dst: ${tmpd}/abc
src: abc
trans_read: r_echo_abs_src
trans_write: w_echo_key
trans_install: r_echo_abs_src
trans_update: w_echo_key
f_def:
dst: ${tmpd}/def
src: def
trans_read: r_echo_var
trans_write: w_echo_var
trans_install: r_echo_var
trans_update: w_echo_var
```


@@ -37,9 +37,9 @@ First you need to define the encryption/decryption methods, for example
```yaml
variables:
keyid: "11223344"
trans_read:
trans_install:
_decrypt: "gpg -q --for-your-eyes-only--no-tty -d {0} > {1}"
trans_write:
trans_update:
_encrypt: "gpg -q -r {{@@ keyid @@}} --armor --no-tty -o {1} -e {0}"
```
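A hypothetical dotfile entry wiring these transformations (the key and paths are made up for illustration):
```yaml
dotfiles:
  f_secret:
    src: secret
    dst: ~/.secret
    trans_install: _decrypt
    trans_update: _encrypt
```
On `install` and `compare`, the stored file is decrypted with `_decrypt`; on `import` and `update`, it is encrypted with `_encrypt` before being written to the dotpath.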
@@ -60,17 +60,17 @@ Using GPG keys:
```yaml
variables:
keyid: "11223344"
trans_read:
trans_install:
_decrypt: "gpg -q --for-your-eyes-only--no-tty -d {0} > {1}"
trans_write:
trans_update:
_encrypt: "gpg -q -r {{@@ keyid @@}} --armor --no-tty -o {1} -e {0}"
```
Passphrase is stored in an environment variable:
```yaml
trans_read:
trans_install:
_decrypt: "echo {{@@ env['THE_KEY'] @@}} | gpg -q --batch --yes --for-your-eyes-only --passphrase-fd 0 --no-tty -d {0} > {1}"
trans_write:
trans_update:
_encrypt: "echo {{@@ env['THE_KEY'] @@}} | gpg -q --batch --yes --passphrase-fd 0 --no-tty -o {1} -c {0}"
```
@@ -78,9 +78,9 @@ Passphrase is stored as a variable:
```yaml
variables:
gpg_password: "some password"
trans_read:
trans_install:
_decrypt: "echo {{@@ gpg_password @@}} | gpg -q --batch --yes --for-your-eyes-only --passphrase-fd 0 --no-tty -d {0} > {1}"
trans_write:
trans_update:
_encrypt: "echo {{@@ gpg_password @@}} | gpg -q --batch --yes --passphrase-fd 0 --no-tty -o {1} -c {0}"
```
@@ -88,9 +88,9 @@ Passphrase is retrieved using a script:
```yaml
dynvariables:
gpg_password: "./get-password.sh"
trans_read:
trans_install:
_decrypt: "echo {{@@ gpg_password @@}} | gpg -q --batch --yes --for-your-eyes-only --passphrase-fd 0 --no-tty -d {0} > {1}"
trans_write:
trans_update:
_encrypt: "echo {{@@ gpg_password @@}} | gpg -q --batch --yes --passphrase-fd 0 --no-tty -o {1} -c {0}"
```
@@ -100,9 +100,9 @@ variables:
gpg_password_file: "/tmp/the-password"
dynvariables:
gpg_password: "cat {{@@ gpg_password_file @@}}"
trans_read:
trans_install:
_decrypt: "echo {{@@ gpg_password @@}} | gpg -q --batch --yes --for-your-eyes-only --passphrase-fd 0 --no-tty -d {0} > {1}"
trans_write:
trans_update:
_encrypt: "echo {{@@ gpg_password @@}} | gpg -q --batch --yes --passphrase-fd 0 --no-tty -o {1} -c {0}"
```


@@ -1,13 +1,13 @@
# Handle compressed directories
This is an example of how to use transformations (`trans_read` and `trans_write`) to store
This is an example of how to use transformations (`trans_install` and `trans_update`) to store
compressed directories and deploy them with dotdrop.
Start by defining the transformations:
```yaml
trans_read:
trans_install:
uncompress: "mkdir -p {1} && tar -xf {0} -C {1}"
trans_write:
trans_update:
compress: "tar -cf {1} -C {0} ."
```
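A hypothetical dotfile entry using these transformations (key name and paths are illustrative, assuming the directory is stored as a tarball named `vim.tar` in the dotpath):
```yaml
dotfiles:
  f_vim:
    src: vim.tar
    dst: ~/.vim
    trans_install: uncompress
    trans_update: compress
```
On install, `uncompress` unpacks the stored tarball before it is deployed to `~/.vim`; on update, `compress` re-creates the tarball from the directory's current content.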

docs/usage.md

@@ -235,6 +235,21 @@ dotdrop. It will:
For more options, see the usage with `dotdrop --help`.
## Uninstall dotfiles
The `uninstall` command removes dotfiles installed by dotdrop
```bash
$ dotdrop uninstall
```
It will remove the installed dotfiles related to the provided key
(or all dotfiles if not provided) of the selected profile.
If a backup exists ([backup entry](config/config-config.md#backup-entry)),
the file will be restored.
For more options, see the usage with `dotdrop --help`.
## Concurrency
The command line switch `-w`/`--workers`, if set to a value greater than one, enables the use


@@ -69,23 +69,23 @@ class CfgAggregator:
return self.cfgyaml.del_dotfile_from_profile(dotfile.key, profile.key)
def new_dotfile(self, src, dst, link, chmod=None,
trans_read=None, trans_write=None):
trans_install=None, trans_update=None):
"""
import a new dotfile
@src: path in dotpath
@dst: path in FS
@link: LinkType
@chmod: file permission
@trans_read: read transformation
@trans_write: write transformation
@trans_install: read transformation
@trans_update: write transformation
"""
dst = self.path_to_dotfile_dst(dst)
dotfile = self.get_dotfile_by_src_dst(src, dst)
if not dotfile:
# add the dotfile
dotfile = self._create_new_dotfile(src, dst, link, chmod=chmod,
trans_read=trans_read,
trans_write=trans_write)
trans_install=trans_install,
trans_update=trans_update)
if not dotfile:
return False
@@ -237,25 +237,25 @@ class CfgAggregator:
########################################################
def _create_new_dotfile(self, src, dst, link, chmod=None,
trans_read=None, trans_write=None):
trans_install=None, trans_update=None):
"""create a new dotfile"""
# get a new dotfile with a unique key
key = self._get_new_dotfile_key(dst)
self.log.dbg(f'new dotfile key: {key}')
# add the dotfile
trans_r_key = trans_w_key = None
if trans_read:
trans_r_key = trans_read.key
if trans_write:
trans_w_key = trans_write.key
trans_install_key = trans_update_key = None
if trans_install:
trans_install_key = trans_install.key
if trans_update:
trans_update_key = trans_update.key
if not self.cfgyaml.add_dotfile(key, src, dst, link,
chmod=chmod,
trans_r_key=trans_r_key,
trans_w_key=trans_w_key):
trans_install_key=trans_install_key,
trans_update_key=trans_update_key):
return None
return Dotfile(key, dst, src,
trans_r=trans_read,
trans_w=trans_write)
trans_install=trans_install,
trans_update=trans_update)
########################################################
# parsing
@@ -297,15 +297,15 @@ class CfgAggregator:
self.actions = Action.parse_dict(self.cfgyaml.actions)
debug_list('actions', self.actions, self.debug)
# trans_r
self.log.dbg('parsing trans_r')
self.trans_r = Transform.parse_dict(self.cfgyaml.trans_r)
debug_list('trans_r', self.trans_r, self.debug)
# trans_install
self.log.dbg('parsing trans_install')
self.trans_install = Transform.parse_dict(self.cfgyaml.trans_install)
debug_list('trans_install', self.trans_install, self.debug)
# trans_w
self.log.dbg('parsing trans_w')
self.trans_w = Transform.parse_dict(self.cfgyaml.trans_w)
debug_list('trans_w', self.trans_w, self.debug)
# trans_update
self.log.dbg('parsing trans_update')
self.trans_update = Transform.parse_dict(self.cfgyaml.trans_update)
debug_list('trans_update', self.trans_update, self.debug)
# variables
self.log.dbg('parsing variables')
@@ -334,14 +334,17 @@ class CfgAggregator:
msg = f'default actions: {self.settings.default_actions}'
self.log.dbg(msg)
# patch trans_w/trans_r in dotfiles
# patch trans_install in dotfiles
trans_inst_args = self._get_trans_update_args(self.get_trans_install)
self._patch_keys_to_objs(self.dotfiles,
"trans_r",
self._get_trans_w_args(self.get_trans_r),
CfgYaml.key_trans_install,
trans_inst_args,
islist=False)
# patch trans_update in dotfiles
trans_update_args = self._get_trans_update_args(self.get_trans_update)
self._patch_keys_to_objs(self.dotfiles,
"trans_w",
self._get_trans_w_args(self.get_trans_w),
CfgYaml.key_trans_update,
trans_update_args,
islist=False)
self.log.dbg('done parsing cfgyaml into cfg_aggregator')
@@ -542,7 +545,7 @@ class CfgAggregator:
action = self._get_action(key)
return action
def _get_trans_w_args(self, getter):
def _get_trans_update_args(self, getter):
"""return transformation by key with the arguments"""
def getit(key):
fields = shlex.split(key)
@@ -557,16 +560,16 @@ class CfgAggregator:
return trans
return getit
def get_trans_r(self, key):
"""return the trans_r with this key"""
def get_trans_install(self, key):
"""return the trans_install with this key"""
try:
return next(x for x in self.trans_r if x.key == key)
return next(x for x in self.trans_install if x.key == key)
except StopIteration:
return None
def get_trans_w(self, key):
"""return the trans_w with this key"""
def get_trans_update(self, key):
"""return the trans_update with this key"""
try:
return next(x for x in self.trans_w if x.key == key)
return next(x for x in self.trans_update if x.key == key)
except StopIteration:
return None


@@ -11,8 +11,8 @@ the upper layer:
* self.dotfiles
* self.profiles
* self.actions
* self.trans_r
* self.trans_w
* self.trans_install
* self.trans_update
* self.variables
Additionally a few methods are exported.
@@ -50,9 +50,11 @@ class CfgYaml:
key_dotfiles = 'dotfiles'
key_profiles = 'profiles'
key_actions = 'actions'
old_key_trans_r = 'trans'
key_trans_r = 'trans_read'
key_trans_w = 'trans_write'
old_key_trans = 'trans'
old_key_trans_r = 'trans_read'
old_key_trans_w = 'trans_write'
key_trans_install = 'trans_install'
key_trans_update = 'trans_update'
key_variables = 'variables'
key_dvariables = 'dynvariables'
key_uvariables = 'uservariables'
@@ -146,8 +148,8 @@ class CfgYaml:
self.dotfiles = {}
self.profiles = {}
self.actions = {}
self.trans_r = {}
self.trans_w = {}
self.trans_install = {}
self.trans_update = {}
self.variables = {}
if not os.path.exists(self._path):
@@ -248,10 +250,10 @@ class CfgYaml:
self.dotfiles = self._parse_blk_dotfiles(self._yaml_dict)
# parse the "actions" block
self.actions = self._parse_blk_actions(self._yaml_dict)
# parse the "trans_r" block
self.trans_r = self._parse_blk_trans_r(self._yaml_dict)
# parse the "trans_w" block
self.trans_w = self._parse_blk_trans_w(self._yaml_dict)
# parse the "trans_install" block
self.trans_install = self._parse_blk_trans_install(self._yaml_dict)
# parse the "trans_update" block
self.trans_update = self._parse_blk_trans_update(self._yaml_dict)
##################################################
# import elements
@@ -427,7 +429,7 @@ class CfgYaml:
return True
def add_dotfile(self, key, src, dst, link, chmod=None,
trans_r_key=None, trans_w_key=None):
trans_install_key=None, trans_update_key=None):
"""add a new dotfile"""
if key in self.dotfiles.keys():
return False
@@ -438,8 +440,8 @@ class CfgYaml:
self._dbg(f'new dotfile link: {link}')
if chmod:
self._dbg(f'new dotfile chmod: {chmod:o}')
self._dbg(f'new dotfile trans_r: {trans_r_key}')
self._dbg(f'new dotfile trans_w: {trans_w_key}')
self._dbg(f'new dotfile trans_install: {trans_install_key}')
self._dbg(f'new dotfile trans_update: {trans_update_key}')
# create the dotfile dict
df_dict = {
@@ -456,11 +458,11 @@ class CfgYaml:
if chmod:
df_dict[self.key_dotfile_chmod] = str(format(chmod, 'o'))
# trans_r/trans_w
if trans_r_key:
df_dict[self.key_trans_r] = str(trans_r_key)
if trans_w_key:
df_dict[self.key_trans_w] = str(trans_w_key)
# trans_install/trans_update
if trans_install_key:
df_dict[self.key_trans_install] = str(trans_install_key)
if trans_update_key:
df_dict[self.key_trans_update] = str(trans_update_key)
if self._debug:
self._dbg(f'dotfile dict: {df_dict}')
@@ -618,30 +620,25 @@ class CfgYaml:
self._debug_dict('actions block', actions)
return actions
def _parse_blk_trans_r(self, dic):
"""parse the "trans_r" block"""
key = self.key_trans_r
if self.old_key_trans_r in dic:
msg = '\"trans\" is deprecated, please use \"trans_read\"'
self._log.warn(msg)
dic[self.key_trans_r] = dic[self.old_key_trans_r]
del dic[self.old_key_trans_r]
trans_r = self._get_entry(dic, key, mandatory=False)
if trans_r:
trans_r = trans_r.copy()
def _parse_blk_trans_install(self, dic):
"""parse the "trans_install" block"""
trans_install = self._get_entry(dic, self.key_trans_install,
mandatory=False)
if trans_install:
trans_install = trans_install.copy()
if self._debug:
self._debug_dict('trans_r block', trans_r)
return trans_r
self._debug_dict('trans_install block', trans_install)
return trans_install
def _parse_blk_trans_w(self, dic):
"""parse the "trans_w" block"""
trans_w = self._get_entry(dic, self.key_trans_w,
mandatory=False)
if trans_w:
trans_w = trans_w.copy()
def _parse_blk_trans_update(self, dic):
"""parse the "trans_update" block"""
trans_update = self._get_entry(dic, self.key_trans_update,
mandatory=False)
if trans_update:
trans_update = trans_update.copy()
if self._debug:
self._debug_dict('trans_w block', trans_w)
return trans_w
self._debug_dict('trans_update block', trans_update)
return trans_update
def _parse_blk_variables(self, dic):
"""parse the "variables" block"""
@@ -817,6 +814,7 @@ class CfgYaml:
if not dotfiles:
return dotfiles
new = {}
for k, val in dotfiles.items():
if self.key_dotfile_src not in val:
# add 'src' as key' if not present
@@ -825,14 +823,6 @@ class CfgYaml:
else:
new[k] = val
if self.old_key_trans_r in val:
# fix deprecated trans key
msg = f'{k} \"trans\" is deprecated, please use \"trans_read\"'
self._log.warn(msg)
val[self.key_trans_r] = val[self.old_key_trans_r]
del val[self.old_key_trans_r]
new[k] = val
if self.key_dotfile_link not in val:
# apply link value if undefined
value = self.settings[self.key_settings_link_dotfile_default]
@@ -1108,8 +1098,10 @@ class CfgYaml:
self.profiles = self._merge_dict(self.profiles, sub.profiles,
deep=True)
self.actions = self._merge_dict(self.actions, sub.actions)
self.trans_r = self._merge_dict(self.trans_r, sub.trans_r)
self.trans_w = self._merge_dict(self.trans_w, sub.trans_w)
self.trans_install = self._merge_dict(self.trans_install,
sub.trans_install)
self.trans_update = self._merge_dict(self.trans_update,
sub.trans_update)
self._clear_profile_vars(sub.variables)
self.imported_configs.append(path)
@@ -1189,6 +1181,54 @@ class CfgYaml:
return
self._fix_deprecated_link_by_default(yamldict)
self._fix_deprecated_dotfile_link(yamldict)
self._fix_deprecated_trans(yamldict)
def _fix_deprecated_trans_in_dict(self, yamldic):
# trans -> trans_install
old_key = self.old_key_trans
new_key = self.key_trans_install
if old_key in yamldic:
yamldic[new_key] = yamldic[old_key]
del yamldic[old_key]
msg = f'deprecated \"{old_key}\"'
msg += f', updated to \"{new_key}\"'
self._log.warn(msg)
self._dirty = True
self._dirty_deprecated = True
# trans_read -> trans_install
old_key = self.old_key_trans_r
new_key = self.key_trans_install
if old_key in yamldic:
yamldic[new_key] = yamldic[old_key]
del yamldic[old_key]
warn = f'deprecated \"{old_key}\"'
warn += f', updated to \"{new_key}\"'
self._log.warn(warn)
self._dirty = True
self._dirty_deprecated = True
# trans_write -> trans_update
old_key = self.old_key_trans_w
new_key = self.key_trans_update
if old_key in yamldic:
yamldic[new_key] = yamldic[old_key]
del yamldic[old_key]
warn = f'deprecated \"{old_key}\"'
warn += f', updated to \"{new_key}\"'
self._log.warn(warn)
self._dirty = True
self._dirty_deprecated = True
def _fix_deprecated_trans(self, yamldict):
"""fix deprecated trans key"""
# top ones
self._fix_deprecated_trans_in_dict(yamldict)
# dotfiles ones
if self.key_dotfiles in yamldict and yamldict[self.key_dotfiles]:
config = yamldict[self.key_dotfiles]
for _, val in config.items():
self._fix_deprecated_trans_in_dict(val)
def _fix_deprecated_link_by_default(self, yamldict):
"""fix deprecated link_by_default"""
@@ -1786,8 +1826,8 @@ class CfgYaml:
self._debug_dict('entry dotfiles', self.dotfiles)
self._debug_dict('entry profiles', self.profiles)
self._debug_dict('entry actions', self.actions)
self._debug_dict('entry trans_r', self.trans_r)
self._debug_dict('entry trans_w', self.trans_w)
self._debug_dict('entry trans_install', self.trans_install)
self._debug_dict('entry trans_update', self.trans_update)
self._debug_dict('entry variables', self.variables)
def _debug_dict(self, title, elems):


@@ -16,6 +16,7 @@ from dotdrop.options import Options
from dotdrop.logger import Logger
from dotdrop.templategen import Templategen
from dotdrop.installer import Installer
from dotdrop.uninstaller import Uninstaller
from dotdrop.updater import Updater
from dotdrop.comparator import Comparator
from dotdrop.importer import Importer
@@ -120,9 +121,10 @@ def _dotfile_compare(opts, dotfile, tmp):
# apply transformation
tmpsrc = None
if dotfile.trans_r:
if dotfile.trans_install:
LOG.dbg('applying transformation before comparing')
tmpsrc = apply_trans(opts.dotpath, dotfile, templ, debug=opts.debug)
tmpsrc = apply_install_trans(opts.dotpath, dotfile,
templ, debug=opts.debug)
if not tmpsrc:
# could not apply trans
return False
@@ -238,8 +240,9 @@ def _dotfile_install(opts, dotfile, tmpdir=None):
# nolink
src = dotfile.src
tmp = None
if dotfile.trans_r:
tmp = apply_trans(opts.dotpath, dotfile, templ, debug=opts.debug)
if dotfile.trans_install:
tmp = apply_install_trans(opts.dotpath, dotfile,
templ, debug=opts.debug)
if not tmp:
return False, dotfile.key, None
src = tmp
@@ -538,8 +541,8 @@ def cmd_importer(opts):
import_as=opts.import_as,
import_link=opts.import_link,
import_mode=opts.import_mode,
import_transw=opts.import_transw,
import_transr=opts.import_transr)
trans_install=opts.import_trans_install,
trans_update=opts.import_trans_update)
if tmpret < 0:
ret = False
elif tmpret > 0:
@@ -618,6 +621,47 @@ def cmd_detail(opts):
LOG.log('')
def cmd_uninstall(opts):
"""uninstall"""
dotfiles = opts.dotfiles
keys = opts.uninstall_key
if keys:
# update only specific keys for this profile
dotfiles = []
for key in uniq_list(keys):
dotfile = opts.conf.get_dotfile(key)
if dotfile:
dotfiles.append(dotfile)
if not dotfiles:
msg = f'no dotfile to uninstall for this profile (\"{opts.profile}\")'
LOG.warn(msg)
return False
if opts.debug:
lfs = [k.key for k in dotfiles]
LOG.dbg(f'dotfiles registered for uninstall: {lfs}')
uninst = Uninstaller(base=opts.dotpath,
workdir=opts.workdir,
dry=opts.dry,
safe=opts.safe,
debug=opts.debug,
backup_suffix=opts.install_backup_suffix)
uninstalled = 0
for dotf in dotfiles:
res, msg = uninst.uninstall(dotf.src,
dotf.dst,
dotf.link)
if not res:
LOG.err(msg)
continue
uninstalled += 1
LOG.log(f'\n{uninstalled} dotfile(s) uninstalled.')
return True
def cmd_remove(opts):
"""remove dotfile from dotpath and from config"""
paths = opts.remove_path
@@ -773,19 +817,20 @@ def _select(selections, dotfiles):
return selected
def apply_trans(dotpath, dotfile, templater, debug=False):
def apply_install_trans(dotpath, dotfile, templater, debug=False):
"""
apply the read transformation to the dotfile
apply the install transformation to the dotfile
return None if fails and new source if succeed
"""
src = dotfile.src
new_src = f'{src}.{TRANS_SUFFIX}'
trans = dotfile.trans_r
LOG.dbg(f'executing transformation: {trans}')
trans = dotfile.trans_install
LOG.dbg(f'executing install transformation: {trans}')
srcpath = os.path.join(dotpath, src)
temp = os.path.join(dotpath, new_src)
if not trans.transform(srcpath, temp, templater=templater, debug=debug):
msg = f'transformation \"{trans.key}\" failed for {dotfile.key}'
msg = f'install transformation \"{trans.key}\"'
msg += f' failed for {dotfile.key}'
LOG.err(msg)
if new_src and os.path.exists(new_src):
removepath(new_src, LOG)
@@ -854,6 +899,12 @@ def _exec_command(opts):
LOG.dbg(f'running cmd: {command}')
cmd_remove(opts)
elif opts.cmd_uninstall:
# uninstall dotfile
command = 'uninstall'
LOG.dbg(f'running cmd: {command}')
cmd_uninstall(opts)
except UndefinedException as exc:
LOG.err(exc)
ret = False


@@ -14,12 +14,12 @@ class Dotfile(DictParser):
"""Represent a dotfile."""
# dotfile keys
key_noempty = 'ignoreempty'
key_trans_r = 'trans_read'
key_trans_w = 'trans_write'
key_trans_install = 'trans_install'
key_trans_update = 'trans_update'
key_template = 'template'
def __init__(self, key, dst, src,
actions=None, trans_r=None, trans_w=None,
actions=None, trans_install=None, trans_update=None,
link=LinkTypes.NOLINK, noempty=False,
cmpignore=None, upignore=None,
instignore=None, template=True, chmod=None,
@@ -30,8 +30,8 @@ class Dotfile(DictParser):
@dst: dotfile dst (in user's home usually)
@src: dotfile src (in dotpath)
@actions: dictionary of actions to execute for this dotfile
@trans_r: transformation to change dotfile before it is installed
@trans_w: transformation to change dotfile before updating it
@trans_install: transformation to change dotfile before it is installed
@trans_update: transformation to change dotfile before updating it
@link: link behavior
@noempty: ignore empty template if True
@upignore: patterns to ignore when updating
@@ -46,8 +46,8 @@ class Dotfile(DictParser):
self.link = LinkTypes.get(link)
self.noempty = noempty
self.src = src
self.trans_r = trans_r
self.trans_w = trans_w
self.trans_install = trans_install
self.trans_update = trans_update
self.upignore = upignore or []
self.cmpignore = cmpignore or []
self.instignore = instignore or []
@@ -57,14 +57,14 @@ class Dotfile(DictParser):
if self.link != LinkTypes.NOLINK and \
(
(trans_r and len(trans_r) > 0) or
(trans_w and len(trans_w) > 0)
(trans_install and len(trans_install) > 0) or
(trans_update and len(trans_update) > 0)
):
msg = f'[{key}] transformations disabled'
msg += ' because dotfile is linked'
self.log.warn(msg)
self.trans_r = []
self.trans_w = []
self.trans_install = []
self.trans_update = []
def get_dotfile_variables(self):
"""return this dotfile specific variables"""
@@ -83,25 +83,21 @@ class Dotfile(DictParser):
"""return all 'post' actions"""
return [a for a in self.actions if a.kind == Action.post]
def get_trans_r(self):
"""return trans_r object"""
return self.trans_r
def get_trans_install(self):
"""return trans_install object"""
return self.trans_install
def get_trans_w(self):
"""return trans_w object"""
return self.trans_w
def get_trans_update(self):
"""return trans_update object"""
return self.trans_update
@classmethod
def _adjust_yaml_keys(cls, value):
"""patch dict"""
value['noempty'] = value.get(cls.key_noempty, False)
value['trans_r'] = value.get(cls.key_trans_r)
value['trans_w'] = value.get(cls.key_trans_w)
value['template'] = value.get(cls.key_template, True)
# remove old entries
value.pop(cls.key_noempty, None)
value.pop(cls.key_trans_r, None)
value.pop(cls.key_trans_w, None)
return value
def __eq__(self, other):
@@ -116,6 +112,10 @@ class Dotfile(DictParser):
msg += f', dst:\"{self.dst}\"'
msg += f', link:\"{self.link}\"'
msg += f', template:{self.template}'
if self.trans_install:
msg += f', trans_install:{self.trans_install}'
if self.trans_update:
msg += f', trans_update:{self.trans_update}'
if self.chmod:
if isinstance(self.chmod, int) or len(self.chmod) == 3:
msg += f', chmod:{self.chmod:o}'
@@ -149,13 +149,13 @@ class Dotfile(DictParser):
for act in some:
out += f'\n{2*indent}- {act}'
out += f'\n{indent}trans_r:'
some = self.get_trans_r()
out += f'\n{indent}trans_install:'
some = self.get_trans_install()
if some:
out += f'\n{2*indent}- {some}'
out += f'\n{indent}trans_w:'
some = self.get_trans_w()
out += f'\n{indent}trans_update:'
some = self.get_trans_update()
if some:
out += f'\n{2*indent}- {some}'
return out


@@ -75,8 +75,8 @@ class Importer:
def import_path(self, path, import_as=None,
import_link=LinkTypes.NOLINK,
import_mode=False,
import_transw="",
import_transr=""):
trans_install="",
trans_update=""):
"""
import a dotfile pointed by path
returns:
@@ -90,24 +90,25 @@ class Importer:
self.log.err(f'\"{path}\" does not exist, ignored!')
return -1
# check transw if any
trans_write = None
trans_read = None
if import_transw:
trans_write = self.conf.get_trans_w(import_transw)
if import_transr:
trans_read = self.conf.get_trans_r(import_transr)
# check trans_update if any
tinstall = None
tupdate = None
if trans_install:
tinstall = self.conf.get_trans_install(trans_install)
if trans_update:
tupdate = self.conf.get_trans_update(trans_update)
return self._import(path, import_as=import_as,
import_link=import_link,
import_mode=import_mode,
trans_write=trans_write,
trans_read=trans_read)
trans_update=tupdate,
trans_install=tinstall)
def _import(self, path, import_as=None,
import_link=LinkTypes.NOLINK,
import_mode=False,
trans_write=None, trans_read=None):
trans_install=None,
trans_update=None):
"""
import path
returns:
@@ -162,17 +163,18 @@ class Importer:
self.log.dbg(f'import dotfile: src:{src} dst:{dst}')
if not self._import_to_dotpath(src, dst, trans_write=trans_write):
if not self._import_to_dotpath(src, dst, trans_update=trans_update):
return -1
return self._import_in_config(path, src, dst, perm, linktype,
import_mode,
trans_w=trans_write,
trans_r=trans_read)
trans_update=trans_update,
trans_install=trans_install)
def _import_in_config(self, path, src, dst, perm,
linktype, import_mode,
trans_r=None, trans_w=None):
trans_install=None,
trans_update=None):
"""
import path
returns:
@@ -190,8 +192,8 @@ class Importer:
# add file to config file
retconf = self.conf.new_dotfile(src, dst, linktype, chmod=chmod,
trans_read=trans_r,
trans_write=trans_w)
trans_install=trans_install,
trans_update=trans_update)
if not retconf:
self.log.warn(f'\"{path}\" ignored during import')
return 0
@@ -222,7 +224,7 @@ class Importer:
self.log.dbg('will overwrite existing file')
return True
def _import_to_dotpath(self, in_dotpath, in_fs, trans_write=None):
def _import_to_dotpath(self, in_dotpath, in_fs, trans_update=None):
"""
prepare hierarchy for dotfile in dotpath and copy file
"""
@@ -237,8 +239,8 @@ class Importer:
self.log.dry(f'would copy {in_fs} to {srcf}')
return True
# apply trans_w
in_fs = self._apply_trans_w(in_fs, trans_write)
# apply trans_update
in_fs = self._apply_trans_update(in_fs, trans_update)
if not in_fs:
# transformation failed
return False
@@ -290,7 +292,7 @@ class Importer:
return True
return False
def _apply_trans_w(self, path, trans):
def _apply_trans_update(self, path, trans):
"""
apply transformation to path on filesystem
returns


@@ -138,6 +138,12 @@ class Installer:
actionexec=actionexec,
noempty=noempty, ignore=ignore,
is_template=is_template)
ret, err = self._copy_dir(templater, src, dst,
actionexec=actionexec,
noempty=noempty, ignore=ignore,
is_template=is_template,
chmod=chmod)
if self.remove_existing_in_dir and ins:
self._remove_existing_in_dir(dst, ins)
else:
@@ -186,40 +192,57 @@ class Installer:
if self.dry:
return self._log_install(ret, err)
# handle chmod
# - on success (r, not err)
# - no change (not r, not err)
# but not when
# - error (not r, err)
# - aborted (not r, err)
# - special keyword "preserve"
self._apply_chmod_after_install(src, dst, ret, err,
chmod=chmod,
force_chmod=force_chmod,
linktype=linktype)
return self._log_install(ret, err)
def _apply_chmod_after_install(self, src, dst, ret, err,
chmod=None,
is_sub=False,
force_chmod=False,
linktype=LinkTypes.NOLINK):
"""
handle chmod after install
- on success (r, not err)
- no change (not r, not err)
but not when
- error (not r, err)
- aborted (not r, err)
- special keyword "preserve"
is_sub is used to specify if the file/dir is
part of a dotfile directory
"""
apply_chmod = linktype in [LinkTypes.NOLINK, LinkTypes.LINK_CHILDREN]
apply_chmod = apply_chmod and os.path.exists(dst)
apply_chmod = apply_chmod and (ret or (not ret and not err))
apply_chmod = apply_chmod and chmod != CfgYaml.chmod_ignore
if apply_chmod:
if not chmod:
chmod = get_file_perm(src)
self.log.dbg(f'applying chmod {chmod:o} to {dst}')
dstperms = get_file_perm(dst)
if dstperms != chmod:
# apply mode
msg = f'chmod {dst} to {chmod:o}'
if not force_chmod and self.safe and not self.log.ask(msg):
ret = False
err = 'aborted'
else:
if not self.comparing:
self.log.sub(f'chmod {dst} to {chmod:o}')
if chmodit(dst, chmod, debug=self.debug):
ret = True
else:
ret = False
err = 'chmod failed'
else:
if is_sub:
chmod = None
if not apply_chmod:
self.log.dbg('no chmod applied')
return self._log_install(ret, err)
return
if not chmod:
chmod = get_file_perm(src)
self.log.dbg(f'dotfile in dotpath perm: {chmod:o}')
self.log.dbg(f'applying chmod {chmod:o} to {dst}')
dstperms = get_file_perm(dst)
if dstperms != chmod:
# apply mode
msg = f'chmod {dst} to {chmod:o}'
if not force_chmod and self.safe and not self.log.ask(msg):
ret = False
err = 'aborted'
else:
if not self.comparing:
self.log.sub(f'chmod {dst} to {chmod:o}')
if chmodit(dst, chmod, debug=self.debug):
ret = True
else:
ret = False
err = 'chmod failed'
def install_to_temp(self, templater, tmpdir, src, dst,
is_template=True, chmod=None, ignore=None,
@@ -465,6 +488,8 @@ class Installer:
return False, 'aborted'
# remove symlink
if self.backup and not os.path.isdir(dst):
self._backup(dst)
overwrite = True
try:
removepath(dst)
@@ -551,6 +576,7 @@ class Installer:
content = None
if is_template:
# template the file
self.log.dbg(f'it is a template: {src}')
saved = templater.add_tmp_vars(self._get_tmp_file_vars(src, dst))
try:
content = templater.generate(src)
@@ -580,7 +606,8 @@ class Installer:
def _copy_dir(self, templater, src, dst,
actionexec=None, noempty=False,
ignore=None, is_template=True):
ignore=None, is_template=True,
chmod=None):
"""
install src to dst when is a directory
@@ -617,6 +644,9 @@ class Installer:
# error occured
return res, err, []
self._apply_chmod_after_install(fpath, fdst, ret, err,
chmod=chmod, is_sub=True)
if res:
# something got installed
@@ -720,6 +750,7 @@ class Installer:
if os.path.lexists(dst):
# file/symlink exists
self.log.dbg(f'file already exists on filesystem: {dst}')
try:
os.stat(dst)
except OSError as exc:
@@ -745,6 +776,8 @@ class Installer:
if self.backup:
self._backup(dst)
else:
self.log.dbg(f'file does not exist on filesystem: {dst}')
# create hierarchy
base = os.path.dirname(dst)


@@ -68,6 +68,7 @@ Usage:
dotdrop update [-VbfdkPz] [-c <path>] [-p <profile>]
[-w <nb>] [-i <pattern>...] [<path>...]
dotdrop remove [-Vbfdk] [-c <path>] [-p <profile>] [<path>...]
dotdrop uninstall [-Vbfd] [-c <path>] [-p <profile>] [<key>...]
dotdrop files [-VbTG] [-c <path>] [-p <profile>]
dotdrop detail [-Vb] [-c <path>] [-p <profile>] [<key>...]
dotdrop profiles [-VbG] [-c <path>]
@@ -93,8 +94,8 @@ Options:
-P --show-patch Provide a one-liner to manually patch template.
-R --remove-existing Remove existing file on install directory.
-s --as=<path> Import as a different path from actual path.
--transr=<key> Associate trans_read key on import.
--transw=<key> Apply trans_write key on import.
--transr=<key> Associate trans_install key on import.
--transw=<key> Apply trans_update key on import.
-t --temp Install to a temporary directory for review.
-T --template Only template dotfiles.
-V --verbose Be verbose.
@@ -320,8 +321,8 @@ class Options(AttrMonitor):
self.import_ignore.extend(self.impignore)
self.import_ignore.append(f'*{self.install_backup_suffix}')
self.import_ignore = uniq_list(self.import_ignore)
self.import_transw = self.args['--transw']
self.import_transr = self.args['--transr']
self.import_trans_install = self.args['--transr']
self.import_trans_update = self.args['--transw']
def _apply_args_update(self):
"""update specifics"""
@@ -342,6 +343,10 @@ class Options(AttrMonitor):
self.remove_path = self.args['<path>']
self.remove_iskey = self.args['--key']
def _apply_args_uninstall(self):
"""uninstall specifics"""
self.uninstall_key = self.args['<key>']
def _apply_args_detail(self):
"""detail specifics"""
self.detail_keys = self.args['<key>']
@@ -357,6 +362,7 @@ class Options(AttrMonitor):
self.cmd_update = self.args['update']
self.cmd_detail = self.args['detail']
self.cmd_remove = self.args['remove']
self.cmd_uninstall = self.args['uninstall']
# adapt attributes based on arguments
self.safe = not self.args['--force']
@@ -405,6 +411,9 @@ class Options(AttrMonitor):
# "remove" specifics
self._apply_args_remove()
# "uninstall" specifics
self._apply_args_uninstall()
def _fill_attr(self):
"""create attributes from conf"""
# defined variables

dotdrop/uninstaller.py (new file)

@@ -0,0 +1,150 @@
"""
author: deadc0de6 (https://github.com/deadc0de6)
Copyright (c) 2023, deadc0de6
handle the un-installation of dotfiles
"""
import os
from dotdrop.logger import Logger
from dotdrop.utils import removepath
class Uninstaller:
"""dotfile uninstaller"""
def __init__(self, base='.', workdir='~/.config/dotdrop',
dry=False, safe=True, debug=False,
backup_suffix='.dotdropbak'):
"""
@base: directory path where to search for templates
@workdir: where to install template before symlinking
@dry: just simulate
@debug: enable debug
@backup_suffix: suffix for dotfile backup file
@safe: ask for any action
"""
base = os.path.expanduser(base)
base = os.path.normpath(base)
self.base = base
workdir = os.path.expanduser(workdir)
workdir = os.path.normpath(workdir)
self.workdir = workdir
self.dry = dry
self.safe = safe
self.debug = debug
self.backup_suffix = backup_suffix
self.log = Logger(debug=self.debug)
def uninstall(self, src, dst, linktype):
"""
uninstall dst
@src: dotfile source path in dotpath
@dst: dotfile destination path in the FS
@linktype: linktypes.LinkTypes
return
- True, None : success
- False, error_msg : error
"""
if not src or not dst:
self.log.dbg(f'cannot uninstall empty {src} or {dst}')
return True, None
# ensure exists
path = os.path.expanduser(dst)
path = os.path.normpath(path)
path = path.rstrip(os.sep)
if not os.path.isfile(path) and not os.path.isdir(path):
msg = f'cannot uninstall special file {path}'
return False, msg
if not os.path.exists(path):
self.log.dbg(f'cannot uninstall non existing {path}')
return True, None
msg = f'uninstalling \"{path}\" (link: {linktype})'
self.log.dbg(msg)
ret, msg = self._remove(path)
if ret:
if not self.dry:
self.log.sub(f'uninstall {dst}')
return ret, msg
def _descend(self, dirpath):
ret = True
self.log.dbg(f'recursively uninstall {dirpath}')
for sub in os.listdir(dirpath):
subpath = os.path.join(dirpath, sub)
if os.path.isdir(subpath):
self.log.dbg(f'under {dirpath} uninstall dir {subpath}')
self._descend(subpath)
else:
self.log.dbg(f'under {dirpath} uninstall file {subpath}')
subret, _ = self._remove(subpath)
if not subret:
ret = False
if not os.listdir(dirpath):
# empty
self.log.dbg(f'remove empty dir {dirpath}')
if self.dry:
self.log.dry(f'would \"rm -r {dirpath}\"')
return True, ''
return self._remove_path(dirpath)
self.log.dbg(f'not removing non-empty dir {dirpath}')
return ret, ''
def _remove_path(self, path):
"""remove a file"""
try:
removepath(path, self.log)
except OSError as exc:
err = f'removing \"{path}\" failed: {exc}'
return False, err
return True, ''
def _remove(self, path):
"""remove path"""
self.log.dbg(f'handling uninstall of {path}')
if path.endswith(self.backup_suffix):
self.log.dbg(f'skip {path} ignored')
return True, ''
backup = f'{path}{self.backup_suffix}'
if os.path.exists(backup):
self.log.dbg(f'backup exists for {path}: {backup}')
return self._replace(path, backup)
self.log.dbg(f'no backup file for {path}')
if os.path.isdir(path):
self.log.dbg(f'{path} is a directory')
return self._descend(path)
if self.dry:
self.log.dry(f'would \"rm {path}\"')
return True, ''
msg = f'Remove {path}?'
if self.safe and not self.log.ask(msg):
return False, 'user refused'
self.log.dbg(f'removing {path}')
return self._remove_path(path)
def _replace(self, path, backup):
"""replace path by backup"""
if self.dry:
self.log.dry(f'would \"mv {backup} {path}\"')
return True, ''
msg = f'Restore {path} from {backup}?'
if self.safe and not self.log.ask(msg):
return False, 'user refused'
try:
self.log.dbg(f'mv {backup} {path}')
os.replace(backup, path)
except OSError as exc:
err = f'replacing \"{path}\" by \"{backup}\" failed: {exc}'
return False, err
return True, ''


@@ -122,7 +122,7 @@ class Updater:
return True
# apply write transformation if any
new_path = self._apply_trans_w(deployed_path, dotfile)
new_path = self._apply_trans_update(deployed_path, dotfile)
if not new_path:
return False
@@ -150,9 +150,9 @@ class Updater:
removepath(new_path, logger=self.log)
return ret
def _apply_trans_w(self, path, dotfile):
def _apply_trans_update(self, path, dotfile):
"""apply write transformation to dotfile"""
trans = dotfile.get_trans_w()
trans = dotfile.get_trans_update()
if not trans:
return path
self.log.dbg(f'executing write transformation {trans}')


@@ -175,6 +175,8 @@ def removepath(path, logger=None):
return
LOG.err(err)
raise OSError(err)
if logger:
logger.dbg(f'removing {path}')
try:
if os.path.islink(path) or os.path.isfile(path):
os.unlink(path)

manpage/dotdrop.1

@@ -105,11 +105,11 @@ Import as a different path from actual path.
.TP
.B
\fB--transr\fP=<key>
Associate trans_read key on import.
Associate trans_install key on import.
.TP
.B
\fB--transw\fP=<key>
Apply trans_write key on import.
Apply trans_update key on import.
.RE
.TP
.B


@@ -39,8 +39,8 @@ COMMANDS
-m --preserve-mode Insert a chmod entry in the dotfile with its mode.
-p --profile=<profile> Specify the profile to use.
-s --as=<path> Import as a different path from actual path.
--transr=<key> Associate trans_read key on import.
--transw=<key> Apply trans_write key on import.
--transr=<key> Associate trans_install key on import.
--transw=<key> Apply trans_update key on import.
compare Compare dotfiles
-C --file=<path> Path of dotfile to compare.


@@ -38,10 +38,12 @@ pyflakes --version
# checking for TODO/FIXME
echo "--------------------------------------"
echo "checking for TODO/FIXME"
grep -rv 'TODO\|FIXME' dotdrop/ >/dev/null 2>&1
grep -rv 'TODO\|FIXME' tests/ >/dev/null 2>&1
grep -rv 'TODO\|FIXME' tests-ng/ >/dev/null 2>&1
grep -rv 'TODO\|FIXME' scripts/ >/dev/null 2>&1
set +e
grep -r 'TODO\|FIXME' dotdrop/ && exit 1
grep -r 'TODO\|FIXME' tests/ && exit 1
grep -r 'TODO\|FIXME' tests-ng/ && exit 1
#grep -r 'TODO\|FIXME' scripts/ && exit 1
set -e
# checking for tests options
echo "---------------------------------"
@@ -111,7 +113,7 @@ done
# check other python scripts
echo "-----------------------------------------"
echo "checking other python scripts with pylint"
find . -name "*.py" -not -path "./dotdrop/*" | while read -r script; do
find . -name "*.py" -not -path "./dotdrop/*" -not -regex "\./\.?v?env/.*" | while read -r script; do
echo "checking ${script}"
pylint -sn \
--disable=R0914 \


@@ -12,4 +12,4 @@ if [ -n "${WORKERS}" ]; then
fi
mkdir -p coverages/
coverage run -p --data-file coverages/coverage -m pytest tests
coverage run -p --data-file coverages/coverage -m pytest tests -x


@@ -31,7 +31,10 @@ echo -e "$(tput setaf 6)==> RUNNING $(basename "${BASH_SOURCE[0]}") <==$(tput sg
# $2 path
grep_or_fail()
{
grep "${1}" "${2}" >/dev/null 2>&1 || (echo "pattern not found in ${2}" && exit 1)
if ! grep "${1}" "${2}" >/dev/null 2>&1; then
echo "pattern not found in ${2}"
exit 1
fi
}
# the action temp

tests-ng/backup.sh (new executable file)

@@ -0,0 +1,248 @@
#!/usr/bin/env bash
# author: deadc0de6 (https://github.com/deadc0de6)
# Copyright (c) 2023, deadc0de6
#
# test for backups
# returns 1 in case of error
#
## start-cookie
set -euo errtrace pipefail
cur=$(cd "$(dirname "${0}")" && pwd)
ddpath="${cur}/../"
PPATH="{PYTHONPATH:-}"
export PYTHONPATH="${ddpath}:${PPATH}"
altbin="python3 -m dotdrop.dotdrop"
if hash coverage 2>/dev/null; then
mkdir -p coverages/
altbin="coverage run -p --data-file coverages/coverage --source=dotdrop -m dotdrop.dotdrop"
fi
bin="${DT_BIN:-${altbin}}"
# shellcheck source=tests-ng/helpers
source "${cur}"/helpers
echo -e "$(tput setaf 6)==> RUNNING $(basename "${BASH_SOURCE[0]}") <==$(tput sgr0)"
## end-cookie
################################################################
# this is the test
################################################################
# $1 pattern
# $2 path
grep_or_fail()
{
if ! grep "${1}" "${2}" >/dev/null 2>&1; then
echo "pattern \"${1}\" not found in ${2}"
exit 1
fi
}
# the dotfile source
tmps=$(mktemp -d --suffix='-dotdrop-tests-dotpath' || mktemp -d)
mkdir -p "${tmps}"/dotfiles
# the dotfile destination
tmpd=$(mktemp -d --suffix='-dotdrop-tests-dst' || mktemp -d)
tmpw=$(mktemp -d --suffix='-dotdrop-workdir' || mktemp -d)
clear_on_exit "${tmps}"
clear_on_exit "${tmpd}"
clear_on_exit "${tmpw}"
clear_dotpath()
{
rm -rf "${tmps:?}"/dotfiles/*
}
create_dotpath()
{
# create the dotfiles in dotpath
echo "modified" > "${tmps}"/dotfiles/file
echo "{{@@ profile @@}}" > "${tmps}"/dotfiles/template
mkdir -p "${tmps}"/dotfiles/dir
echo "modified" > "${tmps}"/dotfiles/dir/sub
echo "{{@@ profile @@}}" > "${tmps}"/dotfiles/dir/template
mkdir -p "${tmps}"/dotfiles/tree
echo "modified" > "${tmps}"/dotfiles/tree/file
echo "{{@@ profile @@}}" > "${tmps}"/dotfiles/tree/template
mkdir -p "${tmps}"/dotfiles/tree/sub
echo "modified" > "${tmps}"/dotfiles/tree/sub/file
echo "{{@@ profile @@}}" > "${tmps}"/dotfiles/tree/sub/template
}
clear_fs()
{
rm -rf "${tmpd:?}"/*
}
create_fs()
{
# create the existing dotfiles in filesystem
echo "original" > "${tmpd}"/file
echo "original" > "${tmpd}"/template
mkdir -p "${tmpd}"/dir
echo "original" > "${tmpd}"/dir/sub
echo "original" > "${tmpd}"/dir/template
mkdir -p "${tmpd}"/tree
echo "original" > "${tmpd}"/tree/file
echo "original" > "${tmpd}"/tree/template
mkdir -p "${tmpd}"/tree/sub
echo "original" > "${tmpd}"/tree/sub/file
echo "original" > "${tmpd}"/tree/sub/template
}
# create the config file
cfg="${tmps}/config.yaml"
# $1: linktype
create_config()
{
link_default="${1}"
link_file="${1}"
link_dir="${1}"
if [ "${link_default}" = "link_children" ]; then
link_file="nolink"
fi
cat > "${cfg}" << _EOF
config:
backup: true
create: true
dotpath: dotfiles
link_dotfile_default: ${link_default}
workdir: ${tmpw}
dotfiles:
f_file:
dst: ${tmpd}/file
src: file
link: ${link_file}
f_template:
dst: ${tmpd}/template
src: template
link: ${link_file}
d_dir:
dst: ${tmpd}/dir
src: dir
link: ${link_dir}
d_tree:
dst: ${tmpd}/tree
src: tree
link: ${link_dir}
profiles:
p1:
dotfiles:
- f_file
- f_template
- d_dir
- d_tree
_EOF
#cat ${cfg}
}
# install nolink
pre="link:nolink"
create_config "nolink"
clear_dotpath
clear_fs
create_dotpath
create_fs
cd "${ddpath}" | ${bin} install -f -c "${cfg}" -p p1 --verbose
# checks
[ ! -e "${tmpd}"/file.dotdropbak ] && echo "${pre} file backup not found" && exit 1
[ ! -e "${tmpd}"/template.dotdropbak ] && echo "${pre} template backup not found" && exit 1
[ ! -e "${tmpd}"/dir/sub.dotdropbak ] && echo "${pre} dir sub backup not found" && exit 1
[ ! -e "${tmpd}"/dir/template.dotdropbak ] && echo "${pre} dir template backup not found" && exit 1
[ ! -e "${tmpd}"/tree/file.dotdropbak ] && echo "${pre} tree file backup not found" && exit 1
[ ! -e "${tmpd}"/tree/template.dotdropbak ] && echo "${pre} tree template backup not found" && exit 1
[ ! -e "${tmpd}"/tree/sub/file.dotdropbak ] && echo "${pre} tree sub file backup not found" && exit 1
[ ! -e "${tmpd}"/tree/sub/template.dotdropbak ] && echo "${pre} tree sub template backup not found" && exit 1
grep_or_fail original "${tmpd}"/file.dotdropbak
grep_or_fail original "${tmpd}"/template.dotdropbak
grep_or_fail original "${tmpd}"/dir/sub.dotdropbak
grep_or_fail original "${tmpd}"/dir/template.dotdropbak
grep_or_fail original "${tmpd}"/tree/file.dotdropbak
grep_or_fail original "${tmpd}"/tree/template.dotdropbak
grep_or_fail original "${tmpd}"/tree/sub/file.dotdropbak
grep_or_fail original "${tmpd}"/tree/sub/template.dotdropbak
grep_or_fail p1 "${tmpd}"/template
grep_or_fail modified "${tmpd}"/dir/sub
grep_or_fail p1 "${tmpd}"/dir/template
grep_or_fail modified "${tmpd}"/tree/file
grep_or_fail p1 "${tmpd}"/tree/template
grep_or_fail modified "${tmpd}"/tree/sub/file
grep_or_fail p1 "${tmpd}"/tree/sub/template
# install relative
pre="link:relative"
create_config "relative"
clear_dotpath
clear_fs
create_dotpath
create_fs
cd "${ddpath}" | ${bin} install -f -c "${cfg}" -p p1 --verbose
# checks
[ ! -e "${tmpd}"/file.dotdropbak ] && echo "${pre} file backup not found" && exit 1
[ ! -e "${tmpd}"/template.dotdropbak ] && echo "${pre} template backup not found" && exit 1
grep_or_fail original "${tmpd}"/file.dotdropbak
grep_or_fail original "${tmpd}"/template.dotdropbak
grep_or_fail p1 "${tmpd}"/template
grep_or_fail modified "${tmpd}"/dir/sub
grep_or_fail p1 "${tmpd}"/dir/template
grep_or_fail modified "${tmpd}"/tree/file
grep_or_fail p1 "${tmpd}"/tree/template
grep_or_fail modified "${tmpd}"/tree/sub/file
grep_or_fail p1 "${tmpd}"/tree/sub/template
# install absolute
pre="link:absolute"
create_config "absolute"
clear_dotpath
clear_fs
create_dotpath
create_fs
cd "${ddpath}" | ${bin} install -f -c "${cfg}" -p p1 --verbose
# checks
[ ! -e "${tmpd}"/file.dotdropbak ] && echo "${pre} file backup not found" && exit 1
[ ! -e "${tmpd}"/template.dotdropbak ] && echo "${pre} template backup not found" && exit 1
grep_or_fail original "${tmpd}"/file.dotdropbak
grep_or_fail original "${tmpd}"/template.dotdropbak
grep_or_fail p1 "${tmpd}"/template
grep_or_fail modified "${tmpd}"/dir/sub
grep_or_fail p1 "${tmpd}"/dir/template
grep_or_fail modified "${tmpd}"/tree/file
grep_or_fail p1 "${tmpd}"/tree/template
grep_or_fail modified "${tmpd}"/tree/sub/file
grep_or_fail p1 "${tmpd}"/tree/sub/template
# install link_children
pre="link:link_children"
create_config "link_children"
clear_dotpath
clear_fs
create_dotpath
create_fs
cd "${ddpath}" | ${bin} install -f -c "${cfg}" -p p1 --verbose
# checks
[ ! -e "${tmpd}"/file.dotdropbak ] && echo "${pre} file backup not found" && exit 1
[ ! -e "${tmpd}"/template.dotdropbak ] && echo "${pre} template backup not found" && exit 1
[ ! -e "${tmpd}"/dir/sub.dotdropbak ] && echo "${pre} dir sub backup not found" && exit 1
[ ! -e "${tmpd}"/dir/template.dotdropbak ] && echo "${pre} dir template backup not found" && exit 1
[ ! -e "${tmpd}"/tree/file.dotdropbak ] && echo "${pre} tree file backup not found" && exit 1
[ ! -e "${tmpd}"/tree/template.dotdropbak ] && echo "${pre} tree template backup not found" && exit 1
grep_or_fail original "${tmpd}"/file.dotdropbak
grep_or_fail original "${tmpd}"/template.dotdropbak
grep_or_fail original "${tmpd}"/dir/sub.dotdropbak
grep_or_fail original "${tmpd}"/dir/template.dotdropbak
grep_or_fail original "${tmpd}"/tree/file.dotdropbak
grep_or_fail original "${tmpd}"/tree/template.dotdropbak
grep_or_fail p1 "${tmpd}"/template
grep_or_fail modified "${tmpd}"/dir/sub
grep_or_fail p1 "${tmpd}"/dir/template
grep_or_fail modified "${tmpd}"/tree/file
grep_or_fail p1 "${tmpd}"/tree/template
grep_or_fail modified "${tmpd}"/tree/sub/file
grep_or_fail p1 "${tmpd}"/tree/sub/template
echo "OK"
exit 0

tests-ng/chmod-install-dir.sh (new executable file)

@@ -0,0 +1,103 @@
#!/usr/bin/env bash
# author: deadc0de6 (https://github.com/deadc0de6)
# Copyright (c) 2023, deadc0de6
#
# test chmod dir sub file on install
#
## start-cookie
set -euo errtrace pipefail
cur=$(cd "$(dirname "${0}")" && pwd)
ddpath="${cur}/../"
PPATH="{PYTHONPATH:-}"
export PYTHONPATH="${ddpath}:${PPATH}"
altbin="python3 -m dotdrop.dotdrop"
if hash coverage 2>/dev/null; then
mkdir -p coverages/
altbin="coverage run -p --data-file coverages/coverage --source=dotdrop -m dotdrop.dotdrop"
fi
bin="${DT_BIN:-${altbin}}"
# shellcheck source=tests-ng/helpers
source "${cur}"/helpers
echo -e "$(tput setaf 6)==> RUNNING $(basename "${BASH_SOURCE[0]}") <==$(tput sgr0)"
## end-cookie
################################################################
# this is the test
################################################################
# $1 path
# $2 rights
has_rights()
{
echo "testing ${1} is ${2}"
[ ! -e "$1" ] && echo "$(basename "$1") does not exist" && exit 1
local mode
mode=$(stat -L -c '%a' "$1")
[ "${mode}" != "$2" ] && echo "bad mode for $(basename "$1") (${mode} VS expected ${2})" && exit 1
true
}
# the dotfile source
tmps=$(mktemp -d --suffix='-dotdrop-tests' || mktemp -d)
mkdir -p "${tmps}"/dotfiles
# the dotfile destination
tmpd=$(mktemp -d --suffix='-dotdrop-tests' || mktemp -d)
#echo "dotfile destination: ${tmpd}"
clear_on_exit "${tmps}"
clear_on_exit "${tmpd}"
# create the config file
cfg="${tmps}/config.yaml"
cat > "${cfg}" << _EOF
config:
backup: true
create: true
dotpath: dotfiles
force_chmod: true
dotfiles:
d_dir:
src: dir
dst: ${tmpd}/dir
profiles:
p1:
dotfiles:
- d_dir
_EOF
#cat ${cfg}
mkdir -p "${tmps}"/dotfiles/dir
echo 'file1' > "${tmps}"/dotfiles/dir/file1
chmod 700 "${tmps}"/dotfiles/dir/file1
echo 'file2' > "${tmps}"/dotfiles/dir/file2
chmod 777 "${tmps}"/dotfiles/dir/file2
echo 'file3' > "${tmps}"/dotfiles/dir/file3
chmod 644 "${tmps}"/dotfiles/dir/file3
ls -l "${tmps}"/dotfiles/dir/
# install
echo "install (1)"
cd "${ddpath}" | ${bin} install -c "${cfg}" -f -p p1 -V
has_rights "${tmpd}/dir/file1" "700"
has_rights "${tmpd}/dir/file2" "777"
has_rights "${tmpd}/dir/file3" "644"
# modify
chmod 666 "${tmpd}/dir/file1"
chmod 666 "${tmpd}/dir/file2"
chmod 666 "${tmpd}/dir/file3"
# install
echo "install (2)"
cd "${ddpath}" | ${bin} install -c "${cfg}" -f -p p1 -V
has_rights "${tmpd}/dir/file1" "700"
has_rights "${tmpd}/dir/file2" "777"
has_rights "${tmpd}/dir/file3" "644"
echo "OK"
exit 0


@@ -61,42 +61,6 @@ clear_on_exit "${tmpd}"
# create the config file
cfg="${tmps}/config.yaml"
echo 'f777' > "${tmps}"/dotfiles/f777
chmod 700 "${tmps}"/dotfiles/f777
echo 'link' > "${tmps}"/dotfiles/link
chmod 777 "${tmps}"/dotfiles/link
mkdir -p "${tmps}"/dotfiles/dir
echo "f1" > "${tmps}"/dotfiles/dir/f1
echo "exists" > "${tmps}"/dotfiles/exists
chmod 644 "${tmps}"/dotfiles/exists
echo "exists" > "${tmpd}"/exists
chmod 644 "${tmpd}"/exists
echo "existslink" > "${tmps}"/dotfiles/existslink
chmod 777 "${tmps}"/dotfiles/existslink
chmod 644 "${tmpd}"/exists
mkdir -p "${tmps}"/dotfiles/direxists
echo "f1" > "${tmps}"/dotfiles/direxists/f1
mkdir -p "${tmpd}"/direxists
echo "f1" > "${tmpd}"/direxists/f1
chmod 644 "${tmpd}"/direxists/f1
chmod 744 "${tmpd}"/direxists
mkdir -p "${tmps}"/dotfiles/linkchildren
echo "f1" > "${tmps}"/dotfiles/linkchildren/f1
mkdir -p "${tmps}"/dotfiles/linkchildren/d1
echo "f2" > "${tmps}"/dotfiles/linkchildren/d1/f2
echo '{{@@ profile @@}}' > "${tmps}"/dotfiles/symlinktemplate
mkdir -p "${tmps}"/dotfiles/symlinktemplatedir
echo "{{@@ profile @@}}" > "${tmps}"/dotfiles/symlinktemplatedir/t
echo 'nomode' > "${tmps}"/dotfiles/nomode
cat > "${cfg}" << _EOF
config:
backup: true
@@ -170,6 +134,42 @@ profiles:
_EOF
#cat ${cfg}
# create the dotfiles
echo 'f777' > "${tmps}"/dotfiles/f777
chmod 700 "${tmps}"/dotfiles/f777
echo 'link' > "${tmps}"/dotfiles/link
chmod 777 "${tmps}"/dotfiles/link
mkdir -p "${tmps}"/dotfiles/dir
echo "f1" > "${tmps}"/dotfiles/dir/f1
echo "exists" > "${tmps}"/dotfiles/exists
chmod 644 "${tmps}"/dotfiles/exists
echo "exists" > "${tmpd}"/exists
chmod 644 "${tmpd}"/exists
echo "existslink" > "${tmps}"/dotfiles/existslink
chmod 777 "${tmps}"/dotfiles/existslink
chmod 644 "${tmpd}"/exists
mkdir -p "${tmps}"/dotfiles/direxists
echo "f1" > "${tmps}"/dotfiles/direxists/f1
mkdir -p "${tmpd}"/direxists
echo "f1" > "${tmpd}"/direxists/f1
chmod 644 "${tmpd}"/direxists/f1
chmod 744 "${tmpd}"/direxists
mkdir -p "${tmps}"/dotfiles/linkchildren
echo "f1" > "${tmps}"/dotfiles/linkchildren/f1
mkdir -p "${tmps}"/dotfiles/linkchildren/d1
echo "f2" > "${tmps}"/dotfiles/linkchildren/d1/f2
echo '{{@@ profile @@}}' > "${tmps}"/dotfiles/symlinktemplate
mkdir -p "${tmps}"/dotfiles/symlinktemplatedir
echo "{{@@ profile @@}}" > "${tmps}"/dotfiles/symlinktemplatedir/t
echo 'nomode' > "${tmps}"/dotfiles/nomode
# install
echo "first install round"
cd "${ddpath}" | ${bin} install -c "${cfg}" -f -p p1 -V

View File

@@ -34,19 +34,21 @@ echo "dotfiles source (dotpath): ${tmps}"
# the dotfile destination
tmpd=$(mktemp -d --suffix='-dotdrop-tests' || mktemp -d)
echo "dotfiles destination: ${tmpd}"
tmptmp=$(mktemp -d --suffix='-dotdrop-tests' || mktemp -d)
clear_on_exit "${tmps}"
clear_on_exit "${tmpd}"
clear_on_exit "${tmptmp}"
# create the config file
cfg="${tmps}/config.yaml"
cat > "${cfg}" << _EOF
trans_read:
trans_install:
  base64: "cat {0} | base64 -d > {1}"
  decompress: "mkdir -p {1} && tar -xf {0} -C {1}"
  decrypt: "echo {{@@ profile @@}} | gpg -q --batch --yes --passphrase-fd 0 --no-tty -d {0} > {1}"
trans_write:
trans_update:
  base64: "cat {0} | base64 > {1}"
  compress: "tar -cf {1} -C {0} ."
  encrypt: "echo {{@@ profile @@}} | gpg -q --batch --yes --passphrase-fd 0 --no-tty -o {1} -c {0}"
@@ -90,16 +92,16 @@ cd "${ddpath}" | ${bin} import -f -c "${cfg}" -p p1 -b -V --transw=encrypt --tra
# check content in dotpath
echo "checking content"
file "${tmps}"/dotfiles/"${tmpd}"/abc | grep -i 'text'
cat "${tmpd}"/abc | base64 > "${tmps}"/test-abc
diff "${tmps}"/dotfiles/"${tmpd}"/abc "${tmps}"/test-abc
cat "${tmpd}"/abc | base64 > "${tmptmp}"/test-abc
diff "${tmps}"/dotfiles/"${tmpd}"/abc "${tmptmp}"/test-abc
file "${tmps}"/dotfiles/"${tmpd}"/def | grep -i 'tar'
tar -cf "${tmps}"/test-def -C "${tmpd}"/def .
diff "${tmps}"/dotfiles/"${tmpd}"/def "${tmps}"/test-def
tar -cf "${tmptmp}"/test-def -C "${tmpd}"/def .
diff "${tmps}"/dotfiles/"${tmpd}"/def "${tmptmp}"/test-def
file "${tmps}"/dotfiles/"${tmpd}"/ghi | grep -i 'gpg symmetrically encrypted data\|PGP symmetric key encrypted data'
echo p1 | gpg -q --batch --yes --passphrase-fd 0 --no-tty -d "${tmps}"/dotfiles/"${tmpd}"/ghi > "${tmps}"/test-ghi
diff "${tmps}"/test-ghi "${tmpd}"/ghi
echo p1 | gpg -q --batch --yes --passphrase-fd 0 --no-tty -d "${tmps}"/dotfiles/"${tmpd}"/ghi > "${tmptmp}"/test-ghi
diff "${tmptmp}"/test-ghi "${tmpd}"/ghi
# check is imported in config
echo "checking imported in config"
@@ -108,33 +110,33 @@ cd "${ddpath}" | ${bin} -p p1 -c "${cfg}" files | grep '^f_abc'
cd "${ddpath}" | ${bin} -p p1 -c "${cfg}" files | grep '^d_def'
cd "${ddpath}" | ${bin} -p p1 -c "${cfg}" files | grep '^f_ghi'
# check has trans_write and trans_read in config
echo "checking trans_write is set in config"
# check has trans_update and trans_install in config
echo "checking trans_update is set in config"
echo "--------------"
cat "${cfg}"
echo "--------------"
cat "${cfg}" | grep -A 4 'f_abc:' | grep 'trans_write: base64'
cat "${cfg}" | grep -A 4 'd_def:' | grep 'trans_write: compress'
cat "${cfg}" | grep -A 4 'f_ghi:' | grep 'trans_write: encrypt'
cat "${cfg}" | grep -A 4 'f_abc:' | grep 'trans_update: base64'
cat "${cfg}" | grep -A 4 'd_def:' | grep 'trans_update: compress'
cat "${cfg}" | grep -A 4 'f_ghi:' | grep 'trans_update: encrypt'
cat "${cfg}" | grep -A 4 'f_abc:' | grep 'trans_read: base64'
cat "${cfg}" | grep -A 4 'd_def:' | grep 'trans_read: decompress'
cat "${cfg}" | grep -A 4 'f_ghi:' | grep 'trans_read: decrypt'
cat "${cfg}" | grep -A 4 'f_abc:' | grep 'trans_install: base64'
cat "${cfg}" | grep -A 4 'd_def:' | grep 'trans_install: decompress'
cat "${cfg}" | grep -A 4 'f_ghi:' | grep 'trans_install: decrypt'
# install these
echo "install and check"
rm "${tmpd}"/abc
rm -r "${tmpd}"/def
rm "${tmpd}"/ghi
rm -rf "${tmpd:?}"/*
cd "${ddpath}" | ${bin} install -f -c "${cfg}" -p p1 -b -V
# test exist
echo "check exist"
[ ! -e "${tmpd}"/abc ] && exit 1
[ ! -d "${tmpd}"/def/a ] && exit 1
[ ! -e "${tmpd}"/def/a/file ] && exit 1
[ ! -e "${tmpd}"/ghi ] && exit 1
cat "${cfg}"
tree "${tmpd}"
[ ! -e "${tmpd}"/abc ] && echo "${tmpd}/abc does not exist" && exit 1
[ ! -d "${tmpd}"/def/a ] && echo "${tmpd}/def/a does not exist" && exit 1
[ ! -e "${tmpd}"/def/a/file ] && echo "${tmpd}/def/a/file does not exist" && exit 1
[ ! -e "${tmpd}"/ghi ] && echo "${tmpd}/ghi does not exist" && exit 1
# test content
echo "check content"

View File

@@ -108,8 +108,10 @@ def run_tests(max_jobs=None, stop_on_first_err=True, with_spinner=True):
            failed += 1
            print()
            if stop_on_first_err:
                print(log_out)
                print(log_err)
            if log_out:
                print(log_out)
            if log_err:
                print(log_err)
            print(f'test \"{name}\" failed ({ret}): {reason}')
            if stop_on_first_err:
                ex.shutdown(wait=False)

303
tests-ng/uninstall.sh vendored Executable file
View File

@@ -0,0 +1,303 @@
#!/usr/bin/env bash
# author: deadc0de6 (https://github.com/deadc0de6)
# Copyright (c) 2023, deadc0de6
#
# test uninstall (no symlink)
# returns 1 in case of error
#
## start-cookie
set -eu -o errtrace -o pipefail
cur=$(cd "$(dirname "${0}")" && pwd)
ddpath="${cur}/../"
PPATH="{PYTHONPATH:-}"
export PYTHONPATH="${ddpath}:${PPATH}"
altbin="python3 -m dotdrop.dotdrop"
if hash coverage 2>/dev/null; then
mkdir -p coverages/
altbin="coverage run -p --data-file coverages/coverage --source=dotdrop -m dotdrop.dotdrop"
fi
bin="${DT_BIN:-${altbin}}"
# shellcheck source=tests-ng/helpers
source "${cur}"/helpers
echo -e "$(tput setaf 6)==> RUNNING $(basename "${BASH_SOURCE[0]}") <==$(tput sgr0)"
## end-cookie
################################################################
# this is the test
################################################################
# $1 pattern
# $2 path
grep_or_fail()
{
if ! grep "${1}" "${2}" >/dev/null 2>&1; then
echo "${PRE} pattern \"${1}\" not found in ${2}"
exit 1
fi
}
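# e.g.: grep_or_fail 'modified' "${tmpd}"/x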
# $1: basedir
# $2: content
create_hierarchy()
{
echo "${2}" > "${1}"/x
mkdir -p "${1}"/y
echo "${2}" > "${1}"/y/file
mkdir -p "${1}"/y/subdir
echo "${2}" > "${1}"/y/subdir/subfile
echo "profile: ${PRO_TEMPL}" > "${1}"/t
mkdir -p "${1}"/z
echo "profile t1: ${PRO_TEMPL}" > "${1}"/z/t1
echo "profile t2: ${PRO_TEMPL}" > "${1}"/z/t2
echo "${2}" > "${1}"/z/file
echo "trans:${PRO_TEMPL}" > "${1}"/trans
}
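# layout created under $1: x, t, trans, y/file, y/subdir/subfile, z/t1, z/t2, z/file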
# $1: basedir
clean_hierarchy()
{
rm -rf "${1:?}"/*
}
uninstall_with_link()
{
set -e
LINK_TYPE="${DOTDROP_TEST_NG_UNINSTALL_LINK_TYPE:-nolink}"
PRE="[link:${LINK_TYPE}] ERROR"
PRO_TEMPL="{{@@ profile @@}}"
DT_ARG="--verbose"
# dotdrop directory
basedir=$(mktemp -d --suffix='-dotdrop-tests' || mktemp -d)
mkdir -p "${basedir}"/dotfiles
echo "[+] dotdrop dir: ${basedir}"
echo "[+] dotpath dir: ${basedir}/dotfiles"
tmpd=$(mktemp -d --suffix='-dotdrop-tests' || mktemp -d)
tmpw=$(mktemp -d --suffix='-dotdrop-workdir' || mktemp -d)
clear_on_exit "${basedir}/dotfiles"
clear_on_exit "${tmpd}"
clear_on_exit "${tmpw}"
file_link="${LINK_TYPE}"
dir_link="${LINK_TYPE}"
if [ "${LINK_TYPE}" = "link_children" ]; then
file_link="absolute"
fi
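# link_children only makes sense for directories, so file dotfiles fall back to absolute links here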
# create the config file
cfg="${basedir}/config.yaml"
cat > "${cfg}" << _EOF
config:
  backup: true
  create: true
  dotpath: dotfiles
  link_dotfile_default: ${LINK_TYPE}
  workdir: ${tmpw}
dotfiles:
  f_x:
    src: x
    dst: ${tmpd}/x
    link: ${file_link}
  d_y:
    src: y
    dst: ${tmpd}/y
    link: ${dir_link}
  f_t:
    src: t
    dst: ${tmpd}/t
    link: ${file_link}
  d_z:
    src: z
    dst: ${tmpd}/z
    link: ${dir_link}
  f_trans:
    src: trans
    dst: ${tmpd}/trans
    link: ${file_link}
profiles:
  p1:
    dotfiles:
    - f_x
    - d_y
    - f_t
    - d_z
    - f_trans
_EOF
#########################
## no original
#########################
create_hierarchy "${basedir}/dotfiles" "modified"
# install
echo "[+] install (1)"
( \
cd "${ddpath}" && ${bin} install -c "${cfg}" -f -p p1 | grep '^4 dotfile(s) installed.$' \
)
# tests
[ ! -e "${tmpd}"/x ] && echo "${PRE} f_x not installed" && exit 1
[ ! -e "${tmpd}"/y/file ] && echo "${PRE} d_y not installed" && exit 1
[ ! -e "${tmpd}"/y/subdir/subfile ] && echo "${PRE} d_y not installed" && exit 1
[ ! -e "${tmpd}"/t ] && echo "${PRE} f_t not installed" && exit 1
[ ! -e "${tmpd}"/z/t1 ] && echo "${PRE} d_z t1 not installed" && exit 1
[ ! -e "${tmpd}"/z/t2 ] && echo "${PRE} d_z t2 not installed" && exit 1
[ ! -e "${tmpd}"/z/file ] && echo "${PRE} d_z file not installed" && exit 1
[ ! -e "${tmpd}"/trans ] && echo "${PRE} f_trans file not installed" && exit 1
grep_or_fail 'modified' "${tmpd}"/x
grep_or_fail 'modified' "${tmpd}"/y/file
grep_or_fail 'profile: p1' "${tmpd}"/t
grep_or_fail 'profile t1: p1' "${tmpd}"/z/t1
grep_or_fail 'profile t2: p1' "${tmpd}"/z/t2
grep_or_fail 'modified' "${tmpd}"/z/file
grep_or_fail 'trans:p1' "${tmpd}"/trans
# uninstall
echo "[+] uninstall (1)"
( \
cd "${ddpath}" && ${bin} uninstall -c "${cfg}" -f -p p1 "${DT_ARG}" \
)
[ "$?" != "0" ] && exit 1
# tests
[ ! -d "${basedir}"/dotfiles ] && echo "${PRE} dotpath removed" && exit 1
[ -e "${tmpd}"/x ] && echo "${PRE} f_x not uninstalled" && exit 1
[ -d "${tmpd}"/y ] && echo "${PRE} d_y dir not uninstalled" && exit 1
[ -e "${tmpd}"/y/file ] && echo "${PRE} d_y file not uninstalled" && exit 1
[ -e "${tmpd}"/y/subdir/subfile ] && echo "${PRE} d_y subfile not uninstalled" && exit 1
[ -e "${tmpd}"/t ] && echo "${PRE} f_t not uninstalled" && exit 1
[ -e "${tmpd}"/z/t1 ] && echo "${PRE} d_z subfile t1 not uninstalled" && exit 1
[ -e "${tmpd}"/z/t2 ] && echo "${PRE} d_z subfile t2 not uninstalled" && exit 1
[ -e "${tmpd}"/z/file ] && echo "${PRE} d_z subfile file not uninstalled" && exit 1
[ -e "${tmpd}"/trans ] && echo "${PRE} f_trans file not uninstalled" && exit 1
# test workdir is empty
if [ -n "$(ls -A "${tmpw}")" ]; then
echo "${PRE} workdir (1) is not empty"
echo "---"
ls -A "${tmpw}"
echo "---"
exit 1
fi
#########################
## with original
#########################
# clean
clean_hierarchy "${tmpd}"
clean_hierarchy "${basedir}"/dotfiles
# recreate
create_hierarchy "${basedir}"/dotfiles "modified"
create_hierarchy "${tmpd}" "original"
# install
echo "[+] install (2)"
cd "${ddpath}" | ${bin} install -c "${cfg}" -f -p p1 | grep '^4 dotfile(s) installed.$'
# tests
[ ! -e "${tmpd}"/x ] && echo "${PRE} f_x not installed" && exit 1
[ ! -e "${tmpd}"/x.dotdropbak ] && echo "${PRE} f_x backup not created" && exit 1
[ ! -d "${tmpd}"/y ] && echo "${PRE} d_y not installed" && exit 1
[ ! -e "${tmpd}"/y/file ] && echo "${PRE} d_y file not installed" && exit 1
[ "${LINK_TYPE}" = "nolink" ] && [ ! -e "${tmpd}"/y/file.dotdropbak ] && echo "${PRE} d_y backup file not created" && exit 1
[ ! -e "${tmpd}"/y/subdir/subfile ] && echo "${PRE} d_y subfile not installed" && exit 1
[ "${LINK_TYPE}" = "nolink" ] && [ ! -e "${tmpd}"/y/subdir/subfile.dotdropbak ] && echo "${PRE} d_y subfile backup not created" && exit 1
[ ! -e "${tmpd}"/t ] && echo "${PRE} f_t not installed" && exit 1
[ ! -e "${tmpd}"/t.dotdropbak ] && echo "${PRE} f_t backup not created" && exit 1
[ ! -e "${tmpd}"/z/t1 ] && echo "${PRE} d_z t1 not installed" && exit 1
[ "${LINK_TYPE}" = "nolink" ] && [ ! -e "${tmpd}"/z/t1.dotdropbak ] && echo "${PRE} d_z t1 backup not created" && exit 1
[ ! -e "${tmpd}"/z/t2 ] && echo "${PRE} d_z t2 not installed" && exit 1
[ "${LINK_TYPE}" = "nolink" ] && [ ! -e "${tmpd}"/z/t2.dotdropbak ] && echo "${PRE} d_z t2 backup not created" && exit 1
[ ! -e "${tmpd}"/z/file ] && echo "${PRE} d_z file not installed" && exit 1
[ "${LINK_TYPE}" = "nolink" ] && [ ! -e "${tmpd}"/z/file.dotdropbak ] && echo "${PRE} d_z backup file not created" && exit 1
[ ! -e "${tmpd}"/trans ] && echo "${PRE} f_trans file not installed" && exit 1
[ ! -e "${tmpd}"/trans.dotdropbak ] && echo "${PRE} f_trans backup file not created" && exit 1
grep_or_fail 'modified' "${tmpd}"/x
grep_or_fail 'modified' "${tmpd}"/y/file
grep_or_fail 'profile: p1' "${tmpd}"/t
grep_or_fail 'profile t1: p1' "${tmpd}"/z/t1
grep_or_fail 'profile t2: p1' "${tmpd}"/z/t2
grep_or_fail 'modified' "${tmpd}"/z/file
grep_or_fail 'trans:p1' "${tmpd}"/trans
# uninstall
echo "[+] uninstall (2)"
( \
cd "${ddpath}" && ${bin} uninstall -c "${cfg}" -f -p p1 "${DT_ARG}" \
)
[ "$?" != "0" ] && exit 1
# tests
[ ! -d "${basedir}"/dotfiles ] && echo "${PRE} dotpath removed" && exit 1
[ ! -e "${tmpd}"/x ] && echo "${PRE} f_x backup not restored" && exit 1
[ -e "${tmpd}"/x.dotdropbak ] && echo "${PRE} f_x backup not removed" && exit 1
[ "${LINK_TYPE}" = "nolink" ] && [ ! -d "${tmpd}"/y ] && echo "${PRE} d_y backup not restored" && exit 1
[ "${LINK_TYPE}" = "nolink" ] && [ ! -e "${tmpd}"/y/file ] && echo "${PRE} d_y file backup not restored" && exit 1
[ -e "${tmpd}"/y/file.dotdropbak ] && echo "${PRE} d_y backup not removed" && exit 1
[ "${LINK_TYPE}" = "nolink" ] && [ ! -e "${tmpd}"/y/subdir/subfile ] && echo "${PRE} d_y sub backup not restored" && exit 1
[ -e "${tmpd}"/y/subdir/subfile.dotdropbak ] && echo "${PRE} d_y sub backup not removed" && exit 1
[ ! -e "${tmpd}"/t ] && echo "${PRE} f_t not restored" && exit 1
[ -e "${tmpd}"/t.dotdropbak ] && echo "${PRE} f_t backup not removed" && exit 1
[ "${LINK_TYPE}" = "nolink" ] && [ ! -e "${tmpd}"/z/t1 ] && echo "${PRE} d_z t1 not restore" && exit 1
[ -e "${tmpd}"/z/t1.dotdropbak ] && echo "${PRE} d_z t1 backup not removed" && exit 1
[ "${LINK_TYPE}" = "nolink" ] && [ ! -e "${tmpd}"/z/t2 ] && echo "${PRE} d_z t2 not restored" && exit 1
[ -e "${tmpd}"/z/t2.dotdropbak ] && echo "${PRE} d_z t2 backup not removed" && exit 1
[ "${LINK_TYPE}" = "nolink" ] && [ ! -e "${tmpd}"/z/file ] && echo "${PRE} d_z file not restored" && exit 1
[ -e "${tmpd}"/z/file.dotdropbak ] && echo "${PRE} d_z file backup not removed" && exit 1
[ ! -e "${tmpd}"/trans ] && echo "${PRE} f_trans backup not restored" && exit 1
[ -e "${tmpd}"/trans.dotdropbak ] && echo "${PRE} f_trans backup not removed" && exit 1
grep_or_fail 'original' "${tmpd}"/x
[ "${LINK_TYPE}" = "nolink" ] && grep_or_fail 'original' "${tmpd}"/y/file
grep_or_fail "profile: ${PRO_TEMPL}" "${tmpd}/t"
[ "${LINK_TYPE}" = "nolink" ] && grep_or_fail "profile t1: ${PRO_TEMPL}" "${tmpd}/z/t1"
[ "${LINK_TYPE}" = "nolink" ] && grep_or_fail "profile t2: ${PRO_TEMPL}" "${tmpd}/z/t2"
[ "${LINK_TYPE}" = "nolink" ] && grep_or_fail 'original' "${tmpd}"/z/file
grep_or_fail "trans:${PRO_TEMPL}" "${tmpd}"/trans
echo "testing workdir..."
# test workdir is empty
if [ -n "$(ls -A "${tmpw}")" ]; then
echo "${PRE} workdir (2) - ${tmpw} - is not empty"
ls -A "${tmpw}"
exit 1
fi
echo "${PRE} done OK"
}
export DOTDROP_TEST_NG_UNINSTALL_DDPATH="${ddpath}"
export DOTDROP_TEST_NG_UNINSTALL_BIN="${bin}"
export DOTDROP_TEST_NG_CUR="${cur}"
export DOTDROP_TEST_NG_UNINSTALL_LINK_TYPE="nolink"
# shellcheck source=uninstall_
echo "[+] testing uninstall link:${DOTDROP_TEST_NG_UNINSTALL_LINK_TYPE}..."
if ! uninstall_with_link; then exit 1; fi
echo "[+] uninstall link:${DOTDROP_TEST_NG_UNINSTALL_LINK_TYPE} OK"
export DOTDROP_TEST_NG_UNINSTALL_LINK_TYPE="absolute"
# shellcheck source=uninstall_
echo "[+] testing uninstall link:${DOTDROP_TEST_NG_UNINSTALL_LINK_TYPE}..."
if ! uninstall_with_link; then exit 1; fi
echo "[+] uninstall link:${DOTDROP_TEST_NG_UNINSTALL_LINK_TYPE} OK"
export DOTDROP_TEST_NG_UNINSTALL_LINK_TYPE="relative"
# shellcheck source=uninstall_
echo "[+] testing uninstall link:${DOTDROP_TEST_NG_UNINSTALL_LINK_TYPE}..."
if ! uninstall_with_link; then exit 1; fi
echo "[+] uninstall link:${DOTDROP_TEST_NG_UNINSTALL_LINK_TYPE} OK"
export DOTDROP_TEST_NG_UNINSTALL_LINK_TYPE="link_children"
# shellcheck source=uninstall_
echo "[+] testing uninstall link:${DOTDROP_TEST_NG_UNINSTALL_LINK_TYPE}..."
if ! uninstall_with_link; then exit 1; fi
echo "[+] uninstall link:${DOTDROP_TEST_NG_UNINSTALL_LINK_TYPE} OK"
echo "OK"
exit 0

View File

@@ -149,6 +149,7 @@ def _fake_args():
    args['profiles'] = False
    args['files'] = False
    args['install'] = False
    args['uninstall'] = False
    args['compare'] = False
    args['import'] = False
    args['update'] = False
@@ -247,7 +248,7 @@ def create_yaml_keyval(pairs, parent_dir=None, top_key=None):
# pylint: disable=W0102
def populate_fake_config(config, dotfiles={}, profiles={}, actions={},
                         trans={}, trans_write={}, variables={},
                         trans_install={}, trans_update={}, variables={},
                         dynvariables={}):
    """Adds some juicy content to config files"""
    is_path = isinstance(config, str)
@@ -258,8 +259,8 @@ def populate_fake_config(config, dotfiles={}, profiles={}, actions={},
    config['dotfiles'] = dotfiles
    config['profiles'] = profiles
    config['actions'] = actions
    config['trans_read'] = trans
    config['trans_write'] = trans_write
    config['trans_install'] = trans_install
    config['trans_update'] = trans_update
    config['variables'] = variables
    config['dynvariables'] = dynvariables

View File

@@ -239,10 +239,10 @@ class TestImport(unittest.TestCase):
},
'a_log_ed': 'echo 2',
},
'trans': {
'trans_install': {
't_log_ed': 'echo 3',
},
'trans_write': {
'trans_update': {
'tw_log_ed': 'echo 4',
},
'variables': {
@@ -273,10 +273,10 @@ class TestImport(unittest.TestCase):
},
'a_log_ing': 'echo a',
},
'trans': {
'trans_install': {
't_log_ing': 'echo b',
},
'trans_write': {
'trans_update': {
'tw_log_ing': 'echo c',
},
'variables': {
@@ -352,10 +352,10 @@ class TestImport(unittest.TestCase):
self.assertFalse(any(a.endswith('ing') for a in actions))
# testing transformations
transformations = ycont['trans_read'].keys()
transformations = ycont['trans_install'].keys()
self.assertTrue(all(t.endswith('ed') for t in transformations))
self.assertFalse(any(t.endswith('ing') for t in transformations))
transformations = ycont['trans_write'].keys()
transformations = ycont['trans_update'].keys()
self.assertTrue(all(t.endswith('ed') for t in transformations))
self.assertFalse(any(t.endswith('ing') for t in transformations))
@@ -394,10 +394,10 @@ class TestImport(unittest.TestCase):
self.assertFalse(any(action.endswith('ed') for action in actions))
# testing transformations
transformations = ycont['trans_read'].keys()
transformations = ycont['trans_install'].keys()
self.assertTrue(all(t.endswith('ing') for t in transformations))
self.assertFalse(any(t.endswith('ed') for t in transformations))
transformations = ycont['trans_write'].keys()
transformations = ycont['trans_update'].keys()
self.assertTrue(all(t.endswith('ing') for t in transformations))
self.assertFalse(any(t.endswith('ed') for t in transformations))

View File

@@ -28,7 +28,7 @@ def fake_config(path, dotfiles, profile,
file.write('actions:\n')
for action in actions:
file.write(f' {action.key}: {action.action}\n')
file.write('trans:\n')
file.write('trans_install:\n')
for trans in transs:
file.write(f' {trans.key}: {trans.action}\n')
file.write('config:\n')
@@ -46,9 +46,9 @@ def fake_config(path, dotfiles, profile,
file.write(' actions:\n')
for action in dotfile.actions:
file.write(f' - {action.key}\n')
if dotfile.trans_r:
for trans in dotfile.trans_r:
file.write(f' trans_read: {trans.key}\n')
if dotfile.trans_install:
for trans in dotfile.trans_install:
file.write(f' trans_install: {trans.key}\n')
file.write('profiles:\n')
file.write(f' {profile}:\n')
file.write(' dotfiles:\n')
@@ -174,7 +174,7 @@ exec bspwm
fcontent9, _ = create_random_file(tmp, content=trans1)
dst9 = os.path.join(dst, get_string(6))
dotfile9 = Dotfile(get_string(6), dst9, os.path.basename(fcontent9),
trans_r=[the_trans])
trans_install=[the_trans])
# to test template
f10, _ = create_random_file(tmp, content='{{@@ header() @@}}')

View File

@@ -127,7 +127,7 @@ class TestImporter(unittest.TestCase):
path, _ = create_random_file(tmpdir)
imp = Importer('profile', None, '', '', {})
self.assertEqual(imp._apply_trans_w(path, trans), None)
self.assertEqual(imp._apply_trans_update(path, trans), None)
class TestActions(unittest.TestCase):

View File

@@ -121,7 +121,7 @@ class TestUpdate(unittest.TestCase):
# retrieve the path of the sub in the dotpath
d1indotpath = os.path.join(opt.dotpath, dotfile.src)
d1indotpath = os.path.expanduser(d1indotpath)
dotfile.trans_w = trans
dotfile.trans_update = trans
# update template
opt.update_path = [d3t]

View File

@@ -298,10 +298,10 @@ profiles:
},
'a_log_ed': 'echo 2',
},
'trans': {
'trans_install': {
't_log_ed': 'echo 3',
},
'trans_write': {
'trans_update': {
'tw_log_ed': 'echo 4',
},
'variables': {
@@ -335,10 +335,10 @@ profiles:
},
'a_log_ing': 'echo a',
},
'trans': {
'trans_install': {
't_log_ing': 'echo b',
},
'trans_write': {
'trans_update': {
'tw_log_ing': 'echo c',
},
'variables': {
@@ -406,8 +406,8 @@ profiles:
self.assert_is_subset(post_ed, post_ing)
# test transactions
self.assert_is_subset(imported_cfg.trans_r, importing_cfg.trans_r)
self.assert_is_subset(imported_cfg.trans_w, importing_cfg.trans_w)
self.assert_is_subset(imported_cfg.trans_install, importing_cfg.trans_install)
self.assert_is_subset(imported_cfg.trans_update, importing_cfg.trans_update)
# test variables
imported_vars = {
@@ -504,10 +504,10 @@ profiles:
},
'a_log': 'echo 2',
},
'trans': {
'trans_install': {
't_log': 'echo 3',
},
'trans_write': {
'trans_update': {
'tw_log': 'echo 4',
},
'variables': {
@@ -542,10 +542,10 @@ profiles:
},
'a_log': 'echo a',
},
'trans': {
'trans_install': {
't_log': 'echo b',
},
'trans_write': {
'trans_update': {
'tw_log': 'echo c',
},
'variables': {
@@ -605,12 +605,12 @@ profiles:
# test transactions
self.assertFalse(any(
imported_cfg.trans_r[key] == importing_cfg.trans_r[key]
for key in imported_cfg.trans_r
imported_cfg.trans_install[key] == importing_cfg.trans_install[key]
for key in imported_cfg.trans_install
))
self.assertFalse(any(
imported_cfg.trans_w[key] == importing_cfg.trans_w[key]
for key in imported_cfg.trans_w
imported_cfg.trans_update[key] == importing_cfg.trans_update[key]
for key in imported_cfg.trans_update
))
# test variables