
Merge branch 'master' into clear-on-install

Authored by deadc0de on 2023-10-22 14:46:02 +02:00, committed by GitHub
34 changed files with 1311 additions and 328 deletions


@@ -4,7 +4,7 @@ The **config** entry (mandatory) contains global settings.
Entry | Description | Default
-------- | ------------- | ------------
-`backup` | Create a backup of the dotfile in case it differs from the one that will be installed by dotdrop | true
+`backup` | Create a backup of the existing destination; see [backup entry](config-config.md#backup-entry) | true
`banner` | Display the banner | true
`check_version` | Check if a new version of dotdrop is available on github | false
`chmod_on_import` | Always add a chmod entry on newly imported dotfiles (see `--preserve-mode`) | false
@@ -212,4 +212,16 @@ profiles:
  hostname:
    dotfiles:
    - f_vimrc
```
## backup entry
When set to `true`, existing files that would be replaced
by a dotdrop `install` are backed up with the
extension `.dotdropbak` if their content differs.
Note:
* directories will **not** be backed up, only files
* when using a `link` value other than `nolink` with directories,
  the files under the directory will **not** be backed up
  (see [Symlinking dotfiles](config-file.md#symlinking-dotfiles))
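
For reference, a minimal sketch of how this entry sits in the config block (the surrounding entries are illustrative, not prescribed by this change):
```yaml
config:
  backup: true
  banner: true
  dotpath: dotfiles
```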


@@ -14,11 +14,13 @@ Entry | Description
`ignoreempty` | If true, an empty template will not be deployed (defaults to the value of `ignoreempty`)
`instignore` | List of patterns to ignore when installing (enclose in quotes when using wildcards; see [ignore patterns](config-file.md#ignore-patterns))
`template` | If false, disable templating for this dotfile (defaults to the value of `template_dotfile_default`)
-`trans_read` | Transformation key to apply when installing this dotfile (must be defined in the **trans_read** entry below; see [transformations](config-transformations.md))
+`trans_install` | Transformation key to apply when installing this dotfile (must be defined in the **trans_install** entry below; see [transformations](config-transformations.md))
-`trans_write` | Transformation key to apply when updating this dotfile (must be defined in the **trans_write** entry below; see [transformations](config-transformations.md))
+`trans_update` | Transformation key to apply when updating this dotfile (must be defined in the **trans_update** entry below; see [transformations](config-transformations.md))
`upignore` | List of patterns to ignore when updating (enclose in quotes when using wildcards; see [ignore patterns](config-file.md#ignore-patterns))
<s>link_children</s> | Replaced by `link: link_children`
-<s>trans</s> | Replaced by `trans_read`
+<s>trans</s> | Replaced by `trans_install`
+<s>trans_read</s> | Replaced by `trans_install`
+<s>trans_write</s> | Replaced by `trans_update`
```yaml
<dotfile-key-name>:
@@ -37,8 +39,8 @@ Entry | Description
    - <action-key>
  template: (true|false)
  chmod: '<file-permissions>'
-  trans_read: <transformation-key>
-  trans_write: <transformation-key>
+  trans_install: <transformation-key>
+  trans_update: <transformation-key>
```
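
A concrete entry using the renamed transformation keys could look as follows (the dotfile key, paths and transformation names are illustrative):
```yaml
dotfiles:
  f_vimrc:
    src: vimrc
    dst: ~/.vimrc
    chmod: '644'
    trans_install: _decrypt
    trans_update: _encrypt
```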
## Dotfile actions


@@ -91,17 +91,17 @@ dotfiles:
    dst: ~/dir
    chmod: 744
  f_preserve:
-    src: preserve
-    dst: ~/preserve
+    src: pfile
+    dst: ~/pfile
    chmod: preserve
```
-The `chmod` value defines the file permissions in octal notation to apply on dotfiles. If undefined
+The `chmod` value defines the file permissions in octal notation to apply to the dotfile. If undefined
new files will get the system default permissions (see `umask`, `777-<umask>` for directories and
`666-<umask>` for files).
The special keyword `preserve` ensures that if the dotfile already exists
-on the filesystem, it is not altered during `install` and the `chmod` value won't
+on the filesystem, its permissions are not altered during `install` and the `chmod` config value won't
be changed during `update`.
On `import`, the following rules are applied:


@@ -14,14 +14,14 @@ For examples of transformation uses, see:
There are two types of transformations available:
-* **Read transformations**: used to transform dotfiles before they are installed ([config](config-config.md) key `trans_read`)
+* **Install transformations**: used to transform dotfiles before they are installed ([config](config-config.md) key `trans_install`)
    * Used for commands `install` and `compare`
    * They have two mandatory arguments:
        * **{0}** will be replaced with the dotfile to process
        * **{1}** will be replaced with a temporary file to store the result of the transformation
    * This happens **before** the dotfile is templated (see [templating](../template/templating.md))
-* **Write transformations**: used to transform files before updating a dotfile ([config](config-config.md) key `trans_write`)
+* **Update/Import transformations**: used to transform files before updating/importing a dotfile ([config](config-config.md) key `trans_update`)
    * Used for commands `update` and `import`
    * They have two mandatory arguments:
        * **{0}** will be replaced with the file path to update the dotfile with
@@ -36,13 +36,13 @@ Transformations also support additional positional arguments that must start fro
For example:
```yaml
-trans_read:
+trans_install:
  targ: echo "$(basename {0}); {{@@ _dotfile_key @@}}; {2}; {3}" > {1}
dotfiles:
  f_abc:
    dst: /tmp/abc
    src: abc
-    trans_read: targ "{{@@ profile @@}}" lastarg
+    trans_install: targ "{{@@ profile @@}}" lastarg
profiles:
  p1:
    dotfiles:
@@ -51,21 +51,21 @@ profiles:
will result in `abc; f_abc; p1; lastarg`.
-## trans_read entry
+## trans_install entry
-The **trans_read** entry (optional) contains a transformations mapping (See [transformations](config-transformations.md)).
+The **trans_install** entry (optional) contains a transformations mapping (See [transformations](config-transformations.md)).
```yaml
-trans_read:
+trans_install:
  <trans-key>: <command-to-execute>
```
-## trans_write entry
+## trans_update entry
-The **trans_write** entry (optional) contains a write transformations mapping (See [transformations](config-transformations.md)).
+The **trans_update** entry (optional) contains a write transformations mapping (See [transformations](config-transformations.md)).
```yaml
-trans_write:
+trans_update:
  <trans-key>: <command-to-execute>
```
@@ -77,10 +77,10 @@ and [template variables](../template/template-variables.md#template-variables)).
A very dumb example:
```yaml
-trans_read:
+trans_install:
  r_echo_abs_src: echo "{0}: {{@@ _dotfile_abs_src @@}}" > {1}
  r_echo_var: echo "{0}: {{@@ r_var @@}}" > {1}
-trans_write:
+trans_update:
  w_echo_key: echo "{0}: {{@@ _dotfile_key @@}}" > {1}
  w_echo_var: echo "{0}: {{@@ w_var @@}}" > {1}
variables:
@@ -90,11 +90,11 @@ dotfiles:
  f_abc:
    dst: ${tmpd}/abc
    src: abc
-    trans_read: r_echo_abs_src
-    trans_write: w_echo_key
+    trans_install: r_echo_abs_src
+    trans_update: w_echo_key
  f_def:
    dst: ${tmpd}/def
    src: def
-    trans_read: r_echo_var
-    trans_write: w_echo_var
+    trans_install: r_echo_var
+    trans_update: w_echo_var
```


@@ -37,9 +37,9 @@ First you need to define the encryption/decryption methods, for example
```yaml
variables:
  keyid: "11223344"
-trans_read:
+trans_install:
  _decrypt: "gpg -q --for-your-eyes-only --no-tty -d {0} > {1}"
-trans_write:
+trans_update:
  _encrypt: "gpg -q -r {{@@ keyid @@}} --armor --no-tty -o {1} -e {0}"
```
@@ -60,17 +60,17 @@ Using GPG keys:
```yaml
variables:
  keyid: "11223344"
-trans_read:
+trans_install:
  _decrypt: "gpg -q --for-your-eyes-only --no-tty -d {0} > {1}"
-trans_write:
+trans_update:
  _encrypt: "gpg -q -r {{@@ keyid @@}} --armor --no-tty -o {1} -e {0}"
```
Passphrase is stored in an environment variable:
```yaml
-trans_read:
+trans_install:
  _decrypt: "echo {{@@ env['THE_KEY'] @@}} | gpg -q --batch --yes --for-your-eyes-only --passphrase-fd 0 --no-tty -d {0} > {1}"
-trans_write:
+trans_update:
  _encrypt: "echo {{@@ env['THE_KEY'] @@}} | gpg -q --batch --yes --passphrase-fd 0 --no-tty -o {1} -c {0}"
```
@@ -78,9 +78,9 @@ Passphrase is stored as a variable:
```yaml
variables:
  gpg_password: "some password"
-trans_read:
+trans_install:
  _decrypt: "echo {{@@ gpg_password @@}} | gpg -q --batch --yes --for-your-eyes-only --passphrase-fd 0 --no-tty -d {0} > {1}"
-trans_write:
+trans_update:
  _encrypt: "echo {{@@ gpg_password @@}} | gpg -q --batch --yes --passphrase-fd 0 --no-tty -o {1} -c {0}"
```
@@ -88,9 +88,9 @@ Passphrase is retrieved using a script:
```yaml
dynvariables:
  gpg_password: "./get-password.sh"
-trans_read:
+trans_install:
  _decrypt: "echo {{@@ gpg_password @@}} | gpg -q --batch --yes --for-your-eyes-only --passphrase-fd 0 --no-tty -d {0} > {1}"
-trans_write:
+trans_update:
  _encrypt: "echo {{@@ gpg_password @@}} | gpg -q --batch --yes --passphrase-fd 0 --no-tty -o {1} -c {0}"
```
@@ -100,9 +100,9 @@ variables:
  gpg_password_file: "/tmp/the-password"
dynvariables:
  gpg_password: "cat {{@@ gpg_password_file @@}}"
-trans_read:
+trans_install:
  _decrypt: "echo {{@@ gpg_password @@}} | gpg -q --batch --yes --for-your-eyes-only --passphrase-fd 0 --no-tty -d {0} > {1}"
-trans_write:
+trans_update:
  _encrypt: "echo {{@@ gpg_password @@}} | gpg -q --batch --yes --passphrase-fd 0 --no-tty -o {1} -c {0}"
```


@@ -1,13 +1,13 @@
# Handle compressed directories
-This is an example of how to use transformations (`trans_read` and `trans_write`) to store
+This is an example of how to use transformations (`trans_install` and `trans_update`) to store
compressed directories and deploy them with dotdrop.
Start by defining the transformations:
```yaml
-trans_read:
+trans_install:
  uncompress: "mkdir -p {1} && tar -xf {0} -C {1}"
-trans_write:
+trans_update:
  compress: "tar -cf {1} -C {0} ."
```
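
A dotfile entry can then reference these transformation keys; a minimal sketch (the key and paths are illustrative):
```yaml
dotfiles:
  f_somedir:
    src: somedir.tar
    dst: ~/.somedir
    trans_install: uncompress
    trans_update: compress
```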

docs/usage.md

@@ -235,6 +235,21 @@ dotdrop. It will:
For more options, see the usage with `dotdrop --help`.
## Uninstall dotfiles

The `uninstall` command removes dotfiles installed by dotdrop:
```bash
$ dotdrop uninstall
```
It removes the installed dotfiles matching the provided key(s)
(or all dotfiles of the selected profile if no key is provided).
If a backup exists (see the [backup entry](config/config-config.md#backup-entry)),
the backed-up file is restored in place of the installed dotfile.
For more options, see the usage with `dotdrop --help`.
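
For example, uninstalling a single dotfile for a specific profile (key and profile name are illustrative):
```bash
$ dotdrop uninstall -p laptop f_vimrc
```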
## Concurrency
The command line switch `-w`/`--workers`, if set to a value greater than one, enables the use


@@ -69,23 +69,23 @@ class CfgAggregator:
return self.cfgyaml.del_dotfile_from_profile(dotfile.key, profile.key) return self.cfgyaml.del_dotfile_from_profile(dotfile.key, profile.key)
def new_dotfile(self, src, dst, link, chmod=None, def new_dotfile(self, src, dst, link, chmod=None,
trans_read=None, trans_write=None): trans_install=None, trans_update=None):
""" """
import a new dotfile import a new dotfile
@src: path in dotpath @src: path in dotpath
@dst: path in FS @dst: path in FS
@link: LinkType @link: LinkType
@chmod: file permission @chmod: file permission
@trans_read: read transformation @trans_install: read transformation
@trans_write: write transformation @trans_update: write transformation
""" """
dst = self.path_to_dotfile_dst(dst) dst = self.path_to_dotfile_dst(dst)
dotfile = self.get_dotfile_by_src_dst(src, dst) dotfile = self.get_dotfile_by_src_dst(src, dst)
if not dotfile: if not dotfile:
# add the dotfile # add the dotfile
dotfile = self._create_new_dotfile(src, dst, link, chmod=chmod, dotfile = self._create_new_dotfile(src, dst, link, chmod=chmod,
trans_read=trans_read, trans_install=trans_install,
trans_write=trans_write) trans_update=trans_update)
if not dotfile: if not dotfile:
return False return False
@@ -237,25 +237,25 @@ class CfgAggregator:
######################################################## ########################################################
def _create_new_dotfile(self, src, dst, link, chmod=None, def _create_new_dotfile(self, src, dst, link, chmod=None,
trans_read=None, trans_write=None): trans_install=None, trans_update=None):
"""create a new dotfile""" """create a new dotfile"""
# get a new dotfile with a unique key # get a new dotfile with a unique key
key = self._get_new_dotfile_key(dst) key = self._get_new_dotfile_key(dst)
self.log.dbg(f'new dotfile key: {key}') self.log.dbg(f'new dotfile key: {key}')
# add the dotfile # add the dotfile
trans_r_key = trans_w_key = None trans_install_key = trans_update_key = None
if trans_read: if trans_install:
trans_r_key = trans_read.key trans_install_key = trans_install.key
if trans_write: if trans_update:
trans_w_key = trans_write.key trans_update_key = trans_update.key
if not self.cfgyaml.add_dotfile(key, src, dst, link, if not self.cfgyaml.add_dotfile(key, src, dst, link,
chmod=chmod, chmod=chmod,
trans_r_key=trans_r_key, trans_install_key=trans_install_key,
trans_w_key=trans_w_key): trans_update_key=trans_update_key):
return None return None
return Dotfile(key, dst, src, return Dotfile(key, dst, src,
trans_r=trans_read, trans_install=trans_install,
trans_w=trans_write) trans_update=trans_update)
######################################################## ########################################################
# parsing # parsing
@@ -297,15 +297,15 @@ class CfgAggregator:
self.actions = Action.parse_dict(self.cfgyaml.actions) self.actions = Action.parse_dict(self.cfgyaml.actions)
debug_list('actions', self.actions, self.debug) debug_list('actions', self.actions, self.debug)
# trans_r # trans_install
self.log.dbg('parsing trans_r') self.log.dbg('parsing trans_install')
self.trans_r = Transform.parse_dict(self.cfgyaml.trans_r) self.trans_install = Transform.parse_dict(self.cfgyaml.trans_install)
debug_list('trans_r', self.trans_r, self.debug) debug_list('trans_install', self.trans_install, self.debug)
# trans_w # trans_update
self.log.dbg('parsing trans_w') self.log.dbg('parsing trans_update')
self.trans_w = Transform.parse_dict(self.cfgyaml.trans_w) self.trans_update = Transform.parse_dict(self.cfgyaml.trans_update)
debug_list('trans_w', self.trans_w, self.debug) debug_list('trans_update', self.trans_update, self.debug)
# variables # variables
self.log.dbg('parsing variables') self.log.dbg('parsing variables')
@@ -334,14 +334,17 @@ class CfgAggregator:
msg = f'default actions: {self.settings.default_actions}' msg = f'default actions: {self.settings.default_actions}'
self.log.dbg(msg) self.log.dbg(msg)
# patch trans_w/trans_r in dotfiles # patch trans_install in dotfiles
trans_inst_args = self._get_trans_update_args(self.get_trans_install)
self._patch_keys_to_objs(self.dotfiles, self._patch_keys_to_objs(self.dotfiles,
"trans_r", CfgYaml.key_trans_install,
self._get_trans_w_args(self.get_trans_r), trans_inst_args,
islist=False) islist=False)
# patch trans_update in dotfiles
trans_update_args = self._get_trans_update_args(self.get_trans_update)
self._patch_keys_to_objs(self.dotfiles, self._patch_keys_to_objs(self.dotfiles,
"trans_w", CfgYaml.key_trans_update,
self._get_trans_w_args(self.get_trans_w), trans_update_args,
islist=False) islist=False)
self.log.dbg('done parsing cfgyaml into cfg_aggregator') self.log.dbg('done parsing cfgyaml into cfg_aggregator')
@@ -542,7 +545,7 @@ class CfgAggregator:
action = self._get_action(key) action = self._get_action(key)
return action return action
def _get_trans_w_args(self, getter): def _get_trans_update_args(self, getter):
"""return transformation by key with the arguments""" """return transformation by key with the arguments"""
def getit(key): def getit(key):
fields = shlex.split(key) fields = shlex.split(key)
@@ -557,16 +560,16 @@ class CfgAggregator:
return trans return trans
return getit return getit
def get_trans_r(self, key): def get_trans_install(self, key):
"""return the trans_r with this key""" """return the trans_install with this key"""
try: try:
return next(x for x in self.trans_r if x.key == key) return next(x for x in self.trans_install if x.key == key)
except StopIteration: except StopIteration:
return None return None
def get_trans_w(self, key): def get_trans_update(self, key):
"""return the trans_w with this key""" """return the trans_update with this key"""
try: try:
return next(x for x in self.trans_w if x.key == key) return next(x for x in self.trans_update if x.key == key)
except StopIteration: except StopIteration:
return None return None


@@ -11,8 +11,8 @@ the upper layer:
* self.dotfiles * self.dotfiles
* self.profiles * self.profiles
* self.actions * self.actions
* self.trans_r * self.trans_install
* self.trans_w * self.trans_update
* self.variables * self.variables
Additionally a few methods are exported. Additionally a few methods are exported.
@@ -50,9 +50,11 @@ class CfgYaml:
key_dotfiles = 'dotfiles' key_dotfiles = 'dotfiles'
key_profiles = 'profiles' key_profiles = 'profiles'
key_actions = 'actions' key_actions = 'actions'
old_key_trans_r = 'trans' old_key_trans = 'trans'
key_trans_r = 'trans_read' old_key_trans_r = 'trans_read'
key_trans_w = 'trans_write' old_key_trans_w = 'trans_write'
key_trans_install = 'trans_install'
key_trans_update = 'trans_update'
key_variables = 'variables' key_variables = 'variables'
key_dvariables = 'dynvariables' key_dvariables = 'dynvariables'
key_uvariables = 'uservariables' key_uvariables = 'uservariables'
@@ -146,8 +148,8 @@ class CfgYaml:
self.dotfiles = {} self.dotfiles = {}
self.profiles = {} self.profiles = {}
self.actions = {} self.actions = {}
self.trans_r = {} self.trans_install = {}
self.trans_w = {} self.trans_update = {}
self.variables = {} self.variables = {}
if not os.path.exists(self._path): if not os.path.exists(self._path):
@@ -248,10 +250,10 @@ class CfgYaml:
self.dotfiles = self._parse_blk_dotfiles(self._yaml_dict) self.dotfiles = self._parse_blk_dotfiles(self._yaml_dict)
# parse the "actions" block # parse the "actions" block
self.actions = self._parse_blk_actions(self._yaml_dict) self.actions = self._parse_blk_actions(self._yaml_dict)
# parse the "trans_r" block # parse the "trans_install" block
self.trans_r = self._parse_blk_trans_r(self._yaml_dict) self.trans_install = self._parse_blk_trans_install(self._yaml_dict)
# parse the "trans_w" block # parse the "trans_update" block
self.trans_w = self._parse_blk_trans_w(self._yaml_dict) self.trans_update = self._parse_blk_trans_update(self._yaml_dict)
################################################## ##################################################
# import elements # import elements
@@ -427,7 +429,7 @@ class CfgYaml:
return True return True
def add_dotfile(self, key, src, dst, link, chmod=None, def add_dotfile(self, key, src, dst, link, chmod=None,
trans_r_key=None, trans_w_key=None): trans_install_key=None, trans_update_key=None):
"""add a new dotfile""" """add a new dotfile"""
if key in self.dotfiles.keys(): if key in self.dotfiles.keys():
return False return False
@@ -438,8 +440,8 @@ class CfgYaml:
self._dbg(f'new dotfile link: {link}') self._dbg(f'new dotfile link: {link}')
if chmod: if chmod:
self._dbg(f'new dotfile chmod: {chmod:o}') self._dbg(f'new dotfile chmod: {chmod:o}')
self._dbg(f'new dotfile trans_r: {trans_r_key}') self._dbg(f'new dotfile trans_install: {trans_install_key}')
self._dbg(f'new dotfile trans_w: {trans_w_key}') self._dbg(f'new dotfile trans_update: {trans_update_key}')
# create the dotfile dict # create the dotfile dict
df_dict = { df_dict = {
@@ -456,11 +458,11 @@ class CfgYaml:
if chmod: if chmod:
df_dict[self.key_dotfile_chmod] = str(format(chmod, 'o')) df_dict[self.key_dotfile_chmod] = str(format(chmod, 'o'))
# trans_r/trans_w # trans_install/trans_update
if trans_r_key: if trans_install_key:
df_dict[self.key_trans_r] = str(trans_r_key) df_dict[self.key_trans_install] = str(trans_install_key)
if trans_w_key: if trans_update_key:
df_dict[self.key_trans_w] = str(trans_w_key) df_dict[self.key_trans_update] = str(trans_update_key)
if self._debug: if self._debug:
self._dbg(f'dotfile dict: {df_dict}') self._dbg(f'dotfile dict: {df_dict}')
@@ -618,30 +620,25 @@ class CfgYaml:
self._debug_dict('actions block', actions) self._debug_dict('actions block', actions)
return actions return actions
def _parse_blk_trans_r(self, dic): def _parse_blk_trans_install(self, dic):
"""parse the "trans_r" block""" """parse the "trans_install" block"""
key = self.key_trans_r trans_install = self._get_entry(dic, self.key_trans_install,
if self.old_key_trans_r in dic: mandatory=False)
msg = '\"trans\" is deprecated, please use \"trans_read\"' if trans_install:
self._log.warn(msg) trans_install = trans_install.copy()
dic[self.key_trans_r] = dic[self.old_key_trans_r]
del dic[self.old_key_trans_r]
trans_r = self._get_entry(dic, key, mandatory=False)
if trans_r:
trans_r = trans_r.copy()
if self._debug: if self._debug:
self._debug_dict('trans_r block', trans_r) self._debug_dict('trans_install block', trans_install)
return trans_r return trans_install
def _parse_blk_trans_w(self, dic): def _parse_blk_trans_update(self, dic):
"""parse the "trans_w" block""" """parse the "trans_update" block"""
trans_w = self._get_entry(dic, self.key_trans_w, trans_update = self._get_entry(dic, self.key_trans_update,
mandatory=False) mandatory=False)
if trans_w: if trans_update:
trans_w = trans_w.copy() trans_update = trans_update.copy()
if self._debug: if self._debug:
self._debug_dict('trans_w block', trans_w) self._debug_dict('trans_update block', trans_update)
return trans_w return trans_update
def _parse_blk_variables(self, dic): def _parse_blk_variables(self, dic):
"""parse the "variables" block""" """parse the "variables" block"""
@@ -817,6 +814,7 @@ class CfgYaml:
if not dotfiles: if not dotfiles:
return dotfiles return dotfiles
new = {} new = {}
for k, val in dotfiles.items(): for k, val in dotfiles.items():
if self.key_dotfile_src not in val: if self.key_dotfile_src not in val:
# add 'src' as key' if not present # add 'src' as key' if not present
@@ -825,14 +823,6 @@ class CfgYaml:
else: else:
new[k] = val new[k] = val
if self.old_key_trans_r in val:
# fix deprecated trans key
msg = f'{k} \"trans\" is deprecated, please use \"trans_read\"'
self._log.warn(msg)
val[self.key_trans_r] = val[self.old_key_trans_r]
del val[self.old_key_trans_r]
new[k] = val
if self.key_dotfile_link not in val: if self.key_dotfile_link not in val:
# apply link value if undefined # apply link value if undefined
value = self.settings[self.key_settings_link_dotfile_default] value = self.settings[self.key_settings_link_dotfile_default]
@@ -1108,8 +1098,10 @@ class CfgYaml:
self.profiles = self._merge_dict(self.profiles, sub.profiles, self.profiles = self._merge_dict(self.profiles, sub.profiles,
deep=True) deep=True)
self.actions = self._merge_dict(self.actions, sub.actions) self.actions = self._merge_dict(self.actions, sub.actions)
self.trans_r = self._merge_dict(self.trans_r, sub.trans_r) self.trans_install = self._merge_dict(self.trans_install,
self.trans_w = self._merge_dict(self.trans_w, sub.trans_w) sub.trans_install)
self.trans_update = self._merge_dict(self.trans_update,
sub.trans_update)
self._clear_profile_vars(sub.variables) self._clear_profile_vars(sub.variables)
self.imported_configs.append(path) self.imported_configs.append(path)
@@ -1189,6 +1181,54 @@ class CfgYaml:
return return
self._fix_deprecated_link_by_default(yamldict) self._fix_deprecated_link_by_default(yamldict)
self._fix_deprecated_dotfile_link(yamldict) self._fix_deprecated_dotfile_link(yamldict)
self._fix_deprecated_trans(yamldict)
    def _fix_deprecated_trans_in_dict(self, yamldic):
        # trans -> trans_install
        old_key = self.old_key_trans
        new_key = self.key_trans_install
        if old_key in yamldic:
            yamldic[new_key] = yamldic[old_key]
            del yamldic[old_key]
            msg = f'deprecated \"{old_key}\"'
            msg += f', updated to \"{new_key}\"'
            self._log.warn(msg)
            self._dirty = True
            self._dirty_deprecated = True
        # trans_read -> trans_install
        old_key = self.old_key_trans_r
        new_key = self.key_trans_install
        if old_key in yamldic:
            yamldic[new_key] = yamldic[old_key]
            del yamldic[old_key]
            warn = f'deprecated \"{old_key}\"'
            warn += f', updated to \"{new_key}\"'
            self._log.warn(warn)
            self._dirty = True
            self._dirty_deprecated = True
        # trans_write -> trans_update
        old_key = self.old_key_trans_w
        new_key = self.key_trans_update
        if old_key in yamldic:
            yamldic[new_key] = yamldic[old_key]
            del yamldic[old_key]
            warn = f'deprecated \"{old_key}\"'
            warn += f', updated to \"{new_key}\"'
            self._log.warn(warn)
            self._dirty = True
            self._dirty_deprecated = True

    def _fix_deprecated_trans(self, yamldict):
        """fix deprecated trans keys"""
        # top-level entries
        self._fix_deprecated_trans_in_dict(yamldict)
        # per-dotfile entries
        if self.key_dotfiles in yamldict and yamldict[self.key_dotfiles]:
            config = yamldict[self.key_dotfiles]
            for _, val in config.items():
                self._fix_deprecated_trans_in_dict(val)
def _fix_deprecated_link_by_default(self, yamldict): def _fix_deprecated_link_by_default(self, yamldict):
"""fix deprecated link_by_default""" """fix deprecated link_by_default"""
@@ -1786,8 +1826,8 @@ class CfgYaml:
self._debug_dict('entry dotfiles', self.dotfiles) self._debug_dict('entry dotfiles', self.dotfiles)
self._debug_dict('entry profiles', self.profiles) self._debug_dict('entry profiles', self.profiles)
self._debug_dict('entry actions', self.actions) self._debug_dict('entry actions', self.actions)
self._debug_dict('entry trans_r', self.trans_r) self._debug_dict('entry trans_install', self.trans_install)
self._debug_dict('entry trans_w', self.trans_w) self._debug_dict('entry trans_update', self.trans_update)
self._debug_dict('entry variables', self.variables) self._debug_dict('entry variables', self.variables)
def _debug_dict(self, title, elems): def _debug_dict(self, title, elems):


@@ -16,6 +16,7 @@ from dotdrop.options import Options
from dotdrop.logger import Logger from dotdrop.logger import Logger
from dotdrop.templategen import Templategen from dotdrop.templategen import Templategen
from dotdrop.installer import Installer from dotdrop.installer import Installer
from dotdrop.uninstaller import Uninstaller
from dotdrop.updater import Updater from dotdrop.updater import Updater
from dotdrop.comparator import Comparator from dotdrop.comparator import Comparator
from dotdrop.importer import Importer from dotdrop.importer import Importer
@@ -120,9 +121,10 @@ def _dotfile_compare(opts, dotfile, tmp):
# apply transformation # apply transformation
tmpsrc = None tmpsrc = None
if dotfile.trans_r: if dotfile.trans_install:
LOG.dbg('applying transformation before comparing') LOG.dbg('applying transformation before comparing')
tmpsrc = apply_trans(opts.dotpath, dotfile, templ, debug=opts.debug) tmpsrc = apply_install_trans(opts.dotpath, dotfile,
templ, debug=opts.debug)
if not tmpsrc: if not tmpsrc:
# could not apply trans # could not apply trans
return False return False
@@ -238,8 +240,9 @@ def _dotfile_install(opts, dotfile, tmpdir=None):
# nolink # nolink
src = dotfile.src src = dotfile.src
tmp = None tmp = None
if dotfile.trans_r: if dotfile.trans_install:
tmp = apply_trans(opts.dotpath, dotfile, templ, debug=opts.debug) tmp = apply_install_trans(opts.dotpath, dotfile,
templ, debug=opts.debug)
if not tmp: if not tmp:
return False, dotfile.key, None return False, dotfile.key, None
src = tmp src = tmp
@@ -538,8 +541,8 @@ def cmd_importer(opts):
import_as=opts.import_as, import_as=opts.import_as,
import_link=opts.import_link, import_link=opts.import_link,
import_mode=opts.import_mode, import_mode=opts.import_mode,
import_transw=opts.import_transw, trans_install=opts.import_trans_install,
import_transr=opts.import_transr) trans_update=opts.import_trans_update)
if tmpret < 0: if tmpret < 0:
ret = False ret = False
elif tmpret > 0: elif tmpret > 0:
@@ -618,6 +621,47 @@ def cmd_detail(opts):
LOG.log('') LOG.log('')
def cmd_uninstall(opts):
    """uninstall"""
    dotfiles = opts.dotfiles
    keys = opts.uninstall_key
    if keys:
        # uninstall only the specified keys for this profile
        dotfiles = []
        for key in uniq_list(keys):
            dotfile = opts.conf.get_dotfile(key)
            if dotfile:
                dotfiles.append(dotfile)

    if not dotfiles:
        msg = f'no dotfile to uninstall for this profile (\"{opts.profile}\")'
        LOG.warn(msg)
        return False

    if opts.debug:
        lfs = [k.key for k in dotfiles]
        LOG.dbg(f'dotfiles registered for uninstall: {lfs}')

    uninst = Uninstaller(base=opts.dotpath,
                         workdir=opts.workdir,
                         dry=opts.dry,
                         safe=opts.safe,
                         debug=opts.debug,
                         backup_suffix=opts.install_backup_suffix)
    uninstalled = 0
    for dotf in dotfiles:
        res, msg = uninst.uninstall(dotf.src,
                                    dotf.dst,
                                    dotf.link)
        if not res:
            LOG.err(msg)
            continue
        uninstalled += 1
    LOG.log(f'\n{uninstalled} dotfile(s) uninstalled.')
    return True
def cmd_remove(opts): def cmd_remove(opts):
"""remove dotfile from dotpath and from config""" """remove dotfile from dotpath and from config"""
paths = opts.remove_path paths = opts.remove_path
@@ -773,19 +817,20 @@ def _select(selections, dotfiles):
return selected return selected
def apply_trans(dotpath, dotfile, templater, debug=False): def apply_install_trans(dotpath, dotfile, templater, debug=False):
""" """
apply the read transformation to the dotfile apply the install transformation to the dotfile
return None if fails and new source if succeed return None if fails and new source if succeed
""" """
src = dotfile.src src = dotfile.src
new_src = f'{src}.{TRANS_SUFFIX}' new_src = f'{src}.{TRANS_SUFFIX}'
trans = dotfile.trans_r trans = dotfile.trans_install
LOG.dbg(f'executing transformation: {trans}') LOG.dbg(f'executing install transformation: {trans}')
srcpath = os.path.join(dotpath, src) srcpath = os.path.join(dotpath, src)
temp = os.path.join(dotpath, new_src) temp = os.path.join(dotpath, new_src)
if not trans.transform(srcpath, temp, templater=templater, debug=debug): if not trans.transform(srcpath, temp, templater=templater, debug=debug):
-msg = f'transformation \"{trans.key}\" failed for {dotfile.key}'
+msg = f'install transformation \"{trans.key}\"'
+msg += f' failed for {dotfile.key}'
LOG.err(msg) LOG.err(msg)
if new_src and os.path.exists(new_src): if new_src and os.path.exists(new_src):
removepath(new_src, LOG) removepath(new_src, LOG)
@@ -854,6 +899,12 @@ def _exec_command(opts):
LOG.dbg(f'running cmd: {command}') LOG.dbg(f'running cmd: {command}')
cmd_remove(opts) cmd_remove(opts)
elif opts.cmd_uninstall:
# uninstall dotfile
command = 'uninstall'
LOG.dbg(f'running cmd: {command}')
cmd_uninstall(opts)
except UndefinedException as exc: except UndefinedException as exc:
LOG.err(exc) LOG.err(exc)
ret = False ret = False


@@ -14,12 +14,12 @@ class Dotfile(DictParser):
"""Represent a dotfile.""" """Represent a dotfile."""
# dotfile keys # dotfile keys
key_noempty = 'ignoreempty' key_noempty = 'ignoreempty'
key_trans_r = 'trans_read' key_trans_install = 'trans_install'
key_trans_w = 'trans_write' key_trans_update = 'trans_update'
key_template = 'template' key_template = 'template'
def __init__(self, key, dst, src, def __init__(self, key, dst, src,
actions=None, trans_r=None, trans_w=None, actions=None, trans_install=None, trans_update=None,
link=LinkTypes.NOLINK, noempty=False, link=LinkTypes.NOLINK, noempty=False,
cmpignore=None, upignore=None, cmpignore=None, upignore=None,
instignore=None, template=True, chmod=None, instignore=None, template=True, chmod=None,
@@ -30,8 +30,8 @@ class Dotfile(DictParser):
@dst: dotfile dst (in user's home usually) @dst: dotfile dst (in user's home usually)
@src: dotfile src (in dotpath) @src: dotfile src (in dotpath)
@actions: dictionary of actions to execute for this dotfile @actions: dictionary of actions to execute for this dotfile
@trans_r: transformation to change dotfile before it is installed @trans_install: transformation to change dotfile before it is installed
@trans_w: transformation to change dotfile before updating it @trans_update: transformation to change dotfile before updating it
@link: link behavior @link: link behavior
@noempty: ignore empty template if True @noempty: ignore empty template if True
@upignore: patterns to ignore when updating @upignore: patterns to ignore when updating
@@ -46,8 +46,8 @@ class Dotfile(DictParser):
self.link = LinkTypes.get(link) self.link = LinkTypes.get(link)
self.noempty = noempty self.noempty = noempty
self.src = src self.src = src
self.trans_r = trans_r self.trans_install = trans_install
self.trans_w = trans_w self.trans_update = trans_update
self.upignore = upignore or [] self.upignore = upignore or []
self.cmpignore = cmpignore or [] self.cmpignore = cmpignore or []
self.instignore = instignore or [] self.instignore = instignore or []
@@ -57,14 +57,14 @@ class Dotfile(DictParser):
if self.link != LinkTypes.NOLINK and \ if self.link != LinkTypes.NOLINK and \
( (
(trans_r and len(trans_r) > 0) or (trans_install and len(trans_install) > 0) or
(trans_w and len(trans_w) > 0) (trans_update and len(trans_update) > 0)
): ):
msg = f'[{key}] transformations disabled' msg = f'[{key}] transformations disabled'
msg += ' because dotfile is linked' msg += ' because dotfile is linked'
self.log.warn(msg) self.log.warn(msg)
self.trans_r = [] self.trans_install = []
self.trans_w = [] self.trans_update = []
def get_dotfile_variables(self): def get_dotfile_variables(self):
"""return this dotfile specific variables""" """return this dotfile specific variables"""
@@ -83,25 +83,21 @@ class Dotfile(DictParser):
"""return all 'post' actions""" """return all 'post' actions"""
return [a for a in self.actions if a.kind == Action.post] return [a for a in self.actions if a.kind == Action.post]
def get_trans_r(self): def get_trans_install(self):
"""return trans_r object""" """return trans_install object"""
return self.trans_r return self.trans_install
def get_trans_w(self): def get_trans_update(self):
"""return trans_w object""" """return trans_update object"""
return self.trans_w return self.trans_update
@classmethod @classmethod
def _adjust_yaml_keys(cls, value): def _adjust_yaml_keys(cls, value):
"""patch dict""" """patch dict"""
value['noempty'] = value.get(cls.key_noempty, False) value['noempty'] = value.get(cls.key_noempty, False)
value['trans_r'] = value.get(cls.key_trans_r)
value['trans_w'] = value.get(cls.key_trans_w)
value['template'] = value.get(cls.key_template, True) value['template'] = value.get(cls.key_template, True)
# remove old entries # remove old entries
value.pop(cls.key_noempty, None) value.pop(cls.key_noempty, None)
value.pop(cls.key_trans_r, None)
value.pop(cls.key_trans_w, None)
return value return value
def __eq__(self, other): def __eq__(self, other):
@@ -116,6 +112,10 @@ class Dotfile(DictParser):
msg += f', dst:\"{self.dst}\"' msg += f', dst:\"{self.dst}\"'
msg += f', link:\"{self.link}\"' msg += f', link:\"{self.link}\"'
msg += f', template:{self.template}' msg += f', template:{self.template}'
if self.trans_install:
msg += f', trans_install:{self.trans_install}'
if self.trans_update:
msg += f', trans_update:{self.trans_update}'
if self.chmod: if self.chmod:
if isinstance(self.chmod, int) or len(self.chmod) == 3: if isinstance(self.chmod, int) or len(self.chmod) == 3:
msg += f', chmod:{self.chmod:o}' msg += f', chmod:{self.chmod:o}'
@@ -149,13 +149,13 @@ class Dotfile(DictParser):
for act in some: for act in some:
out += f'\n{2*indent}- {act}' out += f'\n{2*indent}- {act}'
out += f'\n{indent}trans_r:' out += f'\n{indent}trans_install:'
some = self.get_trans_r() some = self.get_trans_install()
if some: if some:
out += f'\n{2*indent}- {some}' out += f'\n{2*indent}- {some}'
out += f'\n{indent}trans_w:' out += f'\n{indent}trans_update:'
some = self.get_trans_w() some = self.get_trans_update()
if some: if some:
out += f'\n{2*indent}- {some}' out += f'\n{2*indent}- {some}'
return out return out


@@ -75,8 +75,8 @@ class Importer:
def import_path(self, path, import_as=None, def import_path(self, path, import_as=None,
import_link=LinkTypes.NOLINK, import_link=LinkTypes.NOLINK,
import_mode=False, import_mode=False,
import_transw="", trans_install="",
import_transr=""): trans_update=""):
""" """
import a dotfile pointed by path import a dotfile pointed by path
returns: returns:
@@ -90,24 +90,25 @@ class Importer:
self.log.err(f'\"{path}\" does not exist, ignored!') self.log.err(f'\"{path}\" does not exist, ignored!')
return -1 return -1
# check transw if any # check trans_update if any
trans_write = None tinstall = None
trans_read = None tupdate = None
if import_transw: if trans_install:
trans_write = self.conf.get_trans_w(import_transw) tinstall = self.conf.get_trans_install(trans_install)
if import_transr: if trans_update:
trans_read = self.conf.get_trans_r(import_transr) tupdate = self.conf.get_trans_update(trans_update)
return self._import(path, import_as=import_as, return self._import(path, import_as=import_as,
import_link=import_link, import_link=import_link,
import_mode=import_mode, import_mode=import_mode,
trans_write=trans_write, trans_update=tupdate,
trans_read=trans_read) trans_install=tinstall)
def _import(self, path, import_as=None, def _import(self, path, import_as=None,
import_link=LinkTypes.NOLINK, import_link=LinkTypes.NOLINK,
import_mode=False, import_mode=False,
trans_write=None, trans_read=None): trans_install=None,
trans_update=None):
""" """
import path import path
returns: returns:
@@ -162,17 +163,18 @@ class Importer:
self.log.dbg(f'import dotfile: src:{src} dst:{dst}') self.log.dbg(f'import dotfile: src:{src} dst:{dst}')
if not self._import_to_dotpath(src, dst, trans_write=trans_write): if not self._import_to_dotpath(src, dst, trans_update=trans_update):
return -1 return -1
return self._import_in_config(path, src, dst, perm, linktype, return self._import_in_config(path, src, dst, perm, linktype,
import_mode, import_mode,
trans_w=trans_write, trans_update=trans_update,
trans_r=trans_read) trans_install=trans_install)
def _import_in_config(self, path, src, dst, perm, def _import_in_config(self, path, src, dst, perm,
linktype, import_mode, linktype, import_mode,
trans_r=None, trans_w=None): trans_install=None,
trans_update=None):
""" """
import path import path
returns: returns:
@@ -190,8 +192,8 @@ class Importer:
# add file to config file # add file to config file
retconf = self.conf.new_dotfile(src, dst, linktype, chmod=chmod, retconf = self.conf.new_dotfile(src, dst, linktype, chmod=chmod,
trans_read=trans_r, trans_install=trans_install,
trans_write=trans_w) trans_update=trans_update)
if not retconf: if not retconf:
self.log.warn(f'\"{path}\" ignored during import') self.log.warn(f'\"{path}\" ignored during import')
return 0 return 0
@@ -222,7 +224,7 @@ class Importer:
self.log.dbg('will overwrite existing file') self.log.dbg('will overwrite existing file')
return True return True
def _import_to_dotpath(self, in_dotpath, in_fs, trans_write=None): def _import_to_dotpath(self, in_dotpath, in_fs, trans_update=None):
""" """
prepare hierarchy for dotfile in dotpath and copy file prepare hierarchy for dotfile in dotpath and copy file
""" """
@@ -237,8 +239,8 @@ class Importer:
self.log.dry(f'would copy {in_fs} to {srcf}') self.log.dry(f'would copy {in_fs} to {srcf}')
return True return True
# apply trans_w # apply trans_update
in_fs = self._apply_trans_w(in_fs, trans_write) in_fs = self._apply_trans_update(in_fs, trans_update)
if not in_fs: if not in_fs:
# transformation failed # transformation failed
return False return False
@@ -290,7 +292,7 @@ class Importer:
return True return True
return False return False
def _apply_trans_w(self, path, trans): def _apply_trans_update(self, path, trans):
""" """
apply transformation to path on filesystem) apply transformation to path on filesystem)
returns returns


@@ -138,6 +138,12 @@ class Installer:
actionexec=actionexec, actionexec=actionexec,
noempty=noempty, ignore=ignore, noempty=noempty, ignore=ignore,
is_template=is_template) is_template=is_template)
ret, err = self._copy_dir(templater, src, dst,
actionexec=actionexec,
noempty=noempty, ignore=ignore,
is_template=is_template,
chmod=chmod)
if self.remove_existing_in_dir and ins: if self.remove_existing_in_dir and ins:
self._remove_existing_in_dir(dst, ins) self._remove_existing_in_dir(dst, ins)
else: else:
@@ -186,40 +192,57 @@ class Installer:
if self.dry: if self.dry:
return self._log_install(ret, err) return self._log_install(ret, err)
# handle chmod self._apply_chmod_after_install(src, dst, ret, err,
# - on success (r, not err) chmod=chmod,
# - no change (not r, not err) force_chmod=force_chmod,
# but not when linktype=linktype)
# - error (not r, err)
# - aborted (not r, err) return self._log_install(ret, err)
# - special keyword "preserve"
def _apply_chmod_after_install(self, src, dst, ret, err,
chmod=None,
is_sub=False,
force_chmod=False,
linktype=LinkTypes.NOLINK):
"""
handle chmod after install
- on success (r, not err)
- no change (not r, not err)
but not when
- error (not r, err)
- aborted (not r, err)
- special keyword "preserve"
is_sub is used to specify if the file/dir is
part of a dotfile directory
"""
apply_chmod = linktype in [LinkTypes.NOLINK, LinkTypes.LINK_CHILDREN] apply_chmod = linktype in [LinkTypes.NOLINK, LinkTypes.LINK_CHILDREN]
apply_chmod = apply_chmod and os.path.exists(dst) apply_chmod = apply_chmod and os.path.exists(dst)
apply_chmod = apply_chmod and (ret or (not ret and not err)) apply_chmod = apply_chmod and (ret or (not ret and not err))
apply_chmod = apply_chmod and chmod != CfgYaml.chmod_ignore apply_chmod = apply_chmod and chmod != CfgYaml.chmod_ignore
if apply_chmod: if is_sub:
if not chmod: chmod = None
chmod = get_file_perm(src) if not apply_chmod:
self.log.dbg(f'applying chmod {chmod:o} to {dst}')
dstperms = get_file_perm(dst)
if dstperms != chmod:
# apply mode
msg = f'chmod {dst} to {chmod:o}'
if not force_chmod and self.safe and not self.log.ask(msg):
ret = False
err = 'aborted'
else:
if not self.comparing:
self.log.sub(f'chmod {dst} to {chmod:o}')
if chmodit(dst, chmod, debug=self.debug):
ret = True
else:
ret = False
err = 'chmod failed'
else:
self.log.dbg('no chmod applied') self.log.dbg('no chmod applied')
return
return self._log_install(ret, err) if not chmod:
chmod = get_file_perm(src)
self.log.dbg(f'dotfile in dotpath perm: {chmod:o}')
self.log.dbg(f'applying chmod {chmod:o} to {dst}')
dstperms = get_file_perm(dst)
if dstperms != chmod:
# apply mode
msg = f'chmod {dst} to {chmod:o}'
if not force_chmod and self.safe and not self.log.ask(msg):
ret = False
err = 'aborted'
else:
if not self.comparing:
self.log.sub(f'chmod {dst} to {chmod:o}')
if chmodit(dst, chmod, debug=self.debug):
ret = True
else:
ret = False
err = 'chmod failed'
def install_to_temp(self, templater, tmpdir, src, dst, def install_to_temp(self, templater, tmpdir, src, dst,
is_template=True, chmod=None, ignore=None, is_template=True, chmod=None, ignore=None,
@@ -465,6 +488,8 @@ class Installer:
return False, 'aborted' return False, 'aborted'
# remove symlink # remove symlink
if self.backup and not os.path.isdir(dst):
self._backup(dst)
overwrite = True overwrite = True
try: try:
removepath(dst) removepath(dst)
@@ -551,6 +576,7 @@ class Installer:
content = None content = None
if is_template: if is_template:
# template the file # template the file
self.log.dbg(f'it is a template: {src}')
saved = templater.add_tmp_vars(self._get_tmp_file_vars(src, dst)) saved = templater.add_tmp_vars(self._get_tmp_file_vars(src, dst))
try: try:
content = templater.generate(src) content = templater.generate(src)
@@ -580,7 +606,8 @@ class Installer:
def _copy_dir(self, templater, src, dst, def _copy_dir(self, templater, src, dst,
actionexec=None, noempty=False, actionexec=None, noempty=False,
ignore=None, is_template=True): ignore=None, is_template=True,
chmod=None):
""" """
install src to dst when is a directory install src to dst when is a directory
@@ -617,6 +644,9 @@ class Installer:
# error occured # error occured
return res, err, [] return res, err, []
self._apply_chmod_after_install(fpath, fdst, ret, err,
chmod=chmod, is_sub=True)
if res: if res:
# something got installed # something got installed
@@ -720,6 +750,7 @@ class Installer:
if os.path.lexists(dst): if os.path.lexists(dst):
# file/symlink exists # file/symlink exists
self.log.dbg(f'file already exists on filesystem: {dst}')
try: try:
os.stat(dst) os.stat(dst)
except OSError as exc: except OSError as exc:
@@ -745,6 +776,8 @@ class Installer:
if self.backup: if self.backup:
self._backup(dst) self._backup(dst)
else:
self.log.dbg(f'file does not exist on filesystem: {dst}')
# create hierarchy # create hierarchy
base = os.path.dirname(dst) base = os.path.dirname(dst)


@@ -68,6 +68,7 @@ Usage:
dotdrop update [-VbfdkPz] [-c <path>] [-p <profile>] dotdrop update [-VbfdkPz] [-c <path>] [-p <profile>]
[-w <nb>] [-i <pattern>...] [<path>...] [-w <nb>] [-i <pattern>...] [<path>...]
dotdrop remove [-Vbfdk] [-c <path>] [-p <profile>] [<path>...] dotdrop remove [-Vbfdk] [-c <path>] [-p <profile>] [<path>...]
dotdrop uninstall [-Vbfd] [-c <path>] [-p <profile>] [<key>...]
dotdrop files [-VbTG] [-c <path>] [-p <profile>] dotdrop files [-VbTG] [-c <path>] [-p <profile>]
dotdrop detail [-Vb] [-c <path>] [-p <profile>] [<key>...] dotdrop detail [-Vb] [-c <path>] [-p <profile>] [<key>...]
dotdrop profiles [-VbG] [-c <path>] dotdrop profiles [-VbG] [-c <path>]
@@ -93,8 +94,8 @@ Options:
-P --show-patch Provide a one-liner to manually patch template. -P --show-patch Provide a one-liner to manually patch template.
-R --remove-existing Remove existing file on install directory. -R --remove-existing Remove existing file on install directory.
-s --as=<path> Import as a different path from actual path. -s --as=<path> Import as a different path from actual path.
--transr=<key> Associate trans_read key on import. --transr=<key> Associate trans_install key on import.
--transw=<key> Apply trans_write key on import. --transw=<key> Apply trans_update key on import.
-t --temp Install to a temporary directory for review. -t --temp Install to a temporary directory for review.
-T --template Only template dotfiles. -T --template Only template dotfiles.
-V --verbose Be verbose. -V --verbose Be verbose.
@@ -320,8 +321,8 @@ class Options(AttrMonitor):
self.import_ignore.extend(self.impignore) self.import_ignore.extend(self.impignore)
self.import_ignore.append(f'*{self.install_backup_suffix}') self.import_ignore.append(f'*{self.install_backup_suffix}')
self.import_ignore = uniq_list(self.import_ignore) self.import_ignore = uniq_list(self.import_ignore)
self.import_transw = self.args['--transw'] self.import_trans_install = self.args['--transr']
self.import_transr = self.args['--transr'] self.import_trans_update = self.args['--transw']
def _apply_args_update(self): def _apply_args_update(self):
"""update specifics""" """update specifics"""
@@ -342,6 +343,10 @@ class Options(AttrMonitor):
self.remove_path = self.args['<path>'] self.remove_path = self.args['<path>']
self.remove_iskey = self.args['--key'] self.remove_iskey = self.args['--key']
def _apply_args_uninstall(self):
"""uninstall specifics"""
self.uninstall_key = self.args['<key>']
def _apply_args_detail(self): def _apply_args_detail(self):
"""detail specifics""" """detail specifics"""
self.detail_keys = self.args['<key>'] self.detail_keys = self.args['<key>']
@@ -357,6 +362,7 @@ class Options(AttrMonitor):
self.cmd_update = self.args['update'] self.cmd_update = self.args['update']
self.cmd_detail = self.args['detail'] self.cmd_detail = self.args['detail']
self.cmd_remove = self.args['remove'] self.cmd_remove = self.args['remove']
self.cmd_uninstall = self.args['uninstall']
# adapt attributes based on arguments # adapt attributes based on arguments
self.safe = not self.args['--force'] self.safe = not self.args['--force']
@@ -405,6 +411,9 @@ class Options(AttrMonitor):
# "remove" specifics # "remove" specifics
self._apply_args_remove() self._apply_args_remove()
# "uninstall" specifics
self._apply_args_uninstall()
def _fill_attr(self): def _fill_attr(self):
"""create attributes from conf""" """create attributes from conf"""
# defined variables # defined variables

dotdrop/uninstaller.py (new file)

@@ -0,0 +1,150 @@
"""
author: deadc0de6 (https://github.com/deadc0de6)
Copyright (c) 2023, deadc0de6
handle the un-installation of dotfiles
"""
import os
from dotdrop.logger import Logger
from dotdrop.utils import removepath
class Uninstaller:
"""dotfile uninstaller"""
def __init__(self, base='.', workdir='~/.config/dotdrop',
dry=False, safe=True, debug=False,
backup_suffix='.dotdropbak'):
"""
@base: directory path where to search for templates
@workdir: where to install template before symlinking
@dry: just simulate
@debug: enable debug
@backup_suffix: suffix for dotfile backup file
@safe: ask for any action
"""
base = os.path.expanduser(base)
base = os.path.normpath(base)
self.base = base
workdir = os.path.expanduser(workdir)
workdir = os.path.normpath(workdir)
self.workdir = workdir
self.dry = dry
self.safe = safe
self.debug = debug
self.backup_suffix = backup_suffix
self.log = Logger(debug=self.debug)
def uninstall(self, src, dst, linktype):
"""
uninstall dst
@src: dotfile source path in dotpath
@dst: dotfile destination path in the FS
@linktype: linktypes.LinkTypes
return
- True, None : success
- False, error_msg : error
"""
if not src or not dst:
self.log.dbg(f'cannot uninstall empty {src} or {dst}')
return True, None
# ensure exists
path = os.path.expanduser(dst)
path = os.path.normpath(path)
path = path.rstrip(os.sep)
if not os.path.isfile(path) and not os.path.isdir(path):
msg = f'cannot uninstall special file {path}'
return False, msg
if not os.path.exists(path):
self.log.dbg(f'cannot uninstall non existing {path}')
return True, None
msg = f'uninstalling \"{path}\" (link: {linktype})'
self.log.dbg(msg)
ret, msg = self._remove(path)
if ret:
if not self.dry:
self.log.sub(f'uninstall {dst}')
return ret, msg
def _descend(self, dirpath):
ret = True
self.log.dbg(f'recursively uninstall {dirpath}')
for sub in os.listdir(dirpath):
subpath = os.path.join(dirpath, sub)
if os.path.isdir(subpath):
self.log.dbg(f'under {dirpath} uninstall dir {subpath}')
self._descend(subpath)
else:
self.log.dbg(f'under {dirpath} uninstall file {subpath}')
subret, _ = self._remove(subpath)
if not subret:
ret = False
if not os.listdir(dirpath):
# empty
self.log.dbg(f'remove empty dir {dirpath}')
if self.dry:
self.log.dry(f'would \"rm -r {dirpath}\"')
return True, ''
return self._remove_path(dirpath)
self.log.dbg(f'not removing non-empty dir {dirpath}')
return ret, ''
def _remove_path(self, path):
"""remove a file"""
try:
removepath(path, self.log)
except OSError as exc:
err = f'removing \"{path}\" failed: {exc}'
return False, err
return True, ''
def _remove(self, path):
"""remove path"""
self.log.dbg(f'handling uninstall of {path}')
if path.endswith(self.backup_suffix):
self.log.dbg(f'skip {path} ignored')
return True, ''
backup = f'{path}{self.backup_suffix}'
if os.path.exists(backup):
self.log.dbg(f'backup exists for {path}: {backup}')
return self._replace(path, backup)
self.log.dbg(f'no backup file for {path}')
if os.path.isdir(path):
self.log.dbg(f'{path} is a directory')
return self._descend(path)
if self.dry:
self.log.dry(f'would \"rm {path}\"')
return True, ''
msg = f'Remove {path}?'
if self.safe and not self.log.ask(msg):
return False, 'user refused'
self.log.dbg(f'removing {path}')
return self._remove_path(path)
def _replace(self, path, backup):
"""replace path by backup"""
if self.dry:
self.log.dry(f'would \"mv {backup} {path}\"')
return True, ''
msg = f'Restore {path} from {backup}?'
if self.safe and not self.log.ask(msg):
return False, 'user refused'
try:
self.log.dbg(f'mv {backup} {path}')
os.replace(backup, path)
except OSError as exc:
err = f'replacing \"{path}\" by \"{backup}\" failed: {exc}'
return False, err
return True, ''
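
To show how this class is driven (mirroring `cmd_uninstall` earlier in this commit), here is a minimal usage sketch; the paths and the plain-string link value are illustrative and not part of the commit (the real code passes a `LinkTypes` value, which `uninstall()` only logs):

```python
# illustrative only: exercise the Uninstaller in dry-run mode
from dotdrop.uninstaller import Uninstaller

uninst = Uninstaller(base='dotfiles',           # the dotpath
                     workdir='~/.config/dotdrop',
                     dry=True,                  # only log what would happen
                     safe=True)
# src is the path inside the dotpath, dst the deployed path on the filesystem
ok, err = uninst.uninstall('vimrc', '~/.vimrc', 'nolink')
if not ok:
    print(err)
```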


@@ -122,7 +122,7 @@ class Updater:
return True return True
# apply write transformation if any # apply write transformation if any
new_path = self._apply_trans_w(deployed_path, dotfile) new_path = self._apply_trans_update(deployed_path, dotfile)
if not new_path: if not new_path:
return False return False
@@ -150,9 +150,9 @@ class Updater:
removepath(new_path, logger=self.log) removepath(new_path, logger=self.log)
return ret return ret
def _apply_trans_w(self, path, dotfile): def _apply_trans_update(self, path, dotfile):
"""apply write transformation to dotfile""" """apply write transformation to dotfile"""
trans = dotfile.get_trans_w() trans = dotfile.get_trans_update()
if not trans: if not trans:
return path return path
self.log.dbg(f'executing write transformation {trans}') self.log.dbg(f'executing write transformation {trans}')


@@ -175,6 +175,8 @@ def removepath(path, logger=None):
             return
         LOG.err(err)
         raise OSError(err)
+    if logger:
+        logger.dbg(f'removing {path}')
     try:
         if os.path.islink(path) or os.path.isfile(path):
             os.unlink(path)
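The hunk above only adds a debug log to `removepath`; for completeness, a rough sketch of what such a helper does overall, pieced together from the visible lines (the directory branch is an assumption, not taken from the patch):

```python
import os
import shutil


def removepath(path, logger=None):
    """remove a file, symlink or directory tree (simplified sketch)"""
    if logger:
        logger(f'removing {path}')
    if os.path.islink(path) or os.path.isfile(path):
        os.unlink(path)
    elif os.path.isdir(path):
        # assumption: directories are removed recursively
        shutil.rmtree(path)
```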

manpage/dotdrop.1

@@ -105,11 +105,11 @@ Import as a different path from actual path.
 .TP
 .B
 \fB--transr\fP=<key>
-Associate trans_read key on import.
+Associate trans_install key on import.
 .TP
 .B
 \fB--transw\fP=<key>
-Apply trans_write key on import.
+Apply trans_update key on import.
 .RE
 .TP
 .B


@@ -39,8 +39,8 @@ COMMANDS
 -m --preserve-mode Insert a chmod entry in the dotfile with its mode.
 -p --profile=<profile> Specify the profile to use.
 -s --as=<path> Import as a different path from actual path.
---transr=<key> Associate trans_read key on import.
---transw=<key> Apply trans_write key on import.
+--transr=<key> Associate trans_install key on import.
+--transw=<key> Apply trans_update key on import.
 compare Compare dotfiles
 -C --file=<path> Path of dotfile to compare.


@@ -38,10 +38,12 @@ pyflakes --version
 # checking for TODO/FIXME
 echo "--------------------------------------"
 echo "checking for TODO/FIXME"
-grep -rv 'TODO\|FIXME' dotdrop/ >/dev/null 2>&1
-grep -rv 'TODO\|FIXME' tests/ >/dev/null 2>&1
-grep -rv 'TODO\|FIXME' tests-ng/ >/dev/null 2>&1
-grep -rv 'TODO\|FIXME' scripts/ >/dev/null 2>&1
+set +e
+grep -r 'TODO\|FIXME' dotdrop/ && exit 1
+grep -r 'TODO\|FIXME' tests/ && exit 1
+grep -r 'TODO\|FIXME' tests-ng/ && exit 1
+#grep -r 'TODO\|FIXME' scripts/ && exit 1
+set -e

 # checking for tests options
 echo "---------------------------------"
@@ -111,7 +113,7 @@ done
 # check other python scripts
 echo "-----------------------------------------"
 echo "checking other python scripts with pylint"
-find . -name "*.py" -not -path "./dotdrop/*" | while read -r script; do
+find . -name "*.py" -not -path "./dotdrop/*" -not -regex "\./\.?v?env/.*" | while read -r script; do
   echo "checking ${script}"
   pylint -sn \
     --disable=R0914 \


@@ -12,4 +12,4 @@ if [ -n "${WORKERS}" ]; then
 fi

 mkdir -p coverages/
-coverage run -p --data-file coverages/coverage -m pytest tests
+coverage run -p --data-file coverages/coverage -m pytest tests -x


@@ -31,7 +31,10 @@ echo -e "$(tput setaf 6)==> RUNNING $(basename "${BASH_SOURCE[0]}") <==$(tput sg
 # $2 path
 grep_or_fail()
 {
-  grep "${1}" "${2}" >/dev/null 2>&1 || (echo "pattern not found in ${2}" && exit 1)
+  if ! grep "${1}" "${2}" >/dev/null 2>&1; then
+    echo "pattern not found in ${2}"
+    exit 1
+  fi
 }

 # the action temp

tests-ng/backup.sh (new executable file)

@@ -0,0 +1,248 @@
#!/usr/bin/env bash
# author: deadc0de6 (https://github.com/deadc0de6)
# Copyright (c) 2023, deadc0de6
#
# test for backups
# returns 1 in case of error
#
## start-cookie
set -eu -o errtrace -o pipefail
cur=$(cd "$(dirname "${0}")" && pwd)
ddpath="${cur}/../"
PPATH="${PYTHONPATH:-}"
export PYTHONPATH="${ddpath}:${PPATH}"
altbin="python3 -m dotdrop.dotdrop"
if hash coverage 2>/dev/null; then
mkdir -p coverages/
altbin="coverage run -p --data-file coverages/coverage --source=dotdrop -m dotdrop.dotdrop"
fi
bin="${DT_BIN:-${altbin}}"
# shellcheck source=tests-ng/helpers
source "${cur}"/helpers
echo -e "$(tput setaf 6)==> RUNNING $(basename "${BASH_SOURCE[0]}") <==$(tput sgr0)"
## end-cookie
################################################################
# this is the test
################################################################
# $1 pattern
# $2 path
grep_or_fail()
{
if ! grep "${1}" "${2}" >/dev/null 2>&1; then
echo "pattern \"${1}\" not found in ${2}"
exit 1
fi
}
# the dotfile source
tmps=$(mktemp -d --suffix='-dotdrop-tests-dotpath' || mktemp -d)
mkdir -p "${tmps}"/dotfiles
# the dotfile destination
tmpd=$(mktemp -d --suffix='-dotdrop-tests-dst' || mktemp -d)
tmpw=$(mktemp -d --suffix='-dotdrop-workdir' || mktemp -d)
clear_on_exit "${tmps}"
clear_on_exit "${tmpd}"
clear_on_exit "${tmpw}"
clear_dotpath()
{
rm -rf "${tmps:?}"/dotfiles/*
}
create_dotpath()
{
# create the dotfiles in dotpath
echo "modified" > "${tmps}"/dotfiles/file
echo "{{@@ profile @@}}" > "${tmps}"/dotfiles/template
mkdir -p "${tmps}"/dotfiles/dir
echo "modified" > "${tmps}"/dotfiles/dir/sub
echo "{{@@ profile @@}}" > "${tmps}"/dotfiles/dir/template
mkdir -p "${tmps}"/dotfiles/tree
echo "modified" > "${tmps}"/dotfiles/tree/file
echo "{{@@ profile @@}}" > "${tmps}"/dotfiles/tree/template
mkdir -p "${tmps}"/dotfiles/tree/sub
echo "modified" > "${tmps}"/dotfiles/tree/sub/file
echo "{{@@ profile @@}}" > "${tmps}"/dotfiles/tree/sub/template
}
clear_fs()
{
rm -rf "${tmpd:?}"/*
}
create_fs()
{
# create the existing dotfiles in filesystem
echo "original" > "${tmpd}"/file
echo "original" > "${tmpd}"/template
mkdir -p "${tmpd}"/dir
echo "original" > "${tmpd}"/dir/sub
echo "original" > "${tmpd}"/dir/template
mkdir -p "${tmpd}"/tree
echo "original" > "${tmpd}"/tree/file
echo "original" > "${tmpd}"/tree/template
mkdir -p "${tmpd}"/tree/sub
echo "original" > "${tmpd}"/tree/sub/file
echo "original" > "${tmpd}"/tree/sub/template
}
# create the config file
cfg="${tmps}/config.yaml"
# $1: linktype
create_config()
{
link_default="${1}"
link_file="${1}"
link_dir="${1}"
if [ "${link_default}" = "link_children" ]; then
link_file="nolink"
fi
cat > "${cfg}" << _EOF
config:
  backup: true
  create: true
  dotpath: dotfiles
  link_dotfile_default: ${link_default}
  workdir: ${tmpw}
dotfiles:
  f_file:
    dst: ${tmpd}/file
    src: file
    link: ${link_file}
  f_template:
    dst: ${tmpd}/template
    src: template
    link: ${link_file}
  d_dir:
    dst: ${tmpd}/dir
    src: dir
    link: ${link_dir}
  d_tree:
    dst: ${tmpd}/tree
    src: tree
    link: ${link_dir}
profiles:
  p1:
    dotfiles:
    - f_file
    - f_template
    - d_dir
    - d_tree
_EOF
#cat ${cfg}
}
# install nolink
pre="link:nolink"
create_config "nolink"
clear_dotpath
clear_fs
create_dotpath
create_fs
cd "${ddpath}" | ${bin} install -f -c "${cfg}" -p p1 --verbose
# checks
[ ! -e "${tmpd}"/file.dotdropbak ] && echo "${pre} file backup not found" && exit 1
[ ! -e "${tmpd}"/template.dotdropbak ] && echo "${pre} template backup not found" && exit 1
[ ! -e "${tmpd}"/dir/sub.dotdropbak ] && echo "${pre} dir sub backup not found" && exit 1
[ ! -e "${tmpd}"/dir/template.dotdropbak ] && echo "${pre} dir template backup not found" && exit 1
[ ! -e "${tmpd}"/tree/file.dotdropbak ] && echo "${pre} tree file backup not found" && exit 1
[ ! -e "${tmpd}"/tree/template.dotdropbak ] && echo "${pre} tree template backup not found" && exit 1
[ ! -e "${tmpd}"/tree/sub/file.dotdropbak ] && echo "${pre} tree sub file backup not found" && exit 1
[ ! -e "${tmpd}"/tree/sub/template.dotdropbak ] && echo "${pre} tree sub template backup not found" && exit 1
grep_or_fail original "${tmpd}"/file.dotdropbak
grep_or_fail original "${tmpd}"/template.dotdropbak
grep_or_fail original "${tmpd}"/dir/sub.dotdropbak
grep_or_fail original "${tmpd}"/dir/template.dotdropbak
grep_or_fail original "${tmpd}"/tree/file.dotdropbak
grep_or_fail original "${tmpd}"/tree/template.dotdropbak
grep_or_fail original "${tmpd}"/tree/sub/file.dotdropbak
grep_or_fail original "${tmpd}"/tree/sub/template.dotdropbak
grep_or_fail p1 "${tmpd}"/template
grep_or_fail modified "${tmpd}"/dir/sub
grep_or_fail p1 "${tmpd}"/dir/template
grep_or_fail modified "${tmpd}"/tree/file
grep_or_fail p1 "${tmpd}"/tree/template
grep_or_fail modified "${tmpd}"/tree/sub/file
grep_or_fail p1 "${tmpd}"/tree/sub/template
# install relative
pre="link:relative"
create_config "relative"
clear_dotpath
clear_fs
create_dotpath
create_fs
cd "${ddpath}" | ${bin} install -f -c "${cfg}" -p p1 --verbose
# checks
[ ! -e "${tmpd}"/file.dotdropbak ] && echo "${pre} file backup not found" && exit 1
[ ! -e "${tmpd}"/template.dotdropbak ] && echo "${pre} template backup not found" && exit 1
grep_or_fail original "${tmpd}"/file.dotdropbak
grep_or_fail original "${tmpd}"/template.dotdropbak
grep_or_fail p1 "${tmpd}"/template
grep_or_fail modified "${tmpd}"/dir/sub
grep_or_fail p1 "${tmpd}"/dir/template
grep_or_fail modified "${tmpd}"/tree/file
grep_or_fail p1 "${tmpd}"/tree/template
grep_or_fail modified "${tmpd}"/tree/sub/file
grep_or_fail p1 "${tmpd}"/tree/sub/template
# install absolute
pre="link:absolute"
create_config "absolute"
clear_dotpath
clear_fs
create_dotpath
create_fs
cd "${ddpath}" | ${bin} install -f -c "${cfg}" -p p1 --verbose
# checks
[ ! -e "${tmpd}"/file.dotdropbak ] && echo "${pre} file backup not found" && exit 1
[ ! -e "${tmpd}"/template.dotdropbak ] && echo "${pre} template backup not found" && exit 1
grep_or_fail original "${tmpd}"/file.dotdropbak
grep_or_fail original "${tmpd}"/template.dotdropbak
grep_or_fail p1 "${tmpd}"/template
grep_or_fail modified "${tmpd}"/dir/sub
grep_or_fail p1 "${tmpd}"/dir/template
grep_or_fail modified "${tmpd}"/tree/file
grep_or_fail p1 "${tmpd}"/tree/template
grep_or_fail modified "${tmpd}"/tree/sub/file
grep_or_fail p1 "${tmpd}"/tree/sub/template
# install link_children
pre="link:link_children"
create_config "link_children"
clear_dotpath
clear_fs
create_dotpath
create_fs
cd "${ddpath}" | ${bin} install -f -c "${cfg}" -p p1 --verbose
# checks
[ ! -e "${tmpd}"/file.dotdropbak ] && echo "${pre} file backup not found" && exit 1
[ ! -e "${tmpd}"/template.dotdropbak ] && echo "${pre} template backup not found" && exit 1
[ ! -e "${tmpd}"/dir/sub.dotdropbak ] && echo "${pre} dir sub backup not found" && exit 1
[ ! -e "${tmpd}"/dir/template.dotdropbak ] && echo "${pre} dir template backup not found" && exit 1
[ ! -e "${tmpd}"/tree/file.dotdropbak ] && echo "${pre} tree file backup not found" && exit 1
[ ! -e "${tmpd}"/tree/template.dotdropbak ] && echo "${pre} tree template backup not found" && exit 1
grep_or_fail original "${tmpd}"/file.dotdropbak
grep_or_fail original "${tmpd}"/template.dotdropbak
grep_or_fail original "${tmpd}"/dir/sub.dotdropbak
grep_or_fail original "${tmpd}"/dir/template.dotdropbak
grep_or_fail original "${tmpd}"/tree/file.dotdropbak
grep_or_fail original "${tmpd}"/tree/template.dotdropbak
grep_or_fail p1 "${tmpd}"/template
grep_or_fail modified "${tmpd}"/dir/sub
grep_or_fail p1 "${tmpd}"/dir/template
grep_or_fail modified "${tmpd}"/tree/file
grep_or_fail p1 "${tmpd}"/tree/template
grep_or_fail modified "${tmpd}"/tree/sub/file
grep_or_fail p1 "${tmpd}"/tree/sub/template
echo "OK"
exit 0

tests-ng/chmod-install-dir.sh (new executable file)

@@ -0,0 +1,103 @@
#!/usr/bin/env bash
# author: deadc0de6 (https://github.com/deadc0de6)
# Copyright (c) 2023, deadc0de6
#
# test chmod dir sub file on install
#
## start-cookie
set -eu -o errtrace -o pipefail
cur=$(cd "$(dirname "${0}")" && pwd)
ddpath="${cur}/../"
PPATH="${PYTHONPATH:-}"
export PYTHONPATH="${ddpath}:${PPATH}"
altbin="python3 -m dotdrop.dotdrop"
if hash coverage 2>/dev/null; then
mkdir -p coverages/
altbin="coverage run -p --data-file coverages/coverage --source=dotdrop -m dotdrop.dotdrop"
fi
bin="${DT_BIN:-${altbin}}"
# shellcheck source=tests-ng/helpers
source "${cur}"/helpers
echo -e "$(tput setaf 6)==> RUNNING $(basename "${BASH_SOURCE[0]}") <==$(tput sgr0)"
## end-cookie
################################################################
# this is the test
################################################################
# $1 path
# $2 rights
has_rights()
{
echo "testing ${1} is ${2}"
[ ! -e "$1" ] && echo "$(basename "$1") does not exist" && exit 1
local mode
mode=$(stat -L -c '%a' "$1")
[ "${mode}" != "$2" ] && echo "bad mode for $(basename "$1") (${mode} VS expected ${2})" && exit 1
true
}
# the dotfile source
tmps=$(mktemp -d --suffix='-dotdrop-tests' || mktemp -d)
mkdir -p "${tmps}"/dotfiles
# the dotfile destination
tmpd=$(mktemp -d --suffix='-dotdrop-tests' || mktemp -d)
#echo "dotfile destination: ${tmpd}"
clear_on_exit "${tmps}"
clear_on_exit "${tmpd}"
# create the config file
cfg="${tmps}/config.yaml"
cat > "${cfg}" << _EOF
config:
  backup: true
  create: true
  dotpath: dotfiles
  force_chmod: true
dotfiles:
  d_dir:
    src: dir
    dst: ${tmpd}/dir
profiles:
  p1:
    dotfiles:
    - d_dir
_EOF
#cat ${cfg}
mkdir -p "${tmps}"/dotfiles/dir
echo 'file1' > "${tmps}"/dotfiles/dir/file1
chmod 700 "${tmps}"/dotfiles/dir/file1
echo 'file2' > "${tmps}"/dotfiles/dir/file2
chmod 777 "${tmps}"/dotfiles/dir/file2
echo 'file3' > "${tmps}"/dotfiles/dir/file3
chmod 644 "${tmps}"/dotfiles/dir/file3
ls -l "${tmps}"/dotfiles/dir/
# install
echo "install (1)"
cd "${ddpath}" | ${bin} install -c "${cfg}" -f -p p1 -V
has_rights "${tmpd}/dir/file1" "700"
has_rights "${tmpd}/dir/file2" "777"
has_rights "${tmpd}/dir/file3" "644"
# modify
chmod 666 "${tmpd}/dir/file1"
chmod 666 "${tmpd}/dir/file2"
chmod 666 "${tmpd}/dir/file3"
# install
echo "install (2)"
cd "${ddpath}" | ${bin} install -c "${cfg}" -f -p p1 -V
has_rights "${tmpd}/dir/file1" "700"
has_rights "${tmpd}/dir/file2" "777"
has_rights "${tmpd}/dir/file3" "644"
echo "OK"
exit 0


@@ -61,42 +61,6 @@ clear_on_exit "${tmpd}"
 # create the config file
 cfg="${tmps}/config.yaml"

-echo 'f777' > "${tmps}"/dotfiles/f777
-chmod 700 "${tmps}"/dotfiles/f777
-echo 'link' > "${tmps}"/dotfiles/link
-chmod 777 "${tmps}"/dotfiles/link
-mkdir -p "${tmps}"/dotfiles/dir
-echo "f1" > "${tmps}"/dotfiles/dir/f1
-echo "exists" > "${tmps}"/dotfiles/exists
-chmod 644 "${tmps}"/dotfiles/exists
-echo "exists" > "${tmpd}"/exists
-chmod 644 "${tmpd}"/exists
-echo "existslink" > "${tmps}"/dotfiles/existslink
-chmod 777 "${tmps}"/dotfiles/existslink
-chmod 644 "${tmpd}"/exists
-mkdir -p "${tmps}"/dotfiles/direxists
-echo "f1" > "${tmps}"/dotfiles/direxists/f1
-mkdir -p "${tmpd}"/direxists
-echo "f1" > "${tmpd}"/direxists/f1
-chmod 644 "${tmpd}"/direxists/f1
-chmod 744 "${tmpd}"/direxists
-mkdir -p "${tmps}"/dotfiles/linkchildren
-echo "f1" > "${tmps}"/dotfiles/linkchildren/f1
-mkdir -p "${tmps}"/dotfiles/linkchildren/d1
-echo "f2" > "${tmps}"/dotfiles/linkchildren/d1/f2
-echo '{{@@ profile @@}}' > "${tmps}"/dotfiles/symlinktemplate
-mkdir -p "${tmps}"/dotfiles/symlinktemplatedir
-echo "{{@@ profile @@}}" > "${tmps}"/dotfiles/symlinktemplatedir/t
-echo 'nomode' > "${tmps}"/dotfiles/nomode
-
 cat > "${cfg}" << _EOF
 config:
   backup: true
@@ -170,6 +134,42 @@ profiles:
 _EOF
 #cat ${cfg}

+# create the dotfiles
+echo 'f777' > "${tmps}"/dotfiles/f777
+chmod 700 "${tmps}"/dotfiles/f777
+echo 'link' > "${tmps}"/dotfiles/link
+chmod 777 "${tmps}"/dotfiles/link
+mkdir -p "${tmps}"/dotfiles/dir
+echo "f1" > "${tmps}"/dotfiles/dir/f1
+echo "exists" > "${tmps}"/dotfiles/exists
+chmod 644 "${tmps}"/dotfiles/exists
+echo "exists" > "${tmpd}"/exists
+chmod 644 "${tmpd}"/exists
+echo "existslink" > "${tmps}"/dotfiles/existslink
+chmod 777 "${tmps}"/dotfiles/existslink
+chmod 644 "${tmpd}"/exists
+mkdir -p "${tmps}"/dotfiles/direxists
+echo "f1" > "${tmps}"/dotfiles/direxists/f1
+mkdir -p "${tmpd}"/direxists
+echo "f1" > "${tmpd}"/direxists/f1
+chmod 644 "${tmpd}"/direxists/f1
+chmod 744 "${tmpd}"/direxists
+mkdir -p "${tmps}"/dotfiles/linkchildren
+echo "f1" > "${tmps}"/dotfiles/linkchildren/f1
+mkdir -p "${tmps}"/dotfiles/linkchildren/d1
+echo "f2" > "${tmps}"/dotfiles/linkchildren/d1/f2
+echo '{{@@ profile @@}}' > "${tmps}"/dotfiles/symlinktemplate
+mkdir -p "${tmps}"/dotfiles/symlinktemplatedir
+echo "{{@@ profile @@}}" > "${tmps}"/dotfiles/symlinktemplatedir/t
+echo 'nomode' > "${tmps}"/dotfiles/nomode
+
 # install
 echo "first install round"
 cd "${ddpath}" | ${bin} install -c "${cfg}" -f -p p1 -V


@@ -34,19 +34,21 @@ echo "dotfiles source (dotpath): ${tmps}"
 # the dotfile destination
 tmpd=$(mktemp -d --suffix='-dotdrop-tests' || mktemp -d)
 echo "dotfiles destination: ${tmpd}"
+tmptmp=$(mktemp -d --suffix='-dotdrop-tests' || mktemp -d)

 clear_on_exit "${tmps}"
 clear_on_exit "${tmpd}"
+clear_on_exit "${tmptmp}"

 # create the config file
 cfg="${tmps}/config.yaml"
 cat > "${cfg}" << _EOF
-trans_read:
+trans_install:
   base64: "cat {0} | base64 -d > {1}"
   decompress: "mkdir -p {1} && tar -xf {0} -C {1}"
   decrypt: "echo {{@@ profile @@}} | gpg -q --batch --yes --passphrase-fd 0 --no-tty -d {0} > {1}"
-trans_write:
+trans_update:
   base64: "cat {0} | base64 > {1}"
   compress: "tar -cf {1} -C {0} ."
   encrypt: "echo {{@@ profile @@}} | gpg -q --batch --yes --passphrase-fd 0 --no-tty -o {1} -c {0}"
@@ -90,16 +92,16 @@ cd "${ddpath}" | ${bin} import -f -c "${cfg}" -p p1 -b -V --transw=encrypt --tra
 # check content in dotpath
 echo "checking content"
 file "${tmps}"/dotfiles/"${tmpd}"/abc | grep -i 'text'
-cat "${tmpd}"/abc | base64 > "${tmps}"/test-abc
-diff "${tmps}"/dotfiles/"${tmpd}"/abc "${tmps}"/test-abc
+cat "${tmpd}"/abc | base64 > "${tmptmp}"/test-abc
+diff "${tmps}"/dotfiles/"${tmpd}"/abc "${tmptmp}"/test-abc

 file "${tmps}"/dotfiles/"${tmpd}"/def | grep -i 'tar'
-tar -cf "${tmps}"/test-def -C "${tmpd}"/def .
-diff "${tmps}"/dotfiles/"${tmpd}"/def "${tmps}"/test-def
+tar -cf "${tmptmp}"/test-def -C "${tmpd}"/def .
+diff "${tmps}"/dotfiles/"${tmpd}"/def "${tmptmp}"/test-def

 file "${tmps}"/dotfiles/"${tmpd}"/ghi | grep -i 'gpg symmetrically encrypted data\|PGP symmetric key encrypted data'
-echo p1 | gpg -q --batch --yes --passphrase-fd 0 --no-tty -d "${tmps}"/dotfiles/"${tmpd}"/ghi > "${tmps}"/test-ghi
-diff "${tmps}"/test-ghi "${tmpd}"/ghi
+echo p1 | gpg -q --batch --yes --passphrase-fd 0 --no-tty -d "${tmps}"/dotfiles/"${tmpd}"/ghi > "${tmptmp}"/test-ghi
+diff "${tmptmp}"/test-ghi "${tmpd}"/ghi

 # check is imported in config
 echo "checking imported in config"
@@ -108,33 +110,33 @@ cd "${ddpath}" | ${bin} -p p1 -c "${cfg}" files | grep '^f_abc'
 cd "${ddpath}" | ${bin} -p p1 -c "${cfg}" files | grep '^d_def'
 cd "${ddpath}" | ${bin} -p p1 -c "${cfg}" files | grep '^f_ghi'

-# check has trans_write and trans_read in config
-echo "checking trans_write is set in config"
+# check has trans_update and trans_install in config
+echo "checking trans_update is set in config"
 echo "--------------"
 cat "${cfg}"
 echo "--------------"
-cat "${cfg}" | grep -A 4 'f_abc:' | grep 'trans_write: base64'
-cat "${cfg}" | grep -A 4 'd_def:' | grep 'trans_write: compress'
-cat "${cfg}" | grep -A 4 'f_ghi:' | grep 'trans_write: encrypt'
-cat "${cfg}" | grep -A 4 'f_abc:' | grep 'trans_read: base64'
-cat "${cfg}" | grep -A 4 'd_def:' | grep 'trans_read: decompress'
-cat "${cfg}" | grep -A 4 'f_ghi:' | grep 'trans_read: decrypt'
+cat "${cfg}" | grep -A 4 'f_abc:' | grep 'trans_update: base64'
+cat "${cfg}" | grep -A 4 'd_def:' | grep 'trans_update: compress'
+cat "${cfg}" | grep -A 4 'f_ghi:' | grep 'trans_update: encrypt'
+cat "${cfg}" | grep -A 4 'f_abc:' | grep 'trans_install: base64'
+cat "${cfg}" | grep -A 4 'd_def:' | grep 'trans_install: decompress'
+cat "${cfg}" | grep -A 4 'f_ghi:' | grep 'trans_install: decrypt'

 # install these
 echo "install and check"
-rm "${tmpd}"/abc
-rm -r "${tmpd}"/def
-rm "${tmpd}"/ghi
+rm -rf "${tmpd:?}"/*
 cd "${ddpath}" | ${bin} install -f -c "${cfg}" -p p1 -b -V

 # test exist
 echo "check exist"
-[ ! -e "${tmpd}"/abc ] && exit 1
-[ ! -d "${tmpd}"/def/a ] && exit 1
-[ ! -e "${tmpd}"/def/a/file ] && exit 1
-[ ! -e "${tmpd}"/ghi ] && exit 1
+cat "${cfg}"
+tree "${tmpd}"
+[ ! -e "${tmpd}"/abc ] && echo "${tmpd}/abc does not exist" && exit 1
+[ ! -d "${tmpd}"/def/a ] && echo "${tmpd}/def/a does not exist" && exit 1
+[ ! -e "${tmpd}"/def/a/file ] && echo "${tmpd}/def/a/file does not exist" && exit 1
+[ ! -e "${tmpd}"/ghi ] && echo "${tmpd}/ghi does not exist" && exit 1

 # test content
 echo "check content"


@@ -108,8 +108,10 @@ def run_tests(max_jobs=None, stop_on_first_err=True, with_spinner=True):
             failed += 1
             print()
             if stop_on_first_err:
-                print(log_out)
-                print(log_err)
+                if log_out:
+                    print(log_out)
+                if log_err:
+                    print(log_err)
             print(f'test \"{name}\" failed ({ret}): {reason}')
             if stop_on_first_err:
                 ex.shutdown(wait=False)

tests-ng/uninstall.sh (new executable file)

@@ -0,0 +1,303 @@
#!/usr/bin/env bash
# author: deadc0de6 (https://github.com/deadc0de6)
# Copyright (c) 2023, deadc0de6
#
# test uninstall (no symlink)
# returns 1 in case of error
#
## start-cookie
set -eu -o errtrace -o pipefail
cur=$(cd "$(dirname "${0}")" && pwd)
ddpath="${cur}/../"
PPATH="${PYTHONPATH:-}"
export PYTHONPATH="${ddpath}:${PPATH}"
altbin="python3 -m dotdrop.dotdrop"
if hash coverage 2>/dev/null; then
mkdir -p coverages/
altbin="coverage run -p --data-file coverages/coverage --source=dotdrop -m dotdrop.dotdrop"
fi
bin="${DT_BIN:-${altbin}}"
# shellcheck source=tests-ng/helpers
source "${cur}"/helpers
echo -e "$(tput setaf 6)==> RUNNING $(basename "${BASH_SOURCE[0]}") <==$(tput sgr0)"
## end-cookie
################################################################
# this is the test
################################################################
# $1 pattern
# $2 path
grep_or_fail()
{
if ! grep "${1}" "${2}" >/dev/null 2>&1; then
echo "${PRE} pattern \"${1}\" not found in ${2}"
exit 1
fi
}
# $1: basedir
# $2: content
create_hierarchy()
{
echo "${2}" > "${1}"/x
mkdir -p "${1}"/y
echo "${2}" > "${1}"/y/file
mkdir -p "${1}"/y/subdir
echo "${2}" > "${1}"/y/subdir/subfile
echo "profile: ${PRO_TEMPL}" > "${1}"/t
mkdir -p "${1}"/z
echo "profile t1: ${PRO_TEMPL}" > "${1}"/z/t1
echo "profile t2: ${PRO_TEMPL}" > "${1}"/z/t2
echo "${2}" > "${1}"/z/file
echo "trans:${PRO_TEMPL}" > "${1}"/trans
}
# $1: basedir
clean_hierarchy()
{
rm -rf "${1:?}"/*
}
uninstall_with_link()
{
set -e
LINK_TYPE="${DOTDROP_TEST_NG_UNINSTALL_LINK_TYPE:-nolink}"
PRE="[link:${LINK_TYPE}] ERROR"
PRO_TEMPL="{{@@ profile @@}}"
DT_ARG="--verbose"
# dotdrop directory
basedir=$(mktemp -d --suffix='-dotdrop-tests' || mktemp -d)
mkdir -p "${basedir}"/dotfiles
echo "[+] dotdrop dir: ${basedir}"
echo "[+] dotpath dir: ${basedir}/dotfiles"
tmpd=$(mktemp -d --suffix='-dotdrop-tests' || mktemp -d)
tmpw=$(mktemp -d --suffix='-dotdrop-workdir' || mktemp -d)
clear_on_exit "${basedir}/dotfiles"
clear_on_exit "${tmpd}"
clear_on_exit "${tmpw}"
file_link="${LINK_TYPE}"
dir_link="${LINK_TYPE}"
if [ "${LINK_TYPE}" = "link_children" ]; then
file_link="absolute"
fi
# create the config file
cfg="${basedir}/config.yaml"
cat > "${cfg}" << _EOF
config:
  backup: true
  create: true
  dotpath: dotfiles
  link_dotfile_default: ${LINK_TYPE}
  workdir: ${tmpw}
dotfiles:
  f_x:
    src: x
    dst: ${tmpd}/x
    link: ${file_link}
  d_y:
    src: y
    dst: ${tmpd}/y
    link: ${dir_link}
  f_t:
    src: t
    dst: ${tmpd}/t
    link: ${file_link}
  d_z:
    src: z
    dst: ${tmpd}/z
    link: ${dir_link}
  f_trans:
    src: trans
    dst: ${tmpd}/trans
    link: ${file_link}
profiles:
  p1:
    dotfiles:
    - f_x
    - d_y
    - f_t
    - d_z
    - f_trans
_EOF
#########################
## no original
#########################
create_hierarchy "${basedir}/dotfiles" "modified"
# install
echo "[+] install (1)"
( \
cd "${ddpath}" && ${bin} install -c "${cfg}" -f -p p1 | grep '^4 dotfile(s) installed.$' \
)
# tests
[ ! -e "${tmpd}"/x ] && echo "${PRE} f_x not installed" && exit 1
[ ! -e "${tmpd}"/y/file ] && echo "${PRE} d_y not installed" && exit 1
[ ! -e "${tmpd}"/y/subdir/subfile ] && echo "${PRE} d_y not installed" && exit 1
[ ! -e "${tmpd}"/t ] && echo "${PRE} f_t not installed" && exit 1
[ ! -e "${tmpd}"/z/t1 ] && echo "${PRE} d_z t1 not installed" && exit 1
[ ! -e "${tmpd}"/z/t2 ] && echo "${PRE} d_z t2 not installed" && exit 1
[ ! -e "${tmpd}"/z/file ] && echo "${PRE} d_z file not installed" && exit 1
[ ! -e "${tmpd}"/trans ] && echo "${PRE} f_trans file not installed" && exit 1
grep_or_fail 'modified' "${tmpd}"/x
grep_or_fail 'modified' "${tmpd}"/y/file
grep_or_fail 'profile: p1' "${tmpd}"/t
grep_or_fail 'profile t1: p1' "${tmpd}"/z/t1
grep_or_fail 'profile t2: p1' "${tmpd}"/z/t2
grep_or_fail 'modified' "${tmpd}"/z/file
grep_or_fail 'trans:p1' "${tmpd}"/trans
# uninstall
echo "[+] uninstall (1)"
( \
cd "${ddpath}" && ${bin} uninstall -c "${cfg}" -f -p p1 "${DT_ARG}" \
)
[ "$?" != "0" ] && exit 1
# tests
[ ! -d "${basedir}"/dotfiles ] && echo "${PRE} dotpath removed" && exit 1
[ -e "${tmpd}"/x ] && echo "${PRE} f_x not uninstalled" && exit 1
[ -d "${tmpd}"/y ] && echo "${PRE} d_y dir not uninstalled" && exit 1
[ -e "${tmpd}"/y/file ] && echo "${PRE} d_y file not uninstalled" && exit 1
[ -e "${tmpd}"/y/subdir/subfile ] && echo "${PRE} d_y subfile not uninstalled" && exit 1
[ -e "${tmpd}"/t ] && echo "${PRE} f_t not uninstalled" && exit 1
[ -e "${tmpd}"/z/t1 ] && echo "${PRE} d_z subfile t1 not uninstalled" && exit 1
[ -e "${tmpd}"/z/t2 ] && echo "${PRE} d_z subfile t2 not uninstalled" && exit 1
[ -e "${tmpd}"/z/file ] && echo "${PRE} d_z subfile file not uninstalled" && exit 1
[ -e "${tmpd}"/trans ] && echo "${PRE} f_trans file not uninstalled" && exit 1
# test workdir is empty
if [ -n "$(ls -A "${tmpw}")" ]; then
echo "${PRE} workdir (1) is not empty"
echo "---"
ls -A "${tmpw}"
echo "---"
exit 1
fi
#########################
## with original
#########################
# clean
clean_hierarchy "${tmpd}"
clean_hierarchy "${basedir}"/dotfiles
# recreate
create_hierarchy "${basedir}"/dotfiles "modified"
create_hierarchy "${tmpd}" "original"
# install
echo "[+] install (2)"
cd "${ddpath}" | ${bin} install -c "${cfg}" -f -p p1 | grep '^4 dotfile(s) installed.$'
# tests
[ ! -e "${tmpd}"/x ] && echo "${PRE} f_x not installed" && exit 1
[ ! -e "${tmpd}"/x.dotdropbak ] && echo "${PRE} f_x backup not created" && exit 1
[ ! -d "${tmpd}"/y ] && echo "${PRE} d_y not installed" && exit 1
[ ! -e "${tmpd}"/y/file ] && echo "${PRE} d_y file not installed" && exit 1
[ "${LINK_TYPE}" = "nolink" ] && [ ! -e "${tmpd}"/y/file.dotdropbak ] && echo "${PRE} d_y backup file not created" && exit 1
[ ! -e "${tmpd}"/y/subdir/subfile ] && echo "${PRE} d_y subfile not installed" && exit 1
[ "${LINK_TYPE}" = "nolink" ] && [ ! -e "${tmpd}"/y/subdir/subfile.dotdropbak ] && echo "${PRE} d_y subfile backup not created" && exit 1
[ ! -e "${tmpd}"/t ] && echo "${PRE} f_t not installed" && exit 1
[ ! -e "${tmpd}"/t.dotdropbak ] && echo "${PRE} f_t backup not created" && exit 1
[ ! -e "${tmpd}"/z/t1 ] && echo "${PRE} d_z t1 not installed" && exit 1
[ "${LINK_TYPE}" = "nolink" ] && [ ! -e "${tmpd}"/z/t1.dotdropbak ] && echo "${PRE} d_z t1 backup not created" && exit 1
[ ! -e "${tmpd}"/z/t2 ] && echo "${PRE} d_z t2 not installed" && exit 1
[ "${LINK_TYPE}" = "nolink" ] && [ ! -e "${tmpd}"/z/t2.dotdropbak ] && echo "${PRE} d_z t2 backup not created" && exit 1
[ ! -e "${tmpd}"/z/file ] && echo "${PRE} d_z file not installed" && exit 1
[ "${LINK_TYPE}" = "nolink" ] && [ ! -e "${tmpd}"/z/file.dotdropbak ] && echo "${PRE} d_z backup file not created" && exit 1
[ ! -e "${tmpd}"/trans ] && echo "${PRE} f_trans file not installed" && exit 1
[ ! -e "${tmpd}"/trans.dotdropbak ] && echo "${PRE} f_trans backup file not created" && exit 1
grep_or_fail 'modified' "${tmpd}"/x
grep_or_fail 'modified' "${tmpd}"/y/file
grep_or_fail 'profile: p1' "${tmpd}"/t
grep_or_fail 'profile t1: p1' "${tmpd}"/z/t1
grep_or_fail 'profile t2: p1' "${tmpd}"/z/t2
grep_or_fail 'modified' "${tmpd}"/z/file
grep_or_fail 'trans:p1' "${tmpd}"/trans
# uninstall
echo "[+] uninstall (2)"
( \
cd "${ddpath}" && ${bin} uninstall -c "${cfg}" -f -p p1 "${DT_ARG}" \
)
[ "$?" != "0" ] && exit 1
# tests
[ ! -d "${basedir}"/dotfiles ] && echo "${PRE} dotpath removed" && exit 1
[ ! -e "${tmpd}"/x ] && echo "${PRE} f_x backup not restored" && exit 1
[ -e "${tmpd}"/x.dotdropbak ] && echo "${PRE} f_x backup not removed" && exit 1
[ "${LINK_TYPE}" = "nolink" ] && [ ! -d "${tmpd}"/y ] && echo "${PRE} d_y backup not restored" && exit 1
[ "${LINK_TYPE}" = "nolink" ] && [ ! -e "${tmpd}"/y/file ] && echo "${PRE} d_y file backup not restored" && exit 1
[ -e "${tmpd}"/y/file.dotdropbak ] && echo "${PRE} d_y backup not removed" && exit 1
[ "${LINK_TYPE}" = "nolink" ] && [ ! -e "${tmpd}"/y/subdir/subfile ] && echo "${PRE} d_y sub backup not restored" && exit 1
[ -e "${tmpd}"/y/subdir/subfile.dotdropbak ] && echo "${PRE} d_y sub backup not removed" && exit 1
[ ! -e "${tmpd}"/t ] && echo "${PRE} f_t not restored" && exit 1
[ -e "${tmpd}"/t.dotdropbak ] && echo "${PRE} f_t backup not removed" && exit 1
[ "${LINK_TYPE}" = "nolink" ] && [ ! -e "${tmpd}"/z/t1 ] && echo "${PRE} d_z t1 not restore" && exit 1
[ -e "${tmpd}"/z/t1.dotdropbak ] && echo "${PRE} d_z t1 backup not removed" && exit 1
[ "${LINK_TYPE}" = "nolink" ] && [ ! -e "${tmpd}"/z/t2 ] && echo "${PRE} d_z t2 not restored" && exit 1
[ -e "${tmpd}"/z/t2.dotdropbak ] && echo "${PRE} d_z t2 backup not removed" && exit 1
[ "${LINK_TYPE}" = "nolink" ] && [ ! -e "${tmpd}"/z/file ] && echo "${PRE} d_z file not restored" && exit 1
[ -e "${tmpd}"/z/file.dotdropbak ] && echo "${PRE} d_z file backup not removed" && exit 1
[ ! -e "${tmpd}"/trans ] && echo "${PRE} f_trans backup not restored" && exit 1
[ -e "${tmpd}"/trans.dotdropbak ] && echo "${PRE} f_trans backup not removed" && exit 1
grep_or_fail 'original' "${tmpd}"/x
[ "${LINK_TYPE}" = "nolink" ] && grep_or_fail 'original' "${tmpd}"/y/file
grep_or_fail "profile: ${PRO_TEMPL}" "${tmpd}/t"
[ "${LINK_TYPE}" = "nolink" ] && grep_or_fail "profile t1: ${PRO_TEMPL}" "${tmpd}/z/t1"
[ "${LINK_TYPE}" = "nolink" ] && grep_or_fail "profile t2: ${PRO_TEMPL}" "${tmpd}/z/t2"
[ "${LINK_TYPE}" = "nolink" ] && grep_or_fail 'original' "${tmpd}"/z/file
grep_or_fail "trans:${PRO_TEMPL}" "${tmpd}"/trans
echo "testing workdir..."
# test workdir is empty
if [ -n "$(ls -A "${tmpw}")" ]; then
echo "${PRE} workdir (2) - ${tmpw} - is not empty"
ls -r "${tmpw}"
exit 1
fi
echo "${PRE} done OK"
}
export DOTDROP_TEST_NG_UNINSTALL_DDPATH="${ddpath}"
export DOTDROP_TEST_NG_UNINSTALL_BIN="${bin}"
export DOTDROP_TEST_NG_CUR="${cur}"
export DOTDROP_TEST_NG_UNINSTALL_LINK_TYPE="nolink"
# shellcheck source=uninstall_
echo "[+] testing uninstall link:${DOTDROP_TEST_NG_UNINSTALL_LINK_TYPE}..."
if ! uninstall_with_link; then exit 1; fi
echo "[+] uninstall link:${DOTDROP_TEST_NG_UNINSTALL_LINK_TYPE} OK"
export DOTDROP_TEST_NG_UNINSTALL_LINK_TYPE="absolute"
# shellcheck source=uninstall_
echo "[+] testing uninstall link:${DOTDROP_TEST_NG_UNINSTALL_LINK_TYPE}..."
if ! uninstall_with_link; then exit 1; fi
echo "[+] uninstall link:${DOTDROP_TEST_NG_UNINSTALL_LINK_TYPE} OK"
export DOTDROP_TEST_NG_UNINSTALL_LINK_TYPE="relative"
# shellcheck source=uninstall_
echo "[+] testing uninstall link:${DOTDROP_TEST_NG_UNINSTALL_LINK_TYPE}..."
if ! uninstall_with_link; then exit 1; fi
echo "[+] uninstall link:${DOTDROP_TEST_NG_UNINSTALL_LINK_TYPE} OK"
export DOTDROP_TEST_NG_UNINSTALL_LINK_TYPE="link_children"
# shellcheck source=uninstall_
echo "[+] testing uninstall link:${DOTDROP_TEST_NG_UNINSTALL_LINK_TYPE}..."
if ! uninstall_with_link; then exit 1; fi
echo "[+] uninstall link:${DOTDROP_TEST_NG_UNINSTALL_LINK_TYPE} OK"
echo "OK"
exit 0


@@ -149,6 +149,7 @@ def _fake_args():
     args['profiles'] = False
     args['files'] = False
     args['install'] = False
+    args['uninstall'] = False
     args['compare'] = False
     args['import'] = False
     args['update'] = False
@@ -247,7 +248,7 @@ def create_yaml_keyval(pairs, parent_dir=None, top_key=None):
 # pylint: disable=W0102
 def populate_fake_config(config, dotfiles={}, profiles={}, actions={},
-                         trans={}, trans_write={}, variables={},
+                         trans_install={}, trans_update={}, variables={},
                          dynvariables={}):
     """Adds some juicy content to config files"""
     is_path = isinstance(config, str)
@@ -258,8 +259,8 @@ def populate_fake_config(config, dotfiles={}, profiles={}, actions={},
     config['dotfiles'] = dotfiles
     config['profiles'] = profiles
     config['actions'] = actions
-    config['trans_read'] = trans
-    config['trans_write'] = trans_write
+    config['trans_install'] = trans_install
+    config['trans_update'] = trans_update
     config['variables'] = variables
     config['dynvariables'] = dynvariables


@@ -239,10 +239,10 @@ class TestImport(unittest.TestCase):
                 },
                 'a_log_ed': 'echo 2',
             },
-            'trans': {
+            'trans_install': {
                 't_log_ed': 'echo 3',
             },
-            'trans_write': {
+            'trans_update': {
                 'tw_log_ed': 'echo 4',
             },
             'variables': {
@@ -273,10 +273,10 @@ class TestImport(unittest.TestCase):
                 },
                 'a_log_ing': 'echo a',
             },
-            'trans': {
+            'trans_install': {
                 't_log_ing': 'echo b',
             },
-            'trans_write': {
+            'trans_update': {
                 'tw_log_ing': 'echo c',
             },
             'variables': {
@@ -352,10 +352,10 @@ class TestImport(unittest.TestCase):
         self.assertFalse(any(a.endswith('ing') for a in actions))

         # testing transformations
-        transformations = ycont['trans_read'].keys()
+        transformations = ycont['trans_install'].keys()
         self.assertTrue(all(t.endswith('ed') for t in transformations))
         self.assertFalse(any(t.endswith('ing') for t in transformations))
-        transformations = ycont['trans_write'].keys()
+        transformations = ycont['trans_update'].keys()
         self.assertTrue(all(t.endswith('ed') for t in transformations))
         self.assertFalse(any(t.endswith('ing') for t in transformations))
@@ -394,10 +394,10 @@ class TestImport(unittest.TestCase):
         self.assertFalse(any(action.endswith('ed') for action in actions))

         # testing transformations
-        transformations = ycont['trans_read'].keys()
+        transformations = ycont['trans_install'].keys()
         self.assertTrue(all(t.endswith('ing') for t in transformations))
         self.assertFalse(any(t.endswith('ed') for t in transformations))
-        transformations = ycont['trans_write'].keys()
+        transformations = ycont['trans_update'].keys()
         self.assertTrue(all(t.endswith('ing') for t in transformations))
         self.assertFalse(any(t.endswith('ed') for t in transformations))


@@ -28,7 +28,7 @@ def fake_config(path, dotfiles, profile,
         file.write('actions:\n')
         for action in actions:
             file.write(f' {action.key}: {action.action}\n')
-        file.write('trans:\n')
+        file.write('trans_install:\n')
         for trans in transs:
             file.write(f' {trans.key}: {trans.action}\n')
         file.write('config:\n')
@@ -46,9 +46,9 @@ def fake_config(path, dotfiles, profile,
             file.write(' actions:\n')
             for action in dotfile.actions:
                 file.write(f' - {action.key}\n')
-            if dotfile.trans_r:
-                for trans in dotfile.trans_r:
-                    file.write(f' trans_read: {trans.key}\n')
+            if dotfile.trans_install:
+                for trans in dotfile.trans_install:
+                    file.write(f' trans_install: {trans.key}\n')
         file.write('profiles:\n')
         file.write(f' {profile}:\n')
         file.write(' dotfiles:\n')
@@ -174,7 +174,7 @@ exec bspwm
     fcontent9, _ = create_random_file(tmp, content=trans1)
     dst9 = os.path.join(dst, get_string(6))
     dotfile9 = Dotfile(get_string(6), dst9, os.path.basename(fcontent9),
-                       trans_r=[the_trans])
+                       trans_install=[the_trans])
     # to test template
     f10, _ = create_random_file(tmp, content='{{@@ header() @@}}')


@@ -127,7 +127,7 @@ class TestImporter(unittest.TestCase):
         path, _ = create_random_file(tmpdir)
         imp = Importer('profile', None, '', '', {})
-        self.assertEqual(imp._apply_trans_w(path, trans), None)
+        self.assertEqual(imp._apply_trans_update(path, trans), None)


 class TestActions(unittest.TestCase):


@@ -121,7 +121,7 @@ class TestUpdate(unittest.TestCase):
         # retrieve the path of the sub in the dotpath
         d1indotpath = os.path.join(opt.dotpath, dotfile.src)
         d1indotpath = os.path.expanduser(d1indotpath)
-        dotfile.trans_w = trans
+        dotfile.trans_update = trans

         # update template
         opt.update_path = [d3t]


@@ -298,10 +298,10 @@ profiles:
                 },
                 'a_log_ed': 'echo 2',
             },
-            'trans': {
+            'trans_install': {
                 't_log_ed': 'echo 3',
             },
-            'trans_write': {
+            'trans_update': {
                 'tw_log_ed': 'echo 4',
             },
             'variables': {
@@ -335,10 +335,10 @@ profiles:
                 },
                 'a_log_ing': 'echo a',
             },
-            'trans': {
+            'trans_install': {
                 't_log_ing': 'echo b',
             },
-            'trans_write': {
+            'trans_update': {
                 'tw_log_ing': 'echo c',
             },
             'variables': {
@@ -406,8 +406,8 @@ profiles:
         self.assert_is_subset(post_ed, post_ing)

         # test transactions
-        self.assert_is_subset(imported_cfg.trans_r, importing_cfg.trans_r)
-        self.assert_is_subset(imported_cfg.trans_w, importing_cfg.trans_w)
+        self.assert_is_subset(imported_cfg.trans_install, importing_cfg.trans_install)
+        self.assert_is_subset(imported_cfg.trans_update, importing_cfg.trans_update)

         # test variables
         imported_vars = {
@@ -504,10 +504,10 @@ profiles:
                 },
                 'a_log': 'echo 2',
             },
-            'trans': {
+            'trans_install': {
                 't_log': 'echo 3',
             },
-            'trans_write': {
+            'trans_update': {
                 'tw_log': 'echo 4',
             },
             'variables': {
@@ -542,10 +542,10 @@ profiles:
                 },
                 'a_log': 'echo a',
             },
-            'trans': {
+            'trans_install': {
                 't_log': 'echo b',
             },
-            'trans_write': {
+            'trans_update': {
                 'tw_log': 'echo c',
             },
             'variables': {
@@ -605,12 +605,12 @@ profiles:
         # test transactions
         self.assertFalse(any(
-            imported_cfg.trans_r[key] == importing_cfg.trans_r[key]
-            for key in imported_cfg.trans_r
+            imported_cfg.trans_install[key] == importing_cfg.trans_install[key]
+            for key in imported_cfg.trans_install
         ))
         self.assertFalse(any(
-            imported_cfg.trans_w[key] == importing_cfg.trans_w[key]
-            for key in imported_cfg.trans_w
+            imported_cfg.trans_update[key] == importing_cfg.trans_update[key]
+            for key in imported_cfg.trans_update
         ))

         # test variables