forked from github-starred/komodo
1.17.0 (#248)
* resolver v3
add new ec2 instance types
clean up testing config
document the libraries a bit
clean up main
update sysinfo and otel
update client resolver 3.0
resolver v3 prog
clean up gitignore
implement periphery resolver v3
clean up
core read api v3
more prog
execute api
missing apis
compiling
1.16.13
work on more granular traits
prog on crud
* fmt
* format
* resource2 not really a benefit
* axum to 0.8
* bump aws deps
* just make it 1.17.0
* clean up cors
* the komodo env file should be highest priority over additional files
* add entities / message for test alerter
* test alert implementation
* rust 1.84.0
* axum update :param to {param} syntax
* fix last axum updates
* Add test alerter button
* higher quality / colored icons
* komodo-logo
* simplify network stats
* rename Test Alerter button
* escape incoming sync backslashes (BREAKING)
* clean up rust client websocket subscription
* finish oidc comment
* show update available stack table
* update available deployment table
* feature: use the repo path instead of name in GetLatestCommit (#282)
* Update repo path handling in commit fetching
- Changed `name` to `path` for repository identification.
- Updated cache update function to use the new path field.
- Improved error message for non-directory repo paths.
* feat: use optional name and path in GetLatestCommit
* review: don't use optional for name
* review: use helper
* review: remove redundant to_string()
* 1.17.0-dev
* feature: add post_deploy command (#288)
* feature: add post_deploy command
* review: do not run post_deploy if deploy failed
* feature: interpolate secrets in custom alerter (#289)
* feature: interpolate secrets in custom alerter
* fix rust warning
* review: sanitize errors
* review: sanitize error message
* Remove .git from remote_url (#299)
Remove .git from remote_url
Co-authored-by: Deon Marshall <dmarshall@ccp.com.au>
* mbecker20 -> moghtech
* remove example from cargo toml workspace
* dev-1
* fix login screen logo
* more legible favicon
* fix new compose images
* docs new organization
* typescript subscribe_to_update_websocket
* add donate button docsite
* add config save button in desktop sidebar navigator
* add save button to config bottom
* feature: allow docker image text to overflow in table (#301)
* feature: allow docker image text to overflow in table
* review: use break-words
* wip: revert line break in css file
* feature: update devcontainer node release
* improve First Login docs
* Fix PullStack re #302 and record docker compose config on stack deploy
* requery alerts more often
* improve update indicator style and also put on home screen
* Add all services stack log
* 1.17.0-dev-2
* fix API name change
* choose which stack services to include in logs
* feature: improve tables quick actions on mobile (#312)
* feature: improve tables quick actions on mobile
* review: fix gap4
* review: use flex-wrap
* improve pull to git init on existing folder without .git
* Fix unclear ComposePull log re #244
* use komodo_client.subscribe_to_update_websocket, and click indicator to reconnect
* dev-3
* ServerTemplate description
* improve WriteComposeContentsToHost instrument fields
* give server stat charts labels
* filters wrap
* show provider usernames from config file
* Stack: Fix git repo new compose file initialization
* init sync file new repo
* set branch on git init folder
* ResourceSync: pending view toggle between "Execute" vs "Commit" sync direction
* Improve resource sync Execute / Pending view selector
* standardize running commands with interpolation / output sanitizations
* fix all clippy lints
* fix rand
* lock certain users' username / password, prevent demo creds from being changed.
* revert to login screen whenever the call to check login fails
* ResourceSync state resolution refinement
* make sure parent directories exist whenever writing files
* don't prune images if server not enabled
* update most deps
* update openidconnect dependency, and use reqwest rustls-tls-native-roots
* dev-4
* resource sync only add escaping on toml between the """
* Stacks executions take list of services -- Auto update only redeploys services with update
* auto update all service deploy option
* dev-5 fix the stack service executions
* clean up service_args
* rust 1.85
* store sync edits on localstorage
* stack edits on localstorage and show last deployed config
* add yarn install to runfile
* Fix actions when core on https
* add update_available query parameter to filter for only stacks /deployments with available update
* rust 2024 and fmt
* rename test.compose.yaml to dev.compose.yaml, and update runfile
* update .devcontainer / dev docs for updated runfile
* use png in topbar logo, svg quality sometimes bad
* OIDC: Support PKCE auth (secret optional)
* update docs on OIDC and client secret
* cycle the oidc client on interval to ensure up to date JWKs
* add KOMODO_LOCK_LOGIN_CREDENTIALS_FOR in config doc
* update deps
* resource sync toggle resource / variable / user group inclusion independently
* use jsonwebtoken
* improve variable value table overflow
* colored tags
* fix sync summary count ok
* default new tag colors to grey
* soften tag opacity a bit
* Update config.tsx (#358)
* isolate stacks / deployments with pending updates
* update some deps
* use Tooltip component instead of HoverCard for mobile compatibility
* batch Build builds
* link to typescript client in the intro
* add link to main docs from client docs
* doc tweaks
* use moghtech/komodo-core and moghtech/komodo-periphery as images
* remove unnecessary explicit network
* periphery.compose.yaml
* clean up periphery compose
* add link to config
* update periphery container compose config
* rust 1.85.1
* update sync docs
* 1.17.0
---------
Co-authored-by: unsync <1211591+unsync@users.noreply.github.com>
Co-authored-by: Deon Marshall <dmarshall@ccp.com.au>
Co-authored-by: komodo <komodo@komo.do>
Co-authored-by: wlatic <jamesoh@gmail.com>
lib/cache/README.md (vendored, new file, 5 lines)
@@ -0,0 +1,5 @@
+# Cache module
+
+Contains a thread-safe async timeout cache implementation.
+Used to cache outputs in memory for a limited time period.
+Can be used to avoid re-running the underlying process which generated an output in too short a timeframe.
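The README above describes the cache's purpose. A minimal single-threaded sketch of the idea is below; this is illustrative only (the type name and method are assumptions, and the crate's real implementation is async and thread-safe):

```rust
use std::collections::HashMap;
use std::hash::Hash;
use std::time::{Duration, Instant};

// Minimal sketch of a timeout cache: an entry younger than the timeout
// is returned as-is; older entries are recomputed and re-stamped.
pub struct TimeoutCache<K, V> {
    timeout: Duration,
    entries: HashMap<K, (V, Instant)>,
}

impl<K: Eq + Hash, V: Clone> TimeoutCache<K, V> {
    pub fn new(timeout: Duration) -> Self {
        Self { timeout, entries: HashMap::new() }
    }

    /// Return the cached value if it is younger than the timeout,
    /// otherwise run `compute` again and refresh the timestamp.
    pub fn get_or_insert_with(&mut self, key: K, compute: impl FnOnce() -> V) -> V {
        let now = Instant::now();
        match self.entries.get(&key) {
            Some((v, ts)) if now.duration_since(*ts) < self.timeout => v.clone(),
            _ => {
                let v = compute();
                self.entries.insert(key, (v.clone(), now));
                v
            }
        }
    }
}
```

This avoids re-running the underlying process when the same output was produced within the timeout window.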
lib/cache/src/lib.rs (vendored, 11 changed lines)
@@ -37,19 +37,12 @@ impl<Res: Default> Default for CacheEntry<Res> {
 
 impl<Res: Clone> CacheEntry<Res> {
   pub fn set(&mut self, res: &anyhow::Result<Res>, timestamp: i64) {
-    self.res = res
-      .as_ref()
-      .map(|res| res.clone())
-      .map_err(clone_anyhow_error);
+    self.res = res.as_ref().map_err(clone_anyhow_error).cloned();
     self.last_ts = timestamp;
   }
 
   pub fn clone_res(&self) -> anyhow::Result<Res> {
-    self
-      .res
-      .as_ref()
-      .map(|res| res.clone())
-      .map_err(clone_anyhow_error)
+    self.res.as_ref().map_err(clone_anyhow_error).cloned()
   }
 }
@@ -9,4 +9,7 @@ homepage.workspace = true
 
 [dependencies]
 komodo_client.workspace = true
 run_command.workspace = true
+formatting.workspace = true
+anyhow.workspace = true
+svi.workspace = true
lib/command/README.md (new file, 3 lines)
@@ -0,0 +1,3 @@
+# Command module
+
+Helpers to run shell commands as child processes, and collect the outputs.
@@ -1,36 +1,96 @@
-use std::path::Path;
+use std::{collections::HashMap, path::Path};
 
+use anyhow::Context;
+use formatting::format_serror;
 use komodo_client::{
   entities::{komodo_timestamp, update::Log},
   parsers::parse_multiline_command,
 };
-use run_command::{async_run_command, CommandOutput};
+use run_command::{CommandOutput, async_run_command};
+use svi::Interpolator;
 
-/// If `parse_multiline: true`, parses commands out of multiline string
-/// and chains them together with '&&'.
-/// Supports full line and end of line comments.
-/// See [parse_multiline_command].
 pub async fn run_komodo_command(
   stage: &str,
   path: impl Into<Option<&Path>>,
   command: impl AsRef<str>,
-  parse_multiline: bool,
 ) -> Log {
-  let command = if parse_multiline {
-    parse_multiline_command(command)
+  let command = if let Some(path) = path.into() {
+    format!("cd {} && {}", path.display(), command.as_ref())
   } else {
     command.as_ref().to_string()
   };
-  let command = if let Some(path) = path.into() {
-    format!("cd {} && {command}", path.display(),)
-  } else {
-    command
-  };
   let start_ts = komodo_timestamp();
   let output = async_run_command(&command).await;
   output_into_log(stage, command, start_ts, output)
 }
 
+/// Parses commands out of multiline string
+/// and chains them together with '&&'.
+/// Supports full line and end of line comments.
+/// See [parse_multiline_command].
+///
+/// The result may be None if the command is empty after parsing,
+/// ie if all the lines are commented out.
+pub async fn run_komodo_command_multiline(
+  stage: &str,
+  path: impl Into<Option<&Path>>,
+  command: impl AsRef<str>,
+) -> Option<Log> {
+  let command = parse_multiline_command(command);
+  if command.is_empty() {
+    return None;
+  }
+  Some(run_komodo_command(stage, path, command).await)
+}
+
+/// Interpolates provided secrets into (potentially multiline) command,
+/// executes the command, and sanitizes the output to avoid exposing the secrets.
+///
+/// Checks to make sure the command is non-empty after being multiline-parsed.
+///
+/// If `parse_multiline: true`, parses commands out of multiline string
+/// and chains them together with '&&'.
+/// Supports full line and end of line comments.
+/// See [parse_multiline_command].
+pub async fn run_komodo_command_with_interpolation(
+  stage: &str,
+  path: impl Into<Option<&Path>>,
+  command: impl AsRef<str>,
+  parse_multiline: bool,
+  secrets: &HashMap<String, String>,
+  additional_replacers: &[(String, String)],
+) -> Option<Log> {
+  let (command, mut replacers) = match svi::interpolate_variables(
+    command.as_ref(),
+    secrets,
+    Interpolator::DoubleBrackets,
+    true,
+  )
+  .context("Failed to interpolate secrets")
+  {
+    Ok(res) => res,
+    Err(e) => {
+      return Some(Log::error(
+        &format!("{stage} - Interpolate Secrets"),
+        format_serror(&e.into()),
+      ));
+    }
+  };
+  let mut log = if parse_multiline {
+    run_komodo_command_multiline(stage, path, command).await
+  } else {
+    run_komodo_command(stage, path, command).await.into()
+  }?;
+
+  // Sanitize the command and output
+  replacers.extend_from_slice(additional_replacers);
+  log.command = svi::replace_in_string(&log.command, &replacers);
+  log.stdout = svi::replace_in_string(&log.stdout, &replacers);
+  log.stderr = svi::replace_in_string(&log.stderr, &replacers);
+
+  Some(log)
+}
+
 pub fn output_into_log(
   stage: &str,
   command: String,
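The interpolation helper above sanitizes the finished log by running the replacers back over the command and its output. A hedged sketch of that sanitization pass, mimicking `svi::replace_in_string` with plain std string replacement (the placeholder format here is an assumption, not svi's real output):

```rust
/// Each replacer pairs a secret value with the variable name it was
/// interpolated from, so command/stdout/stderr never expose the value.
pub fn replace_in_string(s: &str, replacers: &[(String, String)]) -> String {
    replacers.iter().fold(s.to_string(), |acc, (value, variable)| {
        // Swap the raw secret back out for a placeholder before logging.
        acc.replace(value, &format!("[[{variable}]]"))
    })
}
```

Applying the same replacers to `log.command`, `log.stdout`, and `log.stderr` keeps secrets out of all three fields.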
lib/environment_file/README.md (new file, 8 lines)
@@ -0,0 +1,8 @@
+# Environment file module
+
+Helpers for parsing variables from file contents.
+
+Used to parse secrets from the files specified in env variable ending in `_FILE`.
+
+Compatible with docker compose secrets,
+see [https://docs.docker.com/compose/how-tos/use-secrets/](https://docs.docker.com/compose/how-tos/use-secrets/).
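The README describes parsing variables out of file contents. A minimal sketch of `KEY=VALUE` parsing in that spirit (the function name and shape are illustrative, not the module's real API):

```rust
/// Parse KEY=VALUE pairs from env-file style contents,
/// skipping blank lines and `#` comments.
pub fn parse_env(contents: &str) -> Vec<(String, String)> {
    contents
        .lines()
        .filter_map(|line| {
            let line = line.trim();
            if line.is_empty() || line.starts_with('#') {
                return None; // skip blanks and comments
            }
            // Split on the first '=' only, so values may contain '='.
            let (key, value) = line.split_once('=')?;
            Some((key.trim().to_string(), value.trim().to_string()))
        })
        .collect()
}
```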
lib/formatting/README.md (new file, 3 lines)
@@ -0,0 +1,3 @@
+# Formatting module
+
+Used to pretty-format logs using HTML for better display from the UI.
lib/git/README.md (new file, 3 lines)
@@ -0,0 +1,3 @@
+# Git module
+
+Helpers for cloning, pulling, and committing to git repos.
@@ -1,15 +1,17 @@
 use std::{collections::HashMap, path::Path};
 
 use anyhow::Context;
-use command::run_komodo_command;
+use command::{
+  run_komodo_command, run_komodo_command_multiline,
+  run_komodo_command_with_interpolation,
+};
 use formatting::format_serror;
 use komodo_client::entities::{
-  all_logs_success, komodo_timestamp, update::Log, CloneArgs,
-  EnvironmentVar,
+  CloneArgs, EnvironmentVar, all_logs_success, komodo_timestamp,
+  update::Log,
 };
 use run_command::async_run_command;
 
-use crate::{get_commit_hash_log, GitRes};
+use crate::{GitRes, get_commit_hash_log};
 
 /// Will delete the existing repo folder,
 /// clone the repo, get the latest hash / message,
@@ -72,23 +74,24 @@ where
     }
     Err(e) => {
       logs.push(Log::simple(
-        "latest commit",
+        "Latest Commit",
         format_serror(
-          &e.context("failed to get latest commit").into(),
+          &e.context("Failed to get latest commit").into(),
         ),
       ));
       (None, None)
     }
   };
 
-  let Ok(env_file_path) = crate::environment::write_file(
-    environment,
-    env_file_path,
-    secrets,
-    &repo_dir,
-    &mut logs,
-  )
-  .await
+  let Ok((env_file_path, _replacers)) =
+    crate::environment::write_file(
+      environment,
+      env_file_path,
+      secrets,
+      &repo_dir,
+      &mut logs,
+    )
+    .await
   else {
     return Ok(GitRes {
       logs,
@@ -99,112 +102,50 @@ where
   };
 
   if let Some(command) = args.on_clone {
     if !command.command.is_empty() {
-      let on_clone_path = repo_dir.join(&command.path);
-      if let Some(secrets) = secrets {
-        let (full_command, mut replacers) =
-          svi::interpolate_variables(
-            &command.command,
-            secrets,
-            svi::Interpolator::DoubleBrackets,
-            true,
-          )
-          .context(
-            "failed to interpolate secrets into on_clone command",
-          )?;
-        replacers.extend(core_replacers.to_owned());
-        let mut on_clone_log = run_komodo_command(
-          "on clone",
-          on_clone_path.as_ref(),
-          full_command,
-          true,
-        )
-        .await;
-
-        on_clone_log.command =
-          svi::replace_in_string(&on_clone_log.command, &replacers);
-        on_clone_log.stdout =
-          svi::replace_in_string(&on_clone_log.stdout, &replacers);
-        on_clone_log.stderr =
-          svi::replace_in_string(&on_clone_log.stderr, &replacers);
-
-        tracing::debug!(
-          "run repo on_clone command | command: {} | cwd: {:?}",
-          on_clone_log.command,
-          on_clone_path
-        );
-
-        logs.push(on_clone_log);
-      } else {
-        let on_clone_log = run_komodo_command(
-          "on clone",
-          on_clone_path.as_ref(),
-          &command.command,
-          true,
-        )
-        .await;
-        tracing::debug!(
-          "run repo on_clone command | command: {} | cwd: {:?}",
-          command.command,
-          on_clone_path
-        );
-        logs.push(on_clone_log);
-      }
+      let on_clone_path = repo_dir.join(&command.path);
+      if let Some(log) = if let Some(secrets) = secrets {
+        run_komodo_command_with_interpolation(
+          "On Clone",
+          Some(on_clone_path.as_path()),
+          &command.command,
+          true,
+          secrets,
+          core_replacers,
+        )
+        .await
+      } else {
+        run_komodo_command_multiline(
+          "On Clone",
+          Some(on_clone_path.as_path()),
+          &command.command,
+        )
+        .await
+      } {
+        logs.push(log)
+      };
     }
   }
   if let Some(command) = args.on_pull {
     if !command.command.is_empty() {
-      let on_pull_path = repo_dir.join(&command.path);
-      if let Some(secrets) = secrets {
-        let (full_command, mut replacers) =
-          svi::interpolate_variables(
-            &command.command,
-            secrets,
-            svi::Interpolator::DoubleBrackets,
-            true,
-          )
-          .context(
-            "failed to interpolate secrets into on_pull command",
-          )?;
-        replacers.extend(core_replacers.to_owned());
-        let mut on_pull_log = run_komodo_command(
-          "on pull",
-          on_pull_path.as_ref(),
-          &full_command,
-          true,
-        )
-        .await;
-
-        on_pull_log.command =
-          svi::replace_in_string(&on_pull_log.command, &replacers);
-        on_pull_log.stdout =
-          svi::replace_in_string(&on_pull_log.stdout, &replacers);
-        on_pull_log.stderr =
-          svi::replace_in_string(&on_pull_log.stderr, &replacers);
-
-        tracing::debug!(
-          "run repo on_pull command | command: {} | cwd: {:?}",
-          on_pull_log.command,
-          on_pull_path
-        );
-
-        logs.push(on_pull_log);
-      } else {
-        let on_pull_log = run_komodo_command(
-          "on pull",
-          on_pull_path.as_ref(),
-          &command.command,
-          true,
-        )
-        .await;
-        tracing::debug!(
-          "run repo on_pull command | command: {} | cwd: {:?}",
-          command.command,
-          on_pull_path
-        );
-        logs.push(on_pull_log);
-      }
+      let on_pull_path = repo_dir.join(&command.path);
+      if let Some(log) = if let Some(secrets) = secrets {
+        run_komodo_command_with_interpolation(
+          "On Pull",
+          Some(on_pull_path.as_path()),
+          &command.command,
+          true,
+          secrets,
+          core_replacers,
+        )
+        .await
+      } else {
+        run_komodo_command_multiline(
+          "On Pull",
+          Some(on_pull_path.as_path()),
+          &command.command,
+        )
+        .await
+      } {
+        logs.push(log)
+      };
     }
   }
 
   Ok(GitRes {
@@ -257,7 +198,6 @@ async fn clone_inner(
     "set commit",
     destination,
     format!("git reset --hard {commit}",),
-    false,
   )
   .await;
   logs.push(reset_log);
@@ -7,7 +7,7 @@ use komodo_client::entities::{all_logs_success, update::Log};
 use run_command::async_run_command;
 use tokio::fs;
 
-use crate::{get_commit_hash_log, GitRes};
+use crate::{GitRes, get_commit_hash_log};
 
 /// Write file, add, commit, force push.
 /// Repo must be cloned.
@@ -17,12 +17,15 @@ pub async fn write_commit_file(
   // relative to repo root
   file: &Path,
   contents: &str,
+  branch: &str,
 ) -> anyhow::Result<GitRes> {
   // Clean up the path by stripping any redundant `/./`
   let path = repo_dir.join(file).components().collect::<PathBuf>();
 
   if let Some(parent) = path.parent() {
-    let _ = fs::create_dir_all(&parent).await;
+    fs::create_dir_all(parent).await.with_context(|| {
+      format!("Failed to initialize file parent directory {parent:?}")
+    })?;
   }
 
   fs::write(&path, contents).await.with_context(|| {
@@ -35,7 +38,8 @@ pub async fn write_commit_file(
     format!("File contents written to {path:?}"),
   ));
 
-  commit_file_inner(commit_msg, &mut res, repo_dir, file).await;
+  commit_file_inner(commit_msg, &mut res, repo_dir, file, branch)
+    .await;
 
   Ok(res)
 }
@@ -47,9 +51,11 @@ pub async fn commit_file(
   repo_dir: &Path,
   // relative to repo root
   file: &Path,
+  branch: &str,
 ) -> GitRes {
   let mut res = GitRes::default();
-  commit_file_inner(commit_msg, &mut res, repo_dir, file).await;
+  commit_file_inner(commit_msg, &mut res, repo_dir, file, branch)
+    .await;
   res
 }
@@ -59,14 +65,14 @@ pub async fn commit_file_inner(
   repo_dir: &Path,
   // relative to repo root
   file: &Path,
+  branch: &str,
 ) {
   ensure_global_git_config_set().await;
 
   let add_log = run_komodo_command(
-    "add files",
+    "Add Files",
     repo_dir,
     format!("git add {}", file.display()),
-    false,
   )
   .await;
   res.logs.push(add_log);
@@ -75,17 +81,22 @@ pub async fn commit_file_inner(
   }
 
   let commit_log = run_komodo_command(
-    "commit",
+    "Commit",
     repo_dir,
     format!(
       "git commit -m \"[Komodo] {commit_msg}: update {file:?}\"",
     ),
-    false,
   )
   .await;
-  res.logs.push(commit_log);
-  if !all_logs_success(&res.logs) {
-    return;
+
+  if !commit_log.success {
+    // The user may have nothing to commit, but still should continue push the changes
+    if !commit_log.stdout.contains("nothing to commit") {
+      res.logs.push(commit_log);
+      return;
+    }
+  } else {
+    res.logs.push(commit_log);
   }
 
   match get_commit_hash_log(repo_dir).await {
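The new commit handling above tolerates a failed `git commit` when the only problem is a clean working tree, so the push still proceeds. That decision can be sketched as a small predicate (the helper name is illustrative, not part of the commit):

```rust
/// Decide whether to continue to the push step after `git commit`.
pub fn should_continue_to_push(commit_succeeded: bool, stdout: &str) -> bool {
    if commit_succeeded {
        return true;
    }
    // `git commit` exits non-zero and prints "nothing to commit"
    // when there are no staged changes; that case is not a real failure.
    stdout.contains("nothing to commit")
}
```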
@@ -96,38 +107,44 @@ pub async fn commit_file_inner(
     }
     Err(e) => {
       res.logs.push(Log::error(
-        "get commit hash",
+        "Get commit hash",
         format_serror(&e.into()),
       ));
       return;
     }
   };
 
-  let push_log =
-    run_komodo_command("push", repo_dir, "git push -f", false).await;
+  let push_log = run_komodo_command(
+    "Push",
+    repo_dir,
+    format!("git push -f --set-upstream origin {branch}"),
+  )
+  .await;
   res.logs.push(push_log);
 }
 
 /// Add, commit, and force push.
 /// Repo must be cloned.
-pub async fn commit_all(repo_dir: &Path, message: &str) -> GitRes {
+pub async fn commit_all(
+  repo_dir: &Path,
+  message: &str,
+  branch: &str,
+) -> GitRes {
   ensure_global_git_config_set().await;
 
   let mut res = GitRes::default();
 
   let add_log =
-    run_komodo_command("add files", repo_dir, "git add -A", false)
-      .await;
+    run_komodo_command("Add Files", repo_dir, "git add -A").await;
   res.logs.push(add_log);
   if !all_logs_success(&res.logs) {
     return res;
   }
 
   let commit_log = run_komodo_command(
-    "commit",
+    "Commit",
     repo_dir,
     format!("git commit -m \"[Komodo] {message}\""),
-    false,
   )
   .await;
   res.logs.push(commit_log);
@@ -143,15 +160,19 @@ pub async fn commit_all(repo_dir: &Path, message: &str) -> GitRes {
     }
     Err(e) => {
       res.logs.push(Log::error(
-        "get commit hash",
+        "Get commit hash",
         format_serror(&e.into()),
       ));
       return res;
     }
   };
 
-  let push_log =
-    run_komodo_command("push", repo_dir, "git push -f", false).await;
+  let push_log = run_komodo_command(
+    "Push",
+    repo_dir,
+    format!("git push -f --set-upstream origin {branch}"),
+  )
+  .await;
   res.logs.push(push_log);
 
   res
@@ -5,17 +5,114 @@ use std::{
 
 use anyhow::Context;
 use formatting::format_serror;
-use komodo_client::entities::{update::Log, EnvironmentVar};
+use komodo_client::entities::{EnvironmentVar, update::Log};
 
 /// If the environment was written and needs to be passed to the compose command,
-/// will return the env file PathBuf
+/// will return the env file PathBuf.
+/// If variables were interpolated, will also return the sanitizing replacers.
 pub async fn write_file(
   environment: &[EnvironmentVar],
   env_file_path: &str,
   secrets: Option<&HashMap<String, String>>,
   folder: &Path,
   logs: &mut Vec<Log>,
-) -> Result<Option<PathBuf>, ()> {
+) -> Result<(Option<PathBuf>, Option<Vec<(String, String)>>), ()> {
+  let env_file_path = folder.join(env_file_path);
+
+  if environment.is_empty() {
+    // Still want to return Some(env_file_path) if the path
+    // already exists on the host and is a file.
+    // This is for "Files on Server" mode when user writes the env file themself.
+    if env_file_path.is_file() {
+      return Ok((Some(env_file_path), None));
+    }
+    return Ok((None, None));
+  }
+
+  let contents = environment
+    .iter()
+    .map(|env| format!("{}={}", env.variable, env.value))
+    .collect::<Vec<_>>()
+    .join("\n");
+
+  let (contents, replacers) = if let Some(secrets) = secrets {
+    let res = svi::interpolate_variables(
+      &contents,
+      secrets,
+      svi::Interpolator::DoubleBrackets,
+      true,
+    )
+    .context("failed to interpolate secrets into environment");
+
+    let (contents, replacers) = match res {
+      Ok(res) => res,
+      Err(e) => {
+        logs.push(Log::error(
+          "Interpolate - Environment",
+          format_serror(&e.into()),
+        ));
+        return Err(());
+      }
+    };
+
+    if !replacers.is_empty() {
+      logs.push(Log::simple(
+        "Interpolate - Environment",
+        replacers
+          .iter()
+          .map(|(_, variable)| format!("<span class=\"text-muted-foreground\">replaced:</span> {variable}"))
+          .collect::<Vec<_>>()
+          .join("\n"),
+      ))
+    }
+
+    (contents, Some(replacers))
+  } else {
+    (contents, None)
+  };
+
+  if let Some(parent) = env_file_path.parent() {
+    if let Err(e) = tokio::fs::create_dir_all(parent)
+      .await
+      .with_context(|| format!("Failed to initialize environment file parent directory {parent:?}"))
+    {
+      logs.push(Log::error(
+        "Write Environment File",
+        format_serror(&e.into()),
+      ));
+      return Err(());
+    }
+  }
+
+  if let Err(e) = tokio::fs::write(&env_file_path, contents)
+    .await
+    .with_context(|| {
+      format!("Failed to write environment file to {env_file_path:?}")
+    })
+  {
+    logs.push(Log::error(
+      "Write Environment File",
+      format_serror(&e.into()),
+    ));
+    return Err(());
+  }
+
+  logs.push(Log::simple(
+    "Write Environment File",
+    format!("Environment file written to {env_file_path:?}"),
+  ));
+
+  Ok((Some(env_file_path), replacers))
+}
+
+///
+/// Will return the env file PathBuf.
+pub async fn write_file_simple(
+  environment: &[EnvironmentVar],
+  env_file_path: &str,
+  folder: &Path,
+  logs: &mut Vec<Log>,
+) -> anyhow::Result<Option<PathBuf>> {
   let env_file_path = folder.join(env_file_path);
 
   if environment.is_empty() {
@@ -34,58 +131,35 @@ pub async fn write_file(
     .collect::<Vec<_>>()
     .join("\n");
 
-  let contents = if let Some(secrets) = secrets {
-    let res = svi::interpolate_variables(
-      &contents,
-      secrets,
-      svi::Interpolator::DoubleBrackets,
-      true,
-    )
-    .context("failed to interpolate secrets into environment");
-
-    let (contents, replacers) = match res {
-      Ok(res) => res,
-      Err(e) => {
-        logs.push(Log::error(
-          "interpolate periphery secrets",
-          format_serror(&e.into()),
-        ));
-        return Err(());
-      }
-    };
-
-    if !replacers.is_empty() {
-      logs.push(Log::simple(
-        "Interpolate - Environment",
-        replacers
-          .iter()
-          .map(|(_, variable)| format!("<span class=\"text-muted-foreground\">replaced:</span> {variable}"))
-          .collect::<Vec<_>>()
-          .join("\n"),
-      ))
-    }
-
-    contents
-  } else {
-    contents
-  };
+  if let Some(parent) = env_file_path.parent() {
+    if let Err(e) = tokio::fs::create_dir_all(parent)
+      .await
+      .with_context(|| format!("Failed to initialize environment file parent directory {parent:?}"))
+    {
+      logs.push(Log::error(
+        "Write Environment File",
+        format_serror(&(&e).into()),
+      ));
+      return Err(e);
+    }
+  }
 
   if let Err(e) = tokio::fs::write(&env_file_path, contents)
     .await
     .with_context(|| {
-      format!("failed to write environment file to {env_file_path:?}")
+      format!("Failed to write environment file to {env_file_path:?}")
    })
   {
     logs.push(Log::error(
-      "write environment file",
-      format_serror(&e.into()),
+      "Write Environment file",
+      format_serror(&(&e).into()),
     ));
-    return Err(());
+    return Err(e);
   }
 
   logs.push(Log::simple(
-    "write environment file",
-    format!("environment written to {env_file_path:?}"),
+    "Write Environment File",
+    format!("Environment written to {env_file_path:?}"),
   ));
 
   Ok(Some(env_file_path))
lib/git/src/init.rs (new file, 61 lines)
@@ -0,0 +1,61 @@
+use std::path::Path;
+
+use command::run_komodo_command;
+use formatting::format_serror;
+use komodo_client::entities::{
+  CloneArgs, all_logs_success, update::Log,
+};
+
+pub async fn init_folder_as_repo(
+  folder_path: &Path,
+  args: &CloneArgs,
+  access_token: Option<&str>,
+  logs: &mut Vec<Log>,
+) {
+  // let folder_path = args.path(repo_dir);
+  // Initialize the folder as a git repo
+  let init_repo =
+    run_komodo_command("Git Init", folder_path, "git init").await;
+  logs.push(init_repo);
+  if !all_logs_success(logs) {
+    return;
+  }
+
+  let repo_url = match args.remote_url(access_token) {
+    Ok(url) => url,
+    Err(e) => {
+      logs
+        .push(Log::error("Add git remote", format_serror(&e.into())));
+      return;
+    }
+  };
+
+  // Set remote url
+  let mut set_remote = run_komodo_command(
+    "Add git remote",
+    folder_path,
+    format!("git remote add origin {repo_url}"),
+  )
+  .await;
+  // Sanitize the output
+  if let Some(token) = &access_token {
+    set_remote.command = set_remote.command.replace(token, "<TOKEN>");
+    set_remote.stdout = set_remote.stdout.replace(token, "<TOKEN>");
+    set_remote.stderr = set_remote.stderr.replace(token, "<TOKEN>");
+  }
+  if !set_remote.success {
+    logs.push(set_remote);
+    return;
+  }
+
+  // Set branch.
+  let init_repo = run_komodo_command(
+    "Set Branch",
+    folder_path,
+    format!("git switch -c {}", args.branch),
+  )
+  .await;
+  if !init_repo.success {
+    logs.push(init_repo);
+  }
+}
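`init_folder_as_repo` above runs a fixed sequence of git commands against an existing folder. That sequence can be listed as data (a hypothetical helper for illustration; the real function executes each step via `run_komodo_command` and stops on failure):

```rust
/// The shell commands issued, in order, to turn an existing
/// folder into a repo tracking the configured remote and branch.
pub fn init_commands(remote_url: &str, branch: &str) -> Vec<String> {
    vec![
        "git init".to_string(),
        format!("git remote add origin {remote_url}"),
        format!("git switch -c {branch}"),
    ]
}
```

Note the remote-add step may embed an access token in `remote_url`, which is why the command and its output are sanitized before logging.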
@@ -1,9 +1,9 @@
 use std::path::{Path, PathBuf};
 
-use anyhow::{anyhow, Context};
+use anyhow::{Context, anyhow};
 use formatting::{bold, muted};
 use komodo_client::entities::{
-  komodo_timestamp, update::Log, LatestCommit,
+  LatestCommit, komodo_timestamp, update::Log,
 };
 use run_command::async_run_command;
 use tracing::instrument;
@@ -12,13 +12,17 @@ pub mod environment;
 
 mod clone;
 mod commit;
+mod init;
 mod pull;
 mod pull_or_clone;
 
-pub use clone::clone;
-pub use commit::{commit_all, commit_file, write_commit_file};
-pub use pull::pull;
-pub use pull_or_clone::pull_or_clone;
+pub use crate::{
+  clone::clone,
+  commit::{commit_all, commit_file, write_commit_file},
+  init::init_folder_as_repo,
+  pull::pull,
+  pull_or_clone::pull_or_clone,
+};
 
 #[derive(Debug, Default, Clone)]
 pub struct GitRes {
@@ -32,7 +36,10 @@ pub struct GitRes {
 pub async fn get_commit_hash_info(
   repo_dir: &Path,
 ) -> anyhow::Result<LatestCommit> {
-  let command = format!("cd {} && git rev-parse --short HEAD && git rev-parse HEAD && git log -1 --pretty=%B", repo_dir.display());
+  let command = format!(
+    "cd {} && git rev-parse --short HEAD && git rev-parse HEAD && git log -1 --pretty=%B",
+    repo_dir.display()
+  );
   let output = async_run_command(&command).await;
   let mut split = output.stdout.split('\n');
   let (hash, _, message) = (
@@ -54,7 +61,10 @@ pub async fn get_commit_hash_log(
   repo_dir: &Path,
 ) -> anyhow::Result<(Log, String, String)> {
   let start_ts = komodo_timestamp();
-  let command = format!("cd {} && git rev-parse --short HEAD && git rev-parse HEAD && git log -1 --pretty=%B", repo_dir.display());
+  let command = format!(
+    "cd {} && git rev-parse --short HEAD && git rev-parse HEAD && git log -1 --pretty=%B",
+    repo_dir.display()
+  );
   let output = async_run_command(&command).await;
   let mut split = output.stdout.split('\n');
   let (short_hash, _, msg) = (
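Both functions above run the chained `git rev-parse --short HEAD && git rev-parse HEAD && git log -1 --pretty=%B` and split stdout on newlines into short hash, full hash, and commit message. A std-only sketch of that parse (the `parse_commit_output` name is hypothetical):

```rust
// Parse the combined stdout of the three chained git commands:
// line 1 = short hash, line 2 = full hash, line 3 = commit message.
fn parse_commit_output(stdout: &str) -> Option<(String, String, String)> {
    let mut lines = stdout.split('\n');
    let short = lines.next()?.trim().to_string();
    let full = lines.next()?.trim().to_string();
    let message = lines.next()?.trim().to_string();
    Some((short, full, message))
}
```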
@@ -69,7 +79,7 @@ pub async fn get_commit_hash_log(
       .to_string(),
   );
   let log = Log {
-    stage: "latest commit".into(),
+    stage: "Latest Commit".into(),
     command,
     stdout: format!(
       "{} {}\n{} {}",
@@ -4,15 +4,18 @@ use std::{
   sync::OnceLock,
 };

 use anyhow::Context;
 use cache::TimeoutCache;
-use command::run_komodo_command;
+use command::{
+  run_komodo_command, run_komodo_command_multiline,
+  run_komodo_command_with_interpolation,
+};
 use formatting::format_serror;
 use komodo_client::entities::{
-  komodo_timestamp, update::Log, CloneArgs, EnvironmentVar,
+  CloneArgs, EnvironmentVar, all_logs_success, komodo_timestamp,
+  update::Log,
 };

-use crate::{get_commit_hash_log, GitRes};
+use crate::{GitRes, get_commit_hash_log};

 /// Wait this long after a pull to allow another pull through
 const PULL_TIMEOUT: i64 = 5_000;
@@ -51,10 +54,10 @@ where
   T: Into<CloneArgs> + std::fmt::Debug,
 {
   let args: CloneArgs = clone_args.into();
-  let path = args.path(repo_dir);
+  let folder_path = args.path(repo_dir);

   // Acquire the path lock
-  let lock = pull_cache().get_lock(path.clone()).await;
+  let lock = pull_cache().get_lock(folder_path.clone()).await;

   // Lock the path lock, prevents simultaneous pulls by
   // ensuring simultaneous pulls will wait for first to finish
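`pull_cache().get_lock(folder_path)` hands out one lock per repo path, so simultaneous pulls of the same path serialize while different paths proceed in parallel. The idea can be sketched std-only (the `LockMap` type here is hypothetical; Komodo's `TimeoutCache` additionally expires entries):

```rust
use std::collections::HashMap;
use std::sync::{Arc, Mutex};

// One shared Mutex per key: callers asking for the same path get the
// same lock and therefore wait for each other.
struct LockMap {
    locks: Mutex<HashMap<String, Arc<Mutex<()>>>>,
}

impl LockMap {
    fn new() -> Self {
        LockMap { locks: Mutex::new(HashMap::new()) }
    }

    fn get_lock(&self, key: &str) -> Arc<Mutex<()>> {
        let mut locks = self.locks.lock().unwrap();
        locks
            .entry(key.to_string())
            .or_insert_with(|| Arc::new(Mutex::new(())))
            .clone()
    }
}
```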
@@ -67,28 +70,50 @@ where
   }

   let res = async {
+    let mut logs = Vec::new();
+
+    // Check for '.git' path to see if the folder is initialized as a git repo
+    let dot_git_path = folder_path.join(".git");
+    if !dot_git_path.exists() {
+      crate::init::init_folder_as_repo(
+        &folder_path,
+        &args,
+        access_token.as_deref(),
+        &mut logs,
+      )
+      .await;
+      if !all_logs_success(&logs) {
+        return Ok(GitRes {
+          logs,
+          hash: None,
+          message: None,
+          env_file_path: None,
+        });
+      }
+    }
+
     let repo_url = args.remote_url(access_token.as_deref())?;

     // Set remote url
     let mut set_remote = run_komodo_command(
-      "set git remote",
-      path.as_ref(),
+      "Set git remote",
+      folder_path.as_ref(),
       format!("git remote set-url origin {repo_url}"),
       false,
     )
     .await;
-
-    if !set_remote.success {
-      if let Some(token) = access_token {
-        set_remote.command =
-          set_remote.command.replace(&token, "<TOKEN>");
-        set_remote.stdout =
-          set_remote.stdout.replace(&token, "<TOKEN>");
-        set_remote.stderr =
-          set_remote.stderr.replace(&token, "<TOKEN>");
-      }
+    // Sanitize the output
+    if let Some(token) = access_token {
+      set_remote.command =
+        set_remote.command.replace(&token, "<TOKEN>");
+      set_remote.stdout =
+        set_remote.stdout.replace(&token, "<TOKEN>");
+      set_remote.stderr =
+        set_remote.stderr.replace(&token, "<TOKEN>");
+    }
+    logs.push(set_remote);
+    if !all_logs_success(&logs) {
       return Ok(GitRes {
-        logs: vec![set_remote],
+        logs,
         hash: None,
         message: None,
         env_file_path: None,
@@ -96,16 +121,15 @@ where
     }

     let checkout = run_komodo_command(
-      "checkout branch",
-      path.as_ref(),
+      "Checkout branch",
+      folder_path.as_ref(),
       format!("git checkout -f {}", args.branch),
       false,
     )
     .await;
-
-    if !checkout.success {
+    logs.push(checkout);
+    if !all_logs_success(&logs) {
       return Ok(GitRes {
-        logs: vec![checkout],
+        logs,
         hash: None,
         message: None,
         env_file_path: None,
@@ -113,16 +137,13 @@ where
     }

     let pull_log = run_komodo_command(
-      "git pull",
-      path.as_ref(),
+      "Git pull",
+      folder_path.as_ref(),
       format!("git pull --rebase --force origin {}", args.branch),
       false,
     )
     .await;
-
-    let mut logs = vec![pull_log];
-
-    if !logs[0].success {
+    logs.push(pull_log);
+    if !all_logs_success(&logs) {
       return Ok(GitRes {
         logs,
         hash: None,
@@ -133,39 +154,40 @@ where

     if let Some(commit) = args.commit {
       let reset_log = run_komodo_command(
-        "set commit",
-        path.as_ref(),
+        "Set commit",
+        folder_path.as_ref(),
         format!("git reset --hard {commit}"),
         false,
       )
       .await;
       logs.push(reset_log);
     }

-    let (hash, message) = match get_commit_hash_log(&path).await {
-      Ok((log, hash, message)) => {
-        logs.push(log);
-        (Some(hash), Some(message))
-      }
-      Err(e) => {
-        logs.push(Log::simple(
-          "latest commit",
-          format_serror(
-            &e.context("failed to get latest commit").into(),
-          ),
-        ));
-        (None, None)
-      }
-    };
+    let (hash, message) =
+      match get_commit_hash_log(&folder_path).await {
+        Ok((log, hash, message)) => {
+          logs.push(log);
+          (Some(hash), Some(message))
+        }
+        Err(e) => {
+          logs.push(Log::simple(
+            "Latest Commit",
+            format_serror(
+              &e.context("Failed to get latest commit").into(),
+            ),
+          ));
+          (None, None)
+        }
+      };

-    let Ok(env_file_path) = crate::environment::write_file(
-      environment,
-      env_file_path,
-      secrets,
-      &path,
-      &mut logs,
-    )
-    .await
+    let Ok((env_file_path, _replacers)) =
+      crate::environment::write_file(
+        environment,
+        env_file_path,
+        secrets,
+        &folder_path,
+        &mut logs,
+      )
+      .await
     else {
       return Ok(GitRes {
         logs,
@@ -176,72 +198,27 @@ where
     };

     if let Some(command) = args.on_pull {
       if !command.command.is_empty() {
-        let on_pull_path = path.join(&command.path);
-        if let Some(secrets) = secrets {
-          let (full_command, mut replacers) =
-            match svi::interpolate_variables(
-              &command.command,
-              secrets,
-              svi::Interpolator::DoubleBrackets,
-              true,
-            )
-            .context(
-              "failed to interpolate secrets into on_pull command",
-            ) {
-              Ok(res) => res,
-              Err(e) => {
-                logs.push(Log::error(
-                  "interpolate secrets - on_pull",
-                  format_serror(&e.into()),
-                ));
-                return Ok(GitRes {
-                  logs,
-                  hash,
-                  message,
-                  env_file_path: None,
-                });
-              }
-            };
-          replacers.extend(core_replacers.to_owned());
-          let mut on_pull_log = run_komodo_command(
-            "on pull",
-            on_pull_path.as_ref(),
-            &full_command,
-            true,
-          )
-          .await;
-
-          on_pull_log.command =
-            svi::replace_in_string(&on_pull_log.command, &replacers);
-          on_pull_log.stdout =
-            svi::replace_in_string(&on_pull_log.stdout, &replacers);
-          on_pull_log.stderr =
-            svi::replace_in_string(&on_pull_log.stderr, &replacers);
-
-          tracing::debug!(
-            "run repo on_pull command | command: {} | cwd: {:?}",
-            on_pull_log.command,
-            on_pull_path
-          );
-
-          logs.push(on_pull_log);
-        } else {
-          let on_pull_log = run_komodo_command(
-            "on pull",
-            on_pull_path.as_ref(),
-            &command.command,
-            true,
-          )
-          .await;
-          tracing::debug!(
-            "run repo on_pull command | command: {} | cwd: {:?}",
-            command.command,
-            on_pull_path
-          );
-          logs.push(on_pull_log);
-        }
+        let on_pull_path = repo_dir.join(&command.path);
+        if let Some(log) = if let Some(secrets) = secrets {
+          run_komodo_command_with_interpolation(
+            "On Pull",
+            Some(on_pull_path.as_path()),
+            &command.command,
+            true,
+            secrets,
+            core_replacers,
+          )
+          .await
+        } else {
+          run_komodo_command_multiline(
+            "On Pull",
+            Some(on_pull_path.as_path()),
+            &command.command,
+          )
+          .await
+        } {
+          logs.push(log)
+        };
       }
     }

     anyhow::Ok(GitRes {
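The removed branch above interpolated `[[VARIABLE]]`-style placeholders via `svi::interpolate_variables` before running `on_pull`; `run_komodo_command_with_interpolation` now wraps that. The core substitution can be sketched std-only (the `interpolate` helper is hypothetical and skips svi's replacer tracking):

```rust
use std::collections::HashMap;

// Replace each [[KEY]] placeholder with its secret value.
// Returns None if a placeholder has no matching secret, mirroring
// how interpolation failure aborts the on_pull command.
fn interpolate(command: &str, secrets: &HashMap<String, String>) -> Option<String> {
    let mut out = command.to_string();
    while let Some(start) = out.find("[[") {
        let end = out[start..].find("]]")? + start;
        let key = out[start + 2..end].trim().to_string();
        let value = secrets.get(&key)?;
        out.replace_range(start..end + 2, value);
    }
    Some(out)
}
```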
@@ -33,9 +33,9 @@ where
   T: Into<CloneArgs> + std::fmt::Debug,
 {
   let args: CloneArgs = clone_args.into();
-  let path = args.path(repo_dir);
+  let folder_path = args.path(repo_dir);

-  if path.exists() {
+  if folder_path.exists() {
     crate::pull(
       args,
       repo_dir,
lib/logger/README.md (new file)
@@ -0,0 +1,3 @@
+# Logger module
+
+Helpers to configure standardized application logging / opentelemetry output.
@@ -3,7 +3,7 @@ use komodo_client::entities::logger::{LogConfig, StdioLogMode};
 use tracing::level_filters::LevelFilter;
 use tracing_opentelemetry::OpenTelemetryLayer;
 use tracing_subscriber::{
-  layer::SubscriberExt, util::SubscriberInitExt, Registry,
+  Registry, layer::SubscriberExt, util::SubscriberInitExt,
 };

 mod otel;
@@ -1,43 +1,37 @@
 use std::time::Duration;

-use opentelemetry::{global, trace::TracerProvider, KeyValue};
+use opentelemetry::{KeyValue, global, trace::TracerProvider};
 use opentelemetry_otlp::WithExportConfig;
 use opentelemetry_sdk::{
-  trace::{Sampler, Tracer},
   Resource,
+  trace::{Sampler, Tracer},
 };
-use opentelemetry_semantic_conventions::{
-  resource::{SERVICE_NAME, SERVICE_VERSION},
-  SCHEMA_URL,
-};
+use opentelemetry_semantic_conventions::resource::SERVICE_VERSION;

 fn resource(service_name: String) -> Resource {
-  Resource::from_schema_url(
-    [
-      KeyValue::new(SERVICE_NAME, service_name),
-      KeyValue::new(SERVICE_VERSION, env!("CARGO_PKG_VERSION")),
-    ],
-    SCHEMA_URL,
-  )
+  Resource::builder()
+    .with_service_name(service_name)
+    .with_attribute(KeyValue::new(
+      SERVICE_VERSION,
+      env!("CARGO_PKG_VERSION"),
+    ))
+    .build()
 }

 pub fn tracer(endpoint: &str, service_name: String) -> Tracer {
-  let provider = opentelemetry_sdk::trace::TracerProvider::builder()
-    .with_config(
-      opentelemetry_sdk::trace::Config::default()
-        .with_sampler(Sampler::AlwaysOn)
-        .with_resource(resource(service_name.clone())),
-    )
-    .with_batch_exporter(
-      opentelemetry_otlp::SpanExporter::builder()
-        .with_tonic()
-        .with_endpoint(endpoint)
-        .with_timeout(Duration::from_secs(3))
-        .build()
-        .unwrap(),
-      opentelemetry_sdk::runtime::Tokio,
-    )
-    .build();
+  let provider =
+    opentelemetry_sdk::trace::TracerProviderBuilder::default()
+      .with_resource(resource(service_name.clone()))
+      .with_sampler(Sampler::AlwaysOn)
+      .with_batch_exporter(
+        opentelemetry_otlp::SpanExporter::builder()
+          .with_http()
+          .with_endpoint(endpoint)
+          .with_timeout(Duration::from_secs(3))
+          .build()
+          .unwrap(),
+      )
+      .build();
   global::set_tracer_provider(provider.clone());
   provider.tracer(service_name)
 }
lib/response/Cargo.toml (new file)
@@ -0,0 +1,15 @@
+[package]
+name = "response"
+version.workspace = true
+edition.workspace = true
+authors.workspace = true
+license.workspace = true
+repository.workspace = true
+homepage.workspace = true
+
+[dependencies]
+serde_json.workspace = true
+serror.workspace = true
+anyhow.workspace = true
+serde.workspace = true
+axum.workspace = true
lib/response/src/lib.rs (new file)
@@ -0,0 +1,76 @@
+use anyhow::Context;
+use axum::http::{HeaderValue, StatusCode, header::CONTENT_TYPE};
+use serde::Serialize;
+use serror::serialize_error;
+
+pub struct Response(pub axum::response::Response);
+
+impl<T> From<T> for Response
+where
+  T: Serialize,
+{
+  fn from(value: T) -> Response {
+    let res = match serde_json::to_string(&value)
+      .context("failed to serialize response body")
+    {
+      Ok(body) => axum::response::Response::builder()
+        .header(
+          CONTENT_TYPE,
+          HeaderValue::from_static("application/json"),
+        )
+        .body(axum::body::Body::from(body))
+        .unwrap(),
+      Err(e) => axum::response::Response::builder()
+        .status(StatusCode::INTERNAL_SERVER_ERROR)
+        .header(
+          CONTENT_TYPE,
+          HeaderValue::from_static("application/json"),
+        )
+        .body(axum::body::Body::from(serialize_error(&e)))
+        .unwrap(),
+    };
+    Response(res)
+  }
+}
+
+pub enum JsonString {
+  Ok(String),
+  Err(serde_json::Error),
+}
+
+impl<T> From<T> for JsonString
+where
+  T: Serialize,
+{
+  fn from(value: T) -> JsonString {
+    match serde_json::to_string(&value) {
+      Ok(body) => JsonString::Ok(body),
+      Err(e) => JsonString::Err(e),
+    }
+  }
+}
+
+impl JsonString {
+  pub fn into_response(self) -> axum::response::Response {
+    match self {
+      JsonString::Ok(body) => axum::response::Response::builder()
+        .header(
+          CONTENT_TYPE,
+          HeaderValue::from_static("application/json"),
+        )
+        .body(axum::body::Body::from(body))
+        .unwrap(),
+      JsonString::Err(error) => axum::response::Response::builder()
+        .status(StatusCode::INTERNAL_SERVER_ERROR)
+        .header(
+          CONTENT_TYPE,
+          HeaderValue::from_static("application/json"),
+        )
+        .body(axum::body::Body::from(serialize_error(
+          &anyhow::Error::from(error)
+            .context("failed to serialize response body"),
+        )))
+        .unwrap(),
+    }
+  }
+}
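Both `Response` and `JsonString` above follow the same pattern: serialize the body, and on failure fall back to a 500 whose body is the serialized error. A std-only sketch of that pattern (the `respond` helper and string-based inputs are stand-ins for `serde_json::to_string` and serror):

```rust
// Map a serialization attempt to an HTTP-ish (status, body) pair:
// success -> 200 with the JSON body, failure -> 500 with an error body.
fn respond(serialized: Result<String, String>) -> (u16, String) {
    match serialized {
        Ok(body) => (200, body),
        Err(e) => (500, format!("{{\"error\":\"{}\"}}", e)),
    }
}
```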