Splitting test into unit and integration tests to match with Rust testing standards #114
Merged
57 commits
844fdff
Move to dev
764387d
Remove uneeded results and change to return iterator
Synicix a03db37
Remove accident comment
Synicix abbb5ce
Add ground work for pod operator
Synicix 450a90f
Add async operation for JoinOperator
Synicix e4b3a54
Add pod operator + better error handling for pod_result:::new()
Synicix fee8d9c
Improve error handling for file io in localfilestore
Synicix c04335f
Update operator to be in operation again
Synicix ebe04d1
Fix pod result hashing bug
Synicix 20f2345
Apply feedback from review
Synicix 51dfea6
Merge branch 'worktree-docker-patch' into 106
Synicix cb72829
Remove stale error
Synicix d36ec71
Merge branch 'worktree-docker-patch' into 106
Synicix 2986e0d
Save changes
Synicix 0e3dc71
Merge branch '106' into pipeline_runner
Synicix 595a12e
Split model in core to match uniffi
Synicix b671dba
Merge branch 'model_split' into 106
Synicix ac363f2
Merge branch '106' into runner
Synicix 86d5f70
Save progress
Synicix aead411
Merge branch 'dev' into model_split
Synicix 58ba198
Fix missing stuff
Synicix 9c72bd8
Merge branch 'model_split' into runner
Synicix 7ee5251
save change
Synicix fda43aa
Update pipeline_runner to use new operator + improvments
Synicix 1e9a19f
Readded tests
Synicix 32299f8
save progress
Synicix e93c32b
Merge remote-tracking branch 'upstream' into runner
Synicix ab900b6
Fix logic bug
Synicix 4f511ce
Update tests and test fixture to merge sentence correctly
Synicix 2a4f1f7
Fix agent event to make it more efficient
Synicix c89c700
Remove empty impl
Synicix a834ed9
Fix remaining test to deal with issue
Synicix 7863cad
Fix memory bug
Synicix b8c2de2
Update rust version
Synicix acbd967
Remove into iter()
Synicix ae137ed
Fix old lint that doesn't apply anymore, was hidden by rust analyzer
Synicix f5fefb0
Add PipelineStatus and failure logs
Synicix 75336de
Remove notebook
Synicix f9adf0a
Remove stale functions
Synicix 9795458
Remove unused imports
Synicix 4fae330
Merge remote-tracking branch 'upstream/dev' into logging
Synicix 53e972f
Merge branch 'orch_tests' into logging_patch
Synicix 9de08e1
reintergrate logging into podresult
Synicix 3ace99d
Fix code according to copilot suggestions
Synicix cd0e675
Revert test feature to be more inline with rust testing standards
Synicix a7bf03c
fix copilot suggestions
Synicix 781ffcc
Update cspell
Synicix d16da30
Fix copilot suggestions
Synicix 9a38076
Fix copilot suggestions
Synicix 31533db
Merge branch 'runner' into test_split
Synicix a2f44cf
Merge branch 'dev' into runner
eywalker af14f92
Fix merging issues
Synicix 78ceed6
Fix clippy issues and orchestrator bugs
Synicix dd5c44f
Remove clippy except since github action complained
Synicix f9b53ba
Apply fixes requests from clippy
Synicix 44b4245
Merge branch 'runner' into logging_patch
Synicix e185b65
Merge branch 'logging_patch' into test_split
Synicix
```diff
@@ -1,3 +1,3 @@
-excessive-nesting-threshold = 4
+excessive-nesting-threshold = 5
 too-many-arguments-threshold = 10
 allowed-idents-below-min-chars = ["..", "k", "v", "f", "re", "id", "Ok", "'_"]
```
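The bump of `excessive-nesting-threshold` from 4 to 5 loosens clippy's `excessive_nesting` lint, which flags functions whose blocks nest too deeply. A minimal sketch of the kind of code that sits right at this boundary (the function and values are illustrative, not from orcapod; exact level counting is clippy's, so treat the annotations as approximate):

```rust
// Roughly illustrates nesting depth: each nested block adds one level.
// At threshold 4 the innermost block would be flagged; at 5 it passes.
fn deeply_nested(values: &[Vec<Vec<i32>>]) -> i32 {
    let mut total = 0;
    for group in values {
        // level 1
        for row in group {
            // level 2
            for &v in row {
                // level 3
                if v > 0 {
                    // level 4: sums only the positive entries
                    total += v;
                }
            }
        }
    }
    total
}

fn main() {
    let data = vec![vec![vec![1, -2, 3]]];
    // 1 + 3 = 4; the -2 is skipped by the `v > 0` guard
    assert_eq!(deeply_nested(&data), 4);
}
```

Raising the threshold avoids `#[allow]` escapes on loops like the packet-expansion code in this PR, at the cost of tolerating one more level everywhere.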
```diff
@@ -0,0 +1,65 @@
+{
+ "cells": [
+  {
+   "cell_type": "code",
+   "execution_count": 1,
+   "id": "3ba2aae6",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "import orcapod as op"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "4d867643",
+   "metadata": {},
+   "outputs": [
+    {
+     "ename": "TypeError",
+     "evalue": "Pipeline.__init__() missing 4 required positional arguments: 'graph_dot', 'metadata', 'input_spec', and 'output_spec'",
+     "output_type": "error",
+     "traceback": [
+      "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m",
+      "\u001b[0;31mTypeError\u001b[0m Traceback (most recent call last)",
+      "Cell \u001b[0;32mIn[3], line 1\u001b[0m\n\u001b[0;32m----> 1\u001b[0m \u001b[43mop\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mPipeline\u001b[49m\u001b[43m(\u001b[49m\u001b[43m)\u001b[49m\n",
+      "\u001b[0;31mTypeError\u001b[0m: Pipeline.__init__() missing 4 required positional arguments: 'graph_dot', 'metadata', 'input_spec', and 'output_spec'"
+     ]
+    }
+   ],
+   "source": [
+    "op.Pipeline()"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "7626744f",
+   "metadata": {},
+   "outputs": [],
+   "source": []
+  }
+ ],
+ "metadata": {
+  "kernelspec": {
+   "display_name": "orcapod",
+   "language": "python",
+   "name": "python3"
+  },
+  "language_info": {
+   "codemirror_mode": {
+    "name": "ipython",
+    "version": 3
+   },
+   "file_extension": ".py",
+   "mimetype": "text/x-python",
+   "name": "python",
+   "nbconvert_exporter": "python",
+   "pygments_lexer": "ipython3",
+   "version": "3.10.18"
+  }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 5
+}
```
```diff
@@ -1,40 +1,14 @@
-macro_rules! inner_attr_to_each {
-    { #!$attr:tt $($it:item)* } => {
-        $(
-            #$attr
-            $it
-        )*
-    }
-}
-
 pub(crate) mod error;
 pub(crate) mod graph;
 pub(crate) mod pipeline;
 pub(crate) mod store;
 pub(crate) mod util;
 pub(crate) mod validation;
 
-inner_attr_to_each! {
-    #![cfg(feature = "default")]
-    pub(crate) mod crypto;
-    pub(crate) mod model;
-    pub(crate) mod operator;
-    pub(crate) mod orchestrator;
-}
-
-#[cfg(feature = "test")]
-inner_attr_to_each! {
-    #![cfg_attr(
-        feature = "test",
-        allow(
-            missing_docs,
-            clippy::missing_errors_doc,
-            clippy::missing_panics_doc,
-            reason = "Documentation not necessary since private API.",
-        ),
-    )]
-    pub mod crypto;
-    pub mod model;
-    pub mod operator;
-    pub mod orchestrator;
-}
+pub(crate) mod crypto;
+/// Model definition for orcapod
+pub mod model;
+pub(crate) mod operator;
+pub(crate) mod orchestrator;
+/// Pipeline runner module
+pub mod pipeline_runner;
```
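This change removes the `test` cargo feature and the `inner_attr_to_each!` macro that re-exported private modules for testing. The standard Rust layout it moves toward needs no feature gate: unit tests live next to the code behind `#[cfg(test)]`, and integration tests live in `tests/*.rs` against the public API. A minimal sketch of that convention (the `add` function and module names are illustrative, not taken from orcapod):

```rust
/// Stand-in for real library code in `src/lib.rs`.
pub fn add(a: i32, b: i32) -> i32 {
    a + b
}

// Unit tests sit beside the code they test. `#[cfg(test)]` compiles them
// only under `cargo test`, so no `feature = "test"` gate is required and
// they may freely use `pub(crate)` items via `super::`.
#[cfg(test)]
mod tests {
    use super::add;

    #[test]
    fn adds_small_numbers() {
        assert_eq!(add(2, 3), 5);
    }
}

// Integration tests would instead go in e.g. `tests/pipeline.rs`, where only
// the crate's public API (like `pub mod model` above) is visible.
fn main() {
    assert_eq!(add(2, 3), 5);
}
```

This is why the diff promotes `model` and `pipeline_runner` to `pub`: integration tests can only exercise what the crate exports.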
```rust
use std::{
    backtrace::Backtrace,
    collections::{HashMap, HashSet},
};

use crate::uniffi::{
    error::{Kind, OrcaError, Result},
    model::{
        packet::PathSet,
        pipeline::{Kernel, Pipeline, PipelineJob},
    },
};
use itertools::Itertools as _;
use petgraph::Direction::Incoming;
use serde::{Deserialize, Serialize};

#[derive(Debug, Clone, Deserialize, Serialize, PartialEq)]
pub struct PipelineNode {
    pub id: String,
    pub kernel: Kernel,
}

impl Pipeline {
    /// Function to get the parents of a node
    pub(crate) fn get_node_parents(
        &self,
        node: &PipelineNode,
    ) -> impl Iterator<Item = &PipelineNode> {
        // Find the NodeIndex for the given node
        let node_index = self
            .graph
            .node_indices()
            .find(|&idx| self.graph[idx] == *node);
        node_index.into_iter().flat_map(move |idx| {
            self.graph
                .neighbors_directed(idx, Incoming)
                .map(move |parent_idx| &self.graph[parent_idx])
        })
    }

    /// Return the set of node names that take in inputs based on the `input_spec`
    pub(crate) fn get_input_nodes(&self) -> HashSet<&String> {
        let mut input_nodes = HashSet::new();

        self.input_spec.iter().for_each(|(_, node_uris)| {
            for node_uri in node_uris {
                input_nodes.insert(&node_uri.node_id);
            }
        });

        input_nodes
    }
}

impl PipelineJob {
    /// Helper function to get the input packets for the input nodes of the pipeline based on the `pipeline_job` and `pipeline_spec`
    /// # Errors
    /// Will return `Err` if there is an issue getting the input packet per node.
    /// # Returns
    /// A `HashMap` where the key is the node name and the value is a vector of `HashMap<String, PathSet>` representing the input packets for that node.
    pub fn get_input_packet_per_node(
        &self,
    ) -> Result<HashMap<String, Vec<HashMap<String, PathSet>>>> {
        // For each node in the input specification, iterate over its mapping.
        // nodes_input_spec contains <node_id, HashMap<key, PathSet>>
        let mut nodes_input_spec = HashMap::new();
        for (input_key, node_uris) in &self.pipeline.input_spec {
            for node_uri in node_uris {
                let input_path_sets = self.input_packet.get(input_key).ok_or(OrcaError {
                    kind: Kind::KeyMissing {
                        key: input_key.clone(),
                        backtrace: Some(Backtrace::capture()),
                    },
                })?;
                // There shouldn't be a duplicate key in the input packet, as this is handled by pipeline verification
                let input_spec = nodes_input_spec
                    .entry(&node_uri.node_id)
                    .or_insert_with(HashMap::new);
                input_spec.insert(&node_uri.key, input_path_sets);
            }
        }

        // For each node, compute the cartesian product of the path_sets for each unique combination of keys
        let node_input_packets = nodes_input_spec
            .into_iter()
            .map(|(node_id, input_node_keys)| {
                // Pull keys and values out together so the key order is preserved to match the cartesian product
                let (keys, values): (Vec<_>, Vec<_>) = input_node_keys.into_iter().unzip();

                // Convert each combination into a packet
                let packets = values
                    .into_iter()
                    .multi_cartesian_product()
                    .map(|combo| {
                        keys.iter()
                            .copied()
                            .zip(combo)
                            .map(|(key, pathset)| (key.to_owned(), pathset.to_owned()))
                            .collect::<HashMap<_, _>>()
                    })
                    .collect::<Vec<HashMap<String, PathSet>>>();

                (node_id.to_owned(), packets)
            })
            .collect::<HashMap<_, _>>();

        Ok(node_input_packets)
    }
}
```
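The core of `get_input_packet_per_node` is the cartesian-product expansion: each input key of a node has a list of candidate path sets, and the node receives one packet per combination across keys. A self-contained sketch of that expansion using plain strings instead of `PathSet`, with a hand-rolled product in place of itertools' `multi_cartesian_product` (all names and values here are illustrative):

```rust
use std::collections::HashMap;

// Expand per-key candidate lists into one map per combination, mirroring
// the packet expansion in `get_input_packet_per_node`. Keys and values are
// kept in matching order so each output map pairs them correctly.
fn expand_packets(keys: &[&str], values: &[Vec<&str>]) -> Vec<HashMap<String, String>> {
    // Start with a single empty packet and extend it one key at a time.
    let mut packets: Vec<HashMap<String, String>> = vec![HashMap::new()];
    for (key, candidates) in keys.iter().zip(values) {
        packets = packets
            .into_iter()
            .flat_map(|packet| {
                // Each existing partial packet branches once per candidate value.
                candidates.iter().map(move |v| {
                    let mut next = packet.clone();
                    next.insert((*key).to_owned(), (*v).to_owned());
                    next
                })
            })
            .collect();
    }
    packets
}

fn main() {
    // Two candidates for "image", one for "mask": 2 x 1 = 2 packets.
    let packets = expand_packets(&["image", "mask"], &[vec!["a", "b"], vec!["m"]]);
    assert_eq!(packets.len(), 2);
    assert!(packets.iter().any(|p| p["image"] == "a" && p["mask"] == "m"));
    assert!(packets.iter().any(|p| p["image"] == "b" && p["mask"] == "m"));
}
```

This is also why the real code unzips `input_node_keys` into parallel `keys` and `values` vectors first: `multi_cartesian_product` yields combinations in value order, so the key order must be fixed before the product is taken.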