An immutable, append-only ledger backed by cloud spreadsheets.
Add the following to your `Cargo.toml`:

```toml
[dependencies]
feed-my-ledger = "2.0.0"
```
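Some functionality is gated behind Cargo features; for example, the statement download command described later in this README requires the `bank-api` feature. A sketch of the dependency entry with that feature enabled (feature name taken from the CLI section below):

```toml
[dependencies]
feed-my-ledger = { version = "2.0.0", features = ["bank-api"] }
```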
```rust
use feed_my_ledger::core::{Ledger, Record};

fn main() {
    let mut ledger = Ledger::default();

    // A record is a double-entry posting: debit "cash", credit "revenue".
    let record = Record::new(
        "Sample transaction".into(),
        "cash".into(),
        "revenue".into(),
        100.0,
        "USD".into(),
        None,
        None,
        vec!["example".into()],
    )
    .unwrap();

    // The ledger is append-only; existing records are never modified.
    ledger.append(record);
}
```
To work with a live Google Sheet, construct a `GoogleSheets4Adapter` that communicates with the official Google Sheets REST API. This approach avoids extra third-party wrappers and keeps the dependency surface minimal. You may optionally specify the worksheet name when creating the adapter; otherwise, it defaults to `Ledger`:
```rust
use feed_my_ledger::cloud_adapters::GoogleSheets4Adapter;
use yup_oauth2::{InstalledFlowAuthenticator, InstalledFlowReturnMethod};

async fn example() -> Result<(), Box<dyn std::error::Error>> {
    // OAuth client secret downloaded from the Google Cloud Console.
    let secret = yup_oauth2::read_application_secret("client_secret.json").await?;
    let auth = InstalledFlowAuthenticator::builder(secret, InstalledFlowReturnMethod::Interactive)
        .build()
        .await?;

    // Use the worksheet named "Custom" instead of the default "Ledger".
    let mut service = GoogleSheets4Adapter::with_sheet_name(auth, "Custom");
    let sheet_id = service.create_sheet("ledger")?;
    service.append_row(&sheet_id, vec!["hello".into()])?;
    Ok(())
}
```
To integrate with Microsoft Excel 365 instead, use the `Excel365Adapter`, which talks to the Microsoft Graph API:
```rust
use feed_my_ledger::cloud_adapters::Excel365Adapter;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // `auth` must provide OAuth tokens scoped for Microsoft Graph;
    // its construction is omitted here.
    let mut service = Excel365Adapter::new(auth);
    let sheet_id = service.create_sheet("ledger")?;
    service.append_row(&sheet_id, vec!["hello".into()])?;
    Ok(())
}
```
If you prefer to avoid cloud services entirely, `FileAdapter` stores rows in local CSV files:
```rust
use feed_my_ledger::cloud_adapters::FileAdapter;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let mut service = FileAdapter::new("./ledger_data");
    let sheet_id = service.create_sheet("ledger")?;
    service.append_row(&sheet_id, vec!["hello".into()])?;
    Ok(())
}
```
The crate ships with a small CLI for local experimentation. To add a record and view the stored data:
```bash
$ cargo run --bin ledger -- add \
    --description "Coffee" \
    --debit cash --credit expenses \
    --amount 3.5 --currency USD
$ cargo run --bin ledger -- list
```
Add `--local-dir <DIR>` to store data in local CSV files:
```bash
$ cargo run --bin ledger -- --local-dir ledger_data add \
    --description "Coffee" \
    --debit cash --credit expenses \
    --amount 3.5 --currency USD
$ cargo run --bin ledger -- --local-dir ledger_data list
```
Split transactions use the same command with an additional `--splits` argument containing a JSON array of extra postings:
```bash
$ cargo run --bin ledger -- add \
    --description "Shopping" \
    --debit expenses:grocery --credit cash \
    --amount 30 --currency USD \
    --splits '[{"debit":"expenses:supplies","credit":"cash","amount":20}]'
```
Before issuing API commands for the first time, authorize the application:
```bash
$ cargo run --bin ledger -- login
```
Adjustments reference an existing record by ID:
```bash
$ cargo run --bin ledger -- adjust \
    --id <RECORD_ID> --description "Refund" \
    --debit expenses --credit cash \
    --amount 3.5 --currency USD
```
Share the active sheet:
```bash
$ cargo run --bin ledger -- share --email someone@example.com
```
Switch to a different sheet by URL:
```bash
$ cargo run --bin ledger -- switch --link "https://docs.google.com/spreadsheets/d/<ID>/edit"
```
Import statements from existing files. Supported formats are `csv`, `qif`, `ofx`, `ledger`, and `json`:
```bash
$ cargo run --bin ledger -- import --format csv --file transactions.csv \
    --map-description desc --map-debit debit --map-credit credit \
    --map-amount value --map-currency curr
```
Mapping flags override the default column names when importing CSV files.
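For example, the command above would match a CSV file whose header row uses those custom column names. The rows below are illustrative sample data only:

```csv
desc,debit,credit,value,curr
Coffee,expenses:coffee,cash,3.50,USD
Groceries,expenses:grocery,cash,42.10,USD
```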
Ledger text and JSON formats can also be imported:
```bash
$ cargo run --bin ledger -- import --format ledger --file statement.ledger
$ cargo run --bin ledger -- import --format json --file data.json
```
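For reference, an entry in a Ledger-text statement such as `statement.ledger` conventionally looks like the sketch below; this is a generic example of the format, and the exact subset the importer accepts may differ:

```
2024/01/05 Coffee
    expenses:coffee    3.50 USD
    cash              -3.50 USD
```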
When compiled with the `bank-api` feature, you can download statements directly:
```bash
$ cargo run --bin ledger -- download --url "https://bank.example.com/statement.ofx"
```
Verify ledger integrity:
```bash
$ cargo run --bin ledger -- verify
```
FeedMyLedger looks for a `config.toml` file in the same directory as the binary. This file stores your OAuth credentials and the spreadsheet ID used by the CLI. When running with `--local-dir`, only the sheet ID is saved and no OAuth configuration is needed.

```bash
$ touch config.toml
```
Determine your spreadsheet ID. Open the sheet in your browser and copy the portion of the URL between `/d/` and `/edit`, for example `https://docs.google.com/spreadsheets/d/<ID>/edit`.
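If you script your setup, the ID can also be pulled out of the URL programmatically. The helper below is a hypothetical, standalone sketch (it is not part of the crate's API) and assumes the standard URL shape shown above:

```rust
/// Extracts the spreadsheet ID from a Google Sheets URL, if present.
/// Hypothetical helper; not provided by feed-my-ledger itself.
fn spreadsheet_id(url: &str) -> Option<&str> {
    // Everything after "/d/" up to the next "/" (e.g. before "/edit") is the ID.
    let rest = url.split_once("/d/")?.1;
    rest.split('/').next()
}

fn main() {
    let url = "https://docs.google.com/spreadsheets/d/abc123/edit";
    assert_eq!(spreadsheet_id(url), Some("abc123"));
}
```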
Create `credentials.json`:

1. Visit the Google Cloud Console and create or select a project.
2. Enable the Google Sheets API for that project.
3. Navigate to APIs & Services > Credentials and choose Create credentials > OAuth client ID. Configure the consent screen if prompted and select Desktop app.
4. Download the resulting JSON file and save it as `credentials.json` in the project root or another location of your choice.
5. Reference this path in the `credentials_path` field of `config.toml`.
Example `config.toml`:

```toml
name = "MyLedger"
# password = "supersecret" # Optional

[google_sheets]
credentials_path = "path_to_credentials.json"
spreadsheet_id = "<ID>"
# optional: defaults to "Ledger"
sheet_name = "Custom"

[[budgets]]
account = "expenses:food"
amount = 200.0
currency = "USD"
period = "monthly"

[[schedules]]
cron = "0 0 1 * *" # midnight on the first day of each month
description = "rent"
debit = "expenses:rent"
credit = "cash"
amount = 1000.0
currency = "USD"
```
The optional `sheet_name` value selects the worksheet used for all ledger operations.

To work with Microsoft Excel 365 you will need an application registered in Azure and a workbook stored in OneDrive or SharePoint. Add the following to `config.toml`:
```toml
[excel_365]
tenant_id = "<TENANT_ID>"
client_id = "<CLIENT_ID>"
client_secret = "<CLIENT_SECRET>"
workbook_id = "<WORKBOOK_ID>"
# optional: defaults to "Ledger"
sheet_name = "Ledger"
```
Use these values to construct an `Excel365Adapter` in your own application. The CLI does not yet load this configuration automatically.

Run the test suite with:

```bash
cargo test
```
Comprehensive documentation is available in the `docs` directory.
Contributions are welcome! Please read the CONTRIBUTING guide for guidelines on how to contribute to this project.
This project is licensed under the MIT License - see the LICENSE file for details.