mirror of
https://github.com/marcogll/TaxHacker_s23.git
synced 2026-01-13 21:35:19 +00:00
fix: simplify docker and readme
Dockerfile
@@ -2,7 +2,7 @@ FROM node:23-slim AS base

 # Default environment variables
 ENV PORT=7331
-ENV UPLOAD_PATH=/app/uploads
+ENV UPLOAD_PATH=/app/data/uploads
 ENV NODE_ENV=production
 ENV DATABASE_URL=file:/app/data/db.sqlite

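The change above moves the upload directory under `/app/data`, so every default path with mutable state now lives below one directory. As a minimal sketch (plain POSIX parameter expansion, run outside any container), this shows roughly how the image defaults resolve when no `-e` overrides are supplied; the values are the ones from the updated Dockerfile:

```shell
# Resolve defaults the way the updated Dockerfile declares them.
# An exported variable (e.g. via `docker run -e PORT=8080`) would win.
PORT="${PORT:-7331}"
UPLOAD_PATH="${UPLOAD_PATH:-/app/data/uploads}"
DATABASE_URL="${DATABASE_URL:-file:/app/data/db.sqlite}"
echo "$PORT $UPLOAD_PATH $DATABASE_URL"
```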
README.md
@@ -21,11 +21,11 @@ I'm a small self-hosted accountant app that can help you deal with invoices, rec

 ## 👋🏻 Getting Started

-TaxHacker is a self-hosted accounting app for freelancers and small businesses who want to save time and automate tracking expences and income with power of GenAI. It can recognise uploaded photos or PDF files and automatically extract important transaction data: name, total amount, date, category, VAT amount, etc, and save it in a table in a structured way.
+TaxHacker is a self-hosted accounting app for freelancers and small businesses who want to save time and automate expenses and income tracking with the power of GenAI. It can recognise uploaded photos, receipts or PDFs, extract important data (e.g. name, total amount, date, merchant, VAT), and save it as structured transactions in a table. You can also create your own custom fields to extract with your LLM prompts.

-Automatic currency conversion on a day of transaction is also supported (even for crypto). TaxHacker can save you time filling endless Excel spreadsheets with expences and income.
+It supports automatic currency conversion on the day of the transaction. Even for crypto!

-A built-in system of powerful filters allows you to then export transactions with their files in the specified period of time for tax filing or other reporting.
+A built-in system of filters, support for multiple projects, custom categories, and import/export of transactions for a given period (along with attached files) simplify reporting and tax filing.

 

@@ -45,9 +45,9 @@ A built-in system of powerful filters allows you to then export transactions wit

 Take a photo or upload a PDF and TaxHacker will automatically recognise, categorise and store transaction information.

-- Upload multiple documents and store in “unsorted” until you get the time to sort them out with AI
+- Upload multiple documents and store in “unsorted” until you get the time to sort them out by hand or with an AI
 - Use LLM to extract key information like date, amount, and vendor
-- Categorize transactions based on content
+- Automatically categorize transactions based on their content
 - Store everything in a structured format for easy filtering and retrieval
 - Organize your documents by a tax season

@@ -115,7 +115,9 @@ docker compose up

 New docker image is automatically built and published on every new release. You can use specific version tags (e.g. `v1.0.0`) or `latest` for the most recent version.

 For more advanced setups, you can adapt Docker Compose configuration to your own needs. The default configuration uses the pre-built image from GHCR, but you can still build locally using the provided [Dockerfile](./Dockerfile) if needed.

+For example:
+
 ```yaml
 services:
@@ -124,11 +126,10 @@ services:
     ports:
       - "7331:7331"
     environment:
-      - UPLOAD_PATH=/app/uploads
       - NODE_ENV=production
+      - UPLOAD_PATH=/app/data/uploads
       - DATABASE_URL=file:/app/data/db.sqlite
     volumes:
-      - ./uploads:/app/uploads
       - ./data:/app/data
     restart: unless-stopped
 ```
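After this change the SQLite database and the uploads both live under the single `./data` bind mount. A hedged sketch of what the host side looks like (hypothetical local paths, files created manually here to stand in for the app's first run):

```shell
# Both kinds of state now sit under one mounted directory (./data),
# so backup or migration becomes a single directory copy.
mkdir -p data/uploads
touch data/db.sqlite      # where DATABASE_URL=file:/app/data/db.sqlite lands
ls data                   # lists db.sqlite and uploads
cp -r data data.backup    # one copy captures the db and the files together
```

With the old layout (`./uploads` plus `./data`), the same backup needed two separate mounts and two copies.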
@@ -148,7 +149,7 @@ Configure TaxHacker to suit your needs with these environment variables:
 We use:

 - Next.js 15 or later
-- [Prisma](https://www.prisma.io/) for database ORM and migrations
+- [Prisma](https://www.prisma.io/) for database models and migrations
 - SQLite as a database
 - Ghostscript and graphicsmagick libs for PDF files (can be installed on macOS via `brew install gs graphicsmagick`)

@@ -167,7 +168,7 @@ cp .env.example .env
 # Edit .env with your configuration

 # Initialize the database
-npx prisma migrate dev && npx prisma generate
+npx prisma generate && npx prisma migrate dev

 # Seed the database with default data (optional)
 npm run seed
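The setup above chains its steps with `&&`, so a failing step aborts everything after it. A small sketch of that behaviour; the real `prisma`/`npm` commands are replaced with `echo` stand-ins (they need a project checkout), purely to make the short-circuiting visible:

```shell
# step: pretend a command succeeded; fail: pretend it exited non-zero.
step() { echo "run: $1"; }
fail() { echo "fail: $1"; return 1; }

# Both steps run when the first succeeds:
step "npx prisma generate" && step "npx prisma migrate dev"

# The second step is skipped when the first fails:
fail "npx prisma generate" && step "npx prisma migrate dev"
echo "done"
```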
@@ -2,7 +2,7 @@

 import { Button } from "@/components/ui/button"
 import { Field } from "@prisma/client"
-import { Upload } from "lucide-react"
+import { Loader2, Play, Upload } from "lucide-react"
 import { useRouter } from "next/navigation"
 import { startTransition, useActionState, useEffect, useState } from "react"
 import { parseCSVAction, saveTransactionsAction } from "../../app/import/csv/actions"
@@ -12,8 +12,8 @@ const MAX_PREVIEW_ROWS = 100

 export function ImportCSVTable({ fields }: { fields: Field[] }) {
   const router = useRouter()
-  const [parseState, parseAction] = useActionState(parseCSVAction, null)
-  const [saveState, saveAction] = useActionState(saveTransactionsAction, null)
+  const [parseState, parseAction, isParsing] = useActionState(parseCSVAction, null)
+  const [saveState, saveAction, isSaving] = useActionState(saveTransactionsAction, null)

   const [csvSettings, setCSVSettings] = useState({
     skipHeader: true,
@@ -100,7 +100,7 @@ export function ImportCSVTable({ fields }: { fields: Field[] }) {
       <div>
         <input type="file" accept=".csv" className="hidden" id="csv-file" onChange={handleFileChange} />
         <Button type="button" onClick={() => document.getElementById("csv-file")?.click()}>
-          <Upload className="mr-2" /> Import from CSV
+          {isParsing ? "Parsing..." : <Upload className="mr-2" />} Import from CSV
         </Button>
       </div>
     </div>
@@ -115,12 +115,22 @@ export function ImportCSVTable({ fields }: { fields: Field[] }) {
           <span className="text-3xl font-bold tracking-tight">Import {csvData.length} items from CSV</span>
         </h2>
         <div className="flex gap-2">
-          <Button onClick={handleSave} disabled={saveState?.success}>
-            Import Transactions
+          <Button onClick={handleSave} disabled={isSaving}>
+            {isSaving ? (
+              <>
+                <Loader2 className="animate-spin" /> Importing...
+              </>
+            ) : (
+              <>
+                <Play /> Import {csvData.length} transactions
+              </>
+            )}
           </Button>
         </div>
       </header>

+      {saveState?.error && <FormError>{saveState.error}</FormError>}
+
       <div className="flex items-center gap-4 mb-4">
         <label className="flex items-center gap-2 cursor-pointer">
           <input
@@ -180,8 +190,6 @@ export function ImportCSVTable({ fields }: { fields: Field[] }) {
       {csvData.length > MAX_PREVIEW_ROWS && (
         <p className="text-muted-foreground mt-4">and {csvData.length - MAX_PREVIEW_ROWS} more entries...</p>
       )}
-
-      {saveState?.error && <FormError>{saveState.error}</FormError>}
     </div>
   )}
 </>
@@ -6,11 +6,10 @@ services:
     ports:
       - "7331:7331"
     environment:
-      - UPLOAD_PATH=/app/uploads
       - NODE_ENV=production
+      - UPLOAD_PATH=/app/data/uploads
       - DATABASE_URL=file:/app/data/db.sqlite
     volumes:
-      - ./uploads:/app/uploads
       - ./data:/app/data
     restart: unless-stopped
     logging:
@@ -1,58 +0,0 @@
-services:
-  app:
-    build:
-      context: .
-      dockerfile: Dockerfile
-    environment:
-      - UPLOAD_PATH=/app/uploads
-      - NODE_ENV=production
-      - DATABASE_URL=file:/app/data/db.sqlite
-    volumes:
-      - ./uploads:/app/uploads
-      - ./data:/app/data
-    restart: unless-stopped
-    networks:
-      - traefik
-    labels:
-      - "traefik.enable=true"
-      - "traefik.http.routers.taxhacker.entrypoints=https"
-      - "traefik.http.routers.taxhacker.tls.certresolver=cloudflare"
-      - "traefik.http.routers.taxhacker.rule=Host(`tax.vas3k.cloud`)"
-      - "traefik.http.services.taxhacker.loadBalancer.server.port=7331"
-      - "traefik.docker.network=traefik"
-    logging:
-      driver: "local"
-      options:
-        max-size: "100M"
-        max-file: "3"
-
-  snekapp:
-    build:
-      context: .
-      dockerfile: Dockerfile
-    environment:
-      - UPLOAD_PATH=/app/uploads
-      - NODE_ENV=production
-      - DATABASE_URL=file:/app/data/db.sqlite
-    volumes:
-      - ./snek_uploads:/app/uploads
-      - ./snek_data:/app/data
-    restart: unless-stopped
-    networks:
-      - traefik
-    labels:
-      - "traefik.enable=true"
-      - "traefik.http.routers.snektax.entrypoints=https"
-      - "traefik.http.routers.snektax.tls.certresolver=cloudflare"
-      - "traefik.http.routers.snektax.rule=Host(`snektax.vas3k.cloud`)"
-      - "traefik.http.services.snektax.loadBalancer.server.port=7331"
-      - "traefik.docker.network=traefik"
-    logging:
-      driver: "local"
-      options:
-        max-size: "100M"
-        max-file: "3"
-
-networks:
-  traefik:
-    external: true
@@ -4,11 +4,10 @@ services:
     ports:
       - "7331:7331"
     environment:
-      - UPLOAD_PATH=/app/uploads
       - NODE_ENV=production
+      - UPLOAD_PATH=/app/data/uploads
       - DATABASE_URL=file:/app/data/db.sqlite
     volumes:
-      - ./uploads:/app/uploads
       - ./data:/app/data
     restart: unless-stopped
     logging: