diff --git a/LineageCheck.md b/LineageCheck.md new file mode 100644 index 0000000..5f6f1c7 --- /dev/null +++ b/LineageCheck.md @@ -0,0 +1,289 @@ +# Neo4j Lineage Verification Guide + +Use this guide to confirm that your plants and LINEAGE relationships have been imported correctly into Neo4j. Save this file as `neo4j_lineage_check.md` for future reference. + +--- + +## 1. Open the Neo4j Browser + +1. **Ensure Neo4j is running.** + In a Docker‐Compose setup, Neo4j is typically exposed at: + + ``` + http://localhost:7474 + ``` +2. **Log in** with your Neo4j credentials (e.g., username `neo4j`, password as configured). + +Once logged in, you can execute Cypher commands in the query pane on the left. + +--- + +## 2. Verify That Your `Plant` Nodes Exist + +Before checking relationships, confirm that nodes were created: + +```cypher +MATCH (p:Plant) +RETURN p.uuid AS uuid, p.name AS common_name +LIMIT 20; +``` + +* This query will return up to 20 plant nodes with their `uuid` and `name` properties. +* If you see your imported plants here, it means the nodes exist in the database. + +--- + +## 3. Check Direct Parent→Child LINEAGE Pairs + +To list all direct child→parent relationships: + +```cypher +MATCH (child:Plant)-[:LINEAGE]->(parent:Plant) +RETURN child.uuid AS child_uuid, parent.uuid AS parent_uuid +LIMIT 50; +``` + +* Each row represents one `(:Plant)-[:LINEAGE]->(:Plant)` relationship. +* `child_uuid` is the UUID of the child node, and `parent_uuid` is the UUID of its direct parent. + +--- + +## 4. Look Up a Specific Plant by UUID or Name + +If you know a particular plant’s UUID, you can confirm its properties: + +```cypher +MATCH (p:Plant {uuid: "YOUR_UUID_HERE"}) +RETURN p.uuid AS uuid, p.name AS common_name, p.scientific_name AS sci_name; +``` + +Alternatively, if you only know the common name: + +```cypher +MATCH (p:Plant) +WHERE p.name = "Common Name Here" +RETURN p.uuid AS uuid, p.name AS common_name, p.scientific_name AS sci_name; +``` + +This helps you find the exact UUID or check that the `name` and `scientific_name` properties were stored correctly. + +--- + +## 5. Show Children of a Given Parent + +To list all direct children of a specific parent by UUID: + +```cypher +MATCH (parent:Plant {uuid: "PARENT_UUID_HERE"})<-[:LINEAGE]-(child:Plant) +RETURN child.uuid AS child_uuid, child.name AS child_name; +``` + +* This returns every plant node that points to the specified `parent_uuid` via a `LINEAGE` relationship. + +--- + +## 6. Visualize a Subtree Around One Node + +To visualize a parent node and its children in graph form: + +```cypher +MATCH subtree = (parent:Plant {uuid: "PARENT_UUID_HERE"})<-[:LINEAGE]-(child:Plant) +RETURN subtree; +``` + +* Switch to the “Graph” view in the Neo4j browser to see a node for the parent with arrows pointing to each child. + +--- + +## 7. Walk the Full Ancestor Chain (Multi‐Level) + +If you want to see all ancestors of a given child, use a variable‐length pattern: + +```cypher +MATCH path = (desc:Plant {uuid: "CHILD_UUID_HERE"})-[:LINEAGE*1..]->(anc:Plant) +RETURN path; +``` + +* `[:LINEAGE*1..]` indicates “follow one or more consecutive `LINEAGE` relationships upward.” +* In “Graph” view, Neo4j will display the entire chain from child → parent → grandparent → … + +To return just the list of ancestor UUIDs: + +```cypher +MATCH (start:Plant {uuid: "CHILD_UUID_HERE"})-[:LINEAGE*1..]->(anc:Plant) +RETURN DISTINCT anc.uuid AS ancestor_uuid; +``` + +--- + +## 8. 
Show All Descendants of a Given Parent + +To find all descendants (children, grandchildren, etc.) of a root node: + +```cypher +MATCH (root:Plant {uuid: "ROOT_UUID_HERE"})<-[:LINEAGE*]-(desc:Plant) +RETURN desc.uuid AS descendant_uuid, desc.name AS descendant_name; +``` + +* The pattern `[:LINEAGE*]` (with no lower bound specified) matches zero or more hops. +* To visualize the full descendant tree: + + ```cypher + MATCH subtree = (root:Plant {uuid: "ROOT_UUID_HERE"})<-[:LINEAGE*]-(desc:Plant) + RETURN subtree; + ``` + + Then switch to “Graph” view. + +--- + +## 9. Combining Queries for a Full Walk‐Through + +1. **List a few plants** (to copy a known UUID): + + ```cypher + MATCH (p:Plant) + RETURN p.uuid AS uuid, p.name AS common_name + LIMIT 10; + ``` +2. **Pick one UUID** (e.g. `"2ee2e0e7-69de-4b8f-abfe-4ed973c3d760"`). +3. **Show its direct children**: + + ```cypher + MATCH (p:Plant {uuid: "2ee2e0e7-69de-4b8f-abfe-4ed973c3d760"})<-[:LINEAGE]-(child:Plant) + RETURN child.uuid AS child_uuid, child.name AS child_name; + ``` +4. **Show its parent** (if any): + + ```cypher + MATCH (p:Plant {uuid: "8b1059c8-8dd3-487a-af19-1eb548788e87"})-[:LINEAGE]->(parent:Plant) + RETURN parent.uuid AS parent_uuid, parent.name AS parent_name; + ``` +5. **Get the full ancestor chain** of that child: + + ```cypher + MATCH path = (c:Plant {uuid: "8b1059c8-8dd3-487a-af19-1eb548788e87"})-[:LINEAGE*1..]->(anc:Plant) + RETURN path; + ``` +6. **Get the full descendant tree** of that parent: + + ```cypher + MATCH subtree = (root:Plant {uuid: "2ee2e0e7-69de-4b8f-abfe-4ed973c3d760"})<-[:LINEAGE*]-(desc:Plant) + RETURN subtree; + ``` + +--- + +## 10. Checking via Python (Optional) + +If you prefer to script these checks using the Neo4j Bolt driver from Python, here’s a quick example: + +```python +from neo4j import GraphDatabase + +uri = "bolt://localhost:7687" +auth = ("neo4j", "your_password") +driver = GraphDatabase.driver(uri, auth=auth) + +def print_lineage(tx, plant_uuid): + # Show direct parent + result = tx.run( + "MATCH (c:Plant {uuid:$u})-[:LINEAGE]->(p:Plant) " + "RETURN p.uuid AS parent_uuid, p.name AS parent_name", + u=plant_uuid + ) + for row in result: + print(f"Parent of {plant_uuid}: {row['parent_uuid']} ({row['parent_name']})") + + # Show all ancestors + result2 = tx.run( + "MATCH path = (c:Plant {uuid:$u})-[:LINEAGE*1..]->(anc:Plant) " + "RETURN [n IN nodes(path) | n.uuid] AS all_uuids", + u=plant_uuid + ) + for row in result2: + print("Ancestor chain UUIDs:", row["all_uuids"]) + +with driver.session() as session: + session.read_transaction(print_lineage, "8b1059c8-8dd3-487a-af19-1eb548788e87") + +driver.close() +``` + +* Install `neo4j` Python package if needed: + + ```bash + pip install neo4j + ``` +* Adjust the `uri` and `auth` values to match your Neo4j setup. + +--- + +## 11. 
Summary of Key Cypher Queries + +* **List all plants (sample):** + + ```cypher + MATCH (p:Plant) + RETURN p.uuid AS uuid, p.name AS common_name + LIMIT 20; + ``` + +* **List direct parent→child relationships:** + + ```cypher + MATCH (child:Plant)-[:LINEAGE]->(parent:Plant) + RETURN child.uuid AS child_uuid, parent.uuid AS parent_uuid; + ``` + +* **List direct children of a parent:** + + ```cypher + MATCH (parent:Plant {uuid:"PARENT_UUID"})<-[:LINEAGE]-(child:Plant) + RETURN child.uuid AS child_uuid, child.name AS child_name; + ``` + +* **List direct parent of a child:** + + ```cypher + MATCH (child:Plant {uuid:"CHILD_UUID"})-[:LINEAGE]->(parent:Plant) + RETURN parent.uuid AS parent_uuid, parent.name AS parent_name; + ``` + +* **Visualize parent + children subgraph:** + + ```cypher + MATCH subtree = (parent:Plant {uuid:"PARENT_UUID"})<-[:LINEAGE]-(child:Plant) + RETURN subtree; + ``` + +* **Full ancestor chain for a child:** + + ```cypher + MATCH path = (c:Plant {uuid:"CHILD_UUID"})-[:LINEAGE*1..]->(anc:Plant) + RETURN path; + ``` + +* **Full descendant tree for a parent:** + + ```cypher + MATCH subtree = (root:Plant {uuid:"PARENT_UUID"})<-[:LINEAGE*]-(desc:Plant) + RETURN subtree; + ``` + +--- + +### Usage Tips + +* **Switch between “Table” and “Graph” views** in the Neo4j Browser to see raw data vs. visual graph. +* Use `LIMIT` when you only want a quick preview of results. +* To filter by partial names, you can do: + + ```cypher + MATCH (p:Plant) + WHERE toLower(p.name) CONTAINS toLower("baltic") + RETURN p.uuid, p.name; + ``` +* Remember to enclose string literals in double quotes (`"..."`) and escape any internal quotes if needed. + +Keep this guide handy for whenever you need to verify or debug your Neo4j lineage data! diff --git a/app/__init__.py b/app/__init__.py index b1b25db..9bf3355 100644 --- a/app/__init__.py +++ b/app/__init__.py @@ -3,7 +3,9 @@ import os import json import glob +import importlib import importlib.util + from flask import Flask from flask_sqlalchemy import SQLAlchemy from flask_migrate import Migrate @@ -11,11 +13,10 @@ from flask_login import LoginManager from flask_wtf.csrf import CSRFProtect from dotenv import load_dotenv +# Load environment variables from .env or system load_dotenv() -# ---------------------------------------------------------------- -# 1) Initialize core extensions -# ---------------------------------------------------------------- +# ─── Initialize core extensions ───────────────────────────────────────────────── db = SQLAlchemy() migrate = Migrate() login_manager = LoginManager() @@ -26,37 +27,43 @@ def create_app(): app = Flask(__name__) app.config.from_object('app.config.Config') - # Initialize extensions with app + # ─── Initialize extensions with the app ─────────────────────────────────────── csrf.init_app(app) db.init_app(app) migrate.init_app(app, db) login_manager.init_app(app) login_manager.login_view = 'auth.login' - # ---------------------------------------------------------------- - # 2) Register error handlers - # ---------------------------------------------------------------- + # ─── Register user_loader for Flask-Login ─────────────────────────────────── + from plugins.auth.models import User + + @login_manager.user_loader + def load_user(user_id): + try: + return User.query.get(int(user_id)) + except Exception: + return None + + # ─── Register error handlers ───────────────────────────────────────────────── from .errors import bp as errors_bp app.register_blueprint(errors_bp) - # 
---------------------------------------------------------------- - # 3) Auto-load each plugin’s models.py so that SQLAlchemy metadata - # knows about every table (Plant, PlantOwnershipLog, PlantUpdate, etc.) - # ---------------------------------------------------------------- - plugin_model_paths = glob.glob(os.path.join(os.path.dirname(__file__), '..', 'plugins', '*', 'models.py')) + # ─── 1) Auto‐import plugin models by their package names ───────────────────── + # This ensures that every plugins//models.py is imported exactly once + plugin_model_paths = glob.glob( + os.path.join(os.path.dirname(__file__), '..', 'plugins', '*', 'models.py') + ) for path in plugin_model_paths: - module_name = path.replace("/", ".").replace(".py", "") + # path looks like ".../plugins/plant/models.py" + rel = path.split(os.sep)[-2] # e.g. "plant" + pkg = f"plugins.{rel}.models" # e.g. "plugins.plant.models" try: - spec = importlib.util.spec_from_file_location(module_name, path) - mod = importlib.util.module_from_spec(spec) - spec.loader.exec_module(mod) - print(f"✅ (Startup) Loaded: {module_name}") + importlib.import_module(pkg) + print(f"✅ (Startup) Loaded: {pkg}") except Exception as e: - print(f"❌ (Startup) Failed to load {module_name}: {e}") + print(f"❌ (Startup) Failed to load {pkg}: {e}") - # ---------------------------------------------------------------- - # 4) Auto-discover & register each plugin’s routes.py and CLI - # ---------------------------------------------------------------- + # ─── 2) Auto‐discover & register plugin routes, CLI, entry‐points ──────────── plugin_path = os.path.abspath(os.path.join(os.path.dirname(__file__), '..', 'plugins')) for plugin in os.listdir(plugin_path): if plugin.endswith('.noload'): @@ -67,7 +74,7 @@ def create_app(): if not os.path.isdir(plugin_dir): continue - # --- (a) Register routes blueprint if present --- + # (a) Register routes.py route_file = os.path.join(plugin_dir, 'routes.py') if os.path.isfile(route_file): try: @@ -80,7 +87,7 @@ def create_app(): except Exception as e: print(f"❌ Failed to load routes from plugin '{plugin}': {e}") - # --- (b) Register CLI & entry point if present --- + # (b) Register CLI and entry‐point init_file = os.path.join(plugin_dir, '__init__.py') plugin_json = os.path.join(plugin_dir, 'plugin.json') if os.path.isfile(init_file): @@ -99,6 +106,7 @@ def create_app(): except Exception as e: print(f"❌ Failed to load CLI for plugin '{plugin}': {e}") + # ─── Inject current year into templates ──────────────────────────────────────── @app.context_processor def inject_current_year(): from datetime import datetime diff --git a/files.zip b/files.zip index fdafedc..6eee0a8 100644 Binary files a/files.zip and b/files.zip differ diff --git a/migrations/env.py b/migrations/env.py index cd0a6b3..cf86fdc 100644 --- a/migrations/env.py +++ b/migrations/env.py @@ -1,36 +1,38 @@ from __future__ import with_statement import os import logging -import importlib.util +import glob +import importlib + from alembic import context from sqlalchemy import engine_from_config, pool from logging.config import fileConfig + from flask import current_app from app import db # ----------------------------- -# 🔍 Automatically import all plugin models +# 🔍 Automatically import all plugin models under their real package name # ----------------------------- -import glob -import importlib.util - plugin_model_paths = glob.glob(os.path.join("plugins", "*", "models.py")) for path in plugin_model_paths: - module_name = path.replace("/", ".").replace(".py", "") + # e.g. 
path = "plugins/plant/models.py" + # We want to turn that into "plugins.plant.models" + rel = path[len("plugins/") : -len("/models.py")] # e.g. "plant" + pkg = f"plugins.{rel}.models" # e.g. "plugins.plant.models" try: - spec = importlib.util.spec_from_file_location(module_name, path) - module = importlib.util.module_from_spec(spec) - spec.loader.exec_module(module) - print(f"✅ Loaded: {module_name}") + importlib.import_module(pkg) + print(f"✅ Loaded: {pkg}") except Exception as e: - print(f"❌ Failed to load {module_name}: {e}") + print(f"❌ Failed to load {pkg}: {e}") # ----------------------------- config = context.config fileConfig(config.config_file_name) -logger = logging.getLogger('alembic.env') +logger = logging.getLogger("alembic.env") +# SQLAlchemy will look at this `target_metadata` when autogenerating target_metadata = db.metadata def run_migrations_offline(): diff --git a/migrations/versions/0fcf1e150ae2_auto.py b/migrations/versions/0fcf1e150ae2_auto.py new file mode 100644 index 0000000..9f17d44 --- /dev/null +++ b/migrations/versions/0fcf1e150ae2_auto.py @@ -0,0 +1,28 @@ +"""auto + +Revision ID: 0fcf1e150ae2 +Revises: 58516c9892e9 +Create Date: 2025-06-05 09:31:44.116783 + +""" +from alembic import op +import sqlalchemy as sa + + +# revision identifiers, used by Alembic. +revision = '0fcf1e150ae2' +down_revision = '58516c9892e9' +branch_labels = None +depends_on = None + + +def upgrade(): + # ### commands auto generated by Alembic - please adjust! ### + pass + # ### end Alembic commands ### + + +def downgrade(): + # ### commands auto generated by Alembic - please adjust! ### + pass + # ### end Alembic commands ### diff --git a/migrations/versions/373571dfe134_auto.py b/migrations/versions/373571dfe134_auto.py new file mode 100644 index 0000000..4b88b08 --- /dev/null +++ b/migrations/versions/373571dfe134_auto.py @@ -0,0 +1,70 @@ +"""auto + +Revision ID: 373571dfe134 +Revises: 0fcf1e150ae2 +Create Date: 2025-06-05 09:38:55.414193 + +""" +from alembic import op +import sqlalchemy as sa +from sqlalchemy.dialects import mysql + +# revision identifiers, used by Alembic. +revision = '373571dfe134' +down_revision = '0fcf1e150ae2' +branch_labels = None +depends_on = None + + +def upgrade(): + # ### commands auto generated by Alembic - please adjust! 
### + op.add_column('submission_images', sa.Column('file_url', sa.String(length=256), nullable=False)) + op.add_column('submission_images', sa.Column('uploaded_at', sa.DateTime(), nullable=True)) + op.drop_column('submission_images', 'is_visible') + op.drop_column('submission_images', 'file_path') + op.add_column('submissions', sa.Column('submitted_at', sa.DateTime(), nullable=True)) + op.add_column('submissions', sa.Column('plant_name', sa.String(length=100), nullable=False)) + op.add_column('submissions', sa.Column('approved', sa.Boolean(), nullable=True)) + op.add_column('submissions', sa.Column('approved_at', sa.DateTime(), nullable=True)) + op.add_column('submissions', sa.Column('reviewed_by', sa.Integer(), nullable=True)) + op.drop_constraint(op.f('submissions_ibfk_1'), 'submissions', type_='foreignkey') + op.create_foreign_key(None, 'submissions', 'users', ['reviewed_by'], ['id']) + op.drop_column('submissions', 'common_name') + op.drop_column('submissions', 'height') + op.drop_column('submissions', 'container_size') + op.drop_column('submissions', 'timestamp') + op.drop_column('submissions', 'price') + op.drop_column('submissions', 'plant_id') + op.drop_column('submissions', 'width') + op.drop_column('submissions', 'health_status') + op.drop_column('submissions', 'leaf_count') + op.drop_column('submissions', 'potting_mix') + op.drop_column('submissions', 'source') + # ### end Alembic commands ### + + +def downgrade(): + # ### commands auto generated by Alembic - please adjust! ### + op.add_column('submissions', sa.Column('source', mysql.VARCHAR(length=120), nullable=True)) + op.add_column('submissions', sa.Column('potting_mix', mysql.VARCHAR(length=255), nullable=True)) + op.add_column('submissions', sa.Column('leaf_count', mysql.INTEGER(), autoincrement=False, nullable=True)) + op.add_column('submissions', sa.Column('health_status', mysql.VARCHAR(length=50), nullable=True)) + op.add_column('submissions', sa.Column('width', mysql.FLOAT(), nullable=True)) + op.add_column('submissions', sa.Column('plant_id', mysql.INTEGER(), autoincrement=False, nullable=True)) + op.add_column('submissions', sa.Column('price', mysql.FLOAT(), nullable=False)) + op.add_column('submissions', sa.Column('timestamp', mysql.DATETIME(), nullable=True)) + op.add_column('submissions', sa.Column('container_size', mysql.VARCHAR(length=120), nullable=True)) + op.add_column('submissions', sa.Column('height', mysql.FLOAT(), nullable=True)) + op.add_column('submissions', sa.Column('common_name', mysql.VARCHAR(length=120), nullable=False)) + op.drop_constraint(None, 'submissions', type_='foreignkey') + op.create_foreign_key(op.f('submissions_ibfk_1'), 'submissions', 'plant', ['plant_id'], ['id']) + op.drop_column('submissions', 'reviewed_by') + op.drop_column('submissions', 'approved_at') + op.drop_column('submissions', 'approved') + op.drop_column('submissions', 'plant_name') + op.drop_column('submissions', 'submitted_at') + op.add_column('submission_images', sa.Column('file_path', mysql.VARCHAR(length=255), nullable=False)) + op.add_column('submission_images', sa.Column('is_visible', mysql.TINYINT(display_width=1), autoincrement=False, nullable=True)) + op.drop_column('submission_images', 'uploaded_at') + op.drop_column('submission_images', 'file_url') + # ### end Alembic commands ### diff --git a/migrations/versions/401f262d79cc_auto.py b/migrations/versions/401f262d79cc_auto.py new file mode 100644 index 0000000..b978aa9 --- /dev/null +++ b/migrations/versions/401f262d79cc_auto.py @@ -0,0 +1,28 @@ +"""auto + 
+Revision ID: 401f262d79cc +Revises: 583fab3f9f80 +Create Date: 2025-06-05 04:48:49.440383 + +""" +from alembic import op +import sqlalchemy as sa + + +# revision identifiers, used by Alembic. +revision = '401f262d79cc' +down_revision = '583fab3f9f80' +branch_labels = None +depends_on = None + + +def upgrade(): + # ### commands auto generated by Alembic - please adjust! ### + pass + # ### end Alembic commands ### + + +def downgrade(): + # ### commands auto generated by Alembic - please adjust! ### + pass + # ### end Alembic commands ### diff --git a/migrations/versions/4bdec754b085_auto.py b/migrations/versions/4bdec754b085_auto.py new file mode 100644 index 0000000..ba4aa01 --- /dev/null +++ b/migrations/versions/4bdec754b085_auto.py @@ -0,0 +1,28 @@ +"""auto + +Revision ID: 4bdec754b085 +Revises: 27a65a4e055c +Create Date: 2025-06-05 04:34:19.085549 + +""" +from alembic import op +import sqlalchemy as sa + + +# revision identifiers, used by Alembic. +revision = '4bdec754b085' +down_revision = '27a65a4e055c' +branch_labels = None +depends_on = None + + +def upgrade(): + # ### commands auto generated by Alembic - please adjust! ### + pass + # ### end Alembic commands ### + + +def downgrade(): + # ### commands auto generated by Alembic - please adjust! ### + pass + # ### end Alembic commands ### diff --git a/migrations/versions/501b54868875_auto.py b/migrations/versions/501b54868875_auto.py new file mode 100644 index 0000000..ef2eda8 --- /dev/null +++ b/migrations/versions/501b54868875_auto.py @@ -0,0 +1,28 @@ +"""auto + +Revision ID: 501b54868875 +Revises: 401f262d79cc +Create Date: 2025-06-05 04:51:52.183453 + +""" +from alembic import op +import sqlalchemy as sa + + +# revision identifiers, used by Alembic. +revision = '501b54868875' +down_revision = '401f262d79cc' +branch_labels = None +depends_on = None + + +def upgrade(): + # ### commands auto generated by Alembic - please adjust! ### + pass + # ### end Alembic commands ### + + +def downgrade(): + # ### commands auto generated by Alembic - please adjust! ### + pass + # ### end Alembic commands ### diff --git a/migrations/versions/583fab3f9f80_auto.py b/migrations/versions/583fab3f9f80_auto.py new file mode 100644 index 0000000..fe2425d --- /dev/null +++ b/migrations/versions/583fab3f9f80_auto.py @@ -0,0 +1,28 @@ +"""auto + +Revision ID: 583fab3f9f80 +Revises: 64ec4065d18d +Create Date: 2025-06-05 04:47:05.679772 + +""" +from alembic import op +import sqlalchemy as sa + + +# revision identifiers, used by Alembic. +revision = '583fab3f9f80' +down_revision = '64ec4065d18d' +branch_labels = None +depends_on = None + + +def upgrade(): + # ### commands auto generated by Alembic - please adjust! ### + pass + # ### end Alembic commands ### + + +def downgrade(): + # ### commands auto generated by Alembic - please adjust! ### + pass + # ### end Alembic commands ### diff --git a/migrations/versions/58516c9892e9_auto.py b/migrations/versions/58516c9892e9_auto.py new file mode 100644 index 0000000..25f0e73 --- /dev/null +++ b/migrations/versions/58516c9892e9_auto.py @@ -0,0 +1,28 @@ +"""auto + +Revision ID: 58516c9892e9 +Revises: 85da58851d35 +Create Date: 2025-06-05 05:28:30.947641 + +""" +from alembic import op +import sqlalchemy as sa + + +# revision identifiers, used by Alembic. +revision = '58516c9892e9' +down_revision = '85da58851d35' +branch_labels = None +depends_on = None + + +def upgrade(): + # ### commands auto generated by Alembic - please adjust! 
### + pass + # ### end Alembic commands ### + + +def downgrade(): + # ### commands auto generated by Alembic - please adjust! ### + pass + # ### end Alembic commands ### diff --git a/migrations/versions/5c85ebc9451b_auto.py b/migrations/versions/5c85ebc9451b_auto.py new file mode 100644 index 0000000..1588122 --- /dev/null +++ b/migrations/versions/5c85ebc9451b_auto.py @@ -0,0 +1,28 @@ +"""auto + +Revision ID: 5c85ebc9451b +Revises: d8bfe4d4c083 +Create Date: 2025-06-05 09:47:14.478039 + +""" +from alembic import op +import sqlalchemy as sa + + +# revision identifiers, used by Alembic. +revision = '5c85ebc9451b' +down_revision = 'd8bfe4d4c083' +branch_labels = None +depends_on = None + + +def upgrade(): + # ### commands auto generated by Alembic - please adjust! ### + pass + # ### end Alembic commands ### + + +def downgrade(): + # ### commands auto generated by Alembic - please adjust! ### + pass + # ### end Alembic commands ### diff --git a/migrations/versions/64ec4065d18d_auto.py b/migrations/versions/64ec4065d18d_auto.py new file mode 100644 index 0000000..3e11054 --- /dev/null +++ b/migrations/versions/64ec4065d18d_auto.py @@ -0,0 +1,28 @@ +"""auto + +Revision ID: 64ec4065d18d +Revises: 4bdec754b085 +Create Date: 2025-06-05 04:40:02.186807 + +""" +from alembic import op +import sqlalchemy as sa + + +# revision identifiers, used by Alembic. +revision = '64ec4065d18d' +down_revision = '4bdec754b085' +branch_labels = None +depends_on = None + + +def upgrade(): + # ### commands auto generated by Alembic - please adjust! ### + pass + # ### end Alembic commands ### + + +def downgrade(): + # ### commands auto generated by Alembic - please adjust! ### + pass + # ### end Alembic commands ### diff --git a/migrations/versions/72455429fdaf_auto.py b/migrations/versions/72455429fdaf_auto.py new file mode 100644 index 0000000..9cb4fb5 --- /dev/null +++ b/migrations/versions/72455429fdaf_auto.py @@ -0,0 +1,46 @@ +"""auto + +Revision ID: 72455429fdaf +Revises: 501b54868875 +Create Date: 2025-06-05 05:07:43.605568 + +""" +from alembic import op +import sqlalchemy as sa +from sqlalchemy.dialects import mysql + +# revision identifiers, used by Alembic. +revision = '72455429fdaf' +down_revision = '501b54868875' +branch_labels = None +depends_on = None + + +def upgrade(): + # ### commands auto generated by Alembic - please adjust! ### + op.create_table('transfer_request', + sa.Column('id', sa.Integer(), autoincrement=True, nullable=False), + sa.Column('plant_id', sa.Integer(), nullable=False), + sa.Column('seller_id', sa.Integer(), nullable=False), + sa.Column('buyer_id', sa.Integer(), nullable=False), + sa.Column('status', sa.String(length=20), nullable=False), + sa.Column('created_at', sa.DateTime(), nullable=True), + sa.Column('updated_at', sa.DateTime(), nullable=True), + sa.Column('seller_message', sa.String(length=512), nullable=True), + sa.Column('buyer_message', sa.String(length=512), nullable=True), + sa.ForeignKeyConstraint(['buyer_id'], ['users.id'], ), + sa.ForeignKeyConstraint(['plant_id'], ['plant.id'], ), + sa.ForeignKeyConstraint(['seller_id'], ['users.id'], ), + sa.PrimaryKeyConstraint('id') + ) + op.add_column('plant', sa.Column('data_verified', sa.Boolean(), nullable=False)) + op.drop_column('plant_ownership_log', 'graph_node_id') + # ### end Alembic commands ### + + +def downgrade(): + # ### commands auto generated by Alembic - please adjust! 
### + op.add_column('plant_ownership_log', sa.Column('graph_node_id', mysql.VARCHAR(length=255), nullable=True)) + op.drop_column('plant', 'data_verified') + op.drop_table('transfer_request') + # ### end Alembic commands ### diff --git a/migrations/versions/7dbb6d550055_auto.py b/migrations/versions/7dbb6d550055_auto.py new file mode 100644 index 0000000..2ec7b17 --- /dev/null +++ b/migrations/versions/7dbb6d550055_auto.py @@ -0,0 +1,28 @@ +"""auto + +Revision ID: 7dbb6d550055 +Revises: 72455429fdaf +Create Date: 2025-06-05 05:10:43.392181 + +""" +from alembic import op +import sqlalchemy as sa + + +# revision identifiers, used by Alembic. +revision = '7dbb6d550055' +down_revision = '72455429fdaf' +branch_labels = None +depends_on = None + + +def upgrade(): + # ### commands auto generated by Alembic - please adjust! ### + pass + # ### end Alembic commands ### + + +def downgrade(): + # ### commands auto generated by Alembic - please adjust! ### + pass + # ### end Alembic commands ### diff --git a/migrations/versions/85da58851d35_auto.py b/migrations/versions/85da58851d35_auto.py new file mode 100644 index 0000000..d64115a --- /dev/null +++ b/migrations/versions/85da58851d35_auto.py @@ -0,0 +1,28 @@ +"""auto + +Revision ID: 85da58851d35 +Revises: 8cd29b8fb6ec +Create Date: 2025-06-05 05:20:46.638884 + +""" +from alembic import op +import sqlalchemy as sa + + +# revision identifiers, used by Alembic. +revision = '85da58851d35' +down_revision = '8cd29b8fb6ec' +branch_labels = None +depends_on = None + + +def upgrade(): + # ### commands auto generated by Alembic - please adjust! ### + pass + # ### end Alembic commands ### + + +def downgrade(): + # ### commands auto generated by Alembic - please adjust! ### + pass + # ### end Alembic commands ### diff --git a/migrations/versions/8cd29b8fb6ec_auto.py b/migrations/versions/8cd29b8fb6ec_auto.py new file mode 100644 index 0000000..45f24d0 --- /dev/null +++ b/migrations/versions/8cd29b8fb6ec_auto.py @@ -0,0 +1,28 @@ +"""auto + +Revision ID: 8cd29b8fb6ec +Revises: 7dbb6d550055 +Create Date: 2025-06-05 05:12:50.608338 + +""" +from alembic import op +import sqlalchemy as sa + + +# revision identifiers, used by Alembic. +revision = '8cd29b8fb6ec' +down_revision = '7dbb6d550055' +branch_labels = None +depends_on = None + + +def upgrade(): + # ### commands auto generated by Alembic - please adjust! ### + pass + # ### end Alembic commands ### + + +def downgrade(): + # ### commands auto generated by Alembic - please adjust! ### + pass + # ### end Alembic commands ### diff --git a/migrations/versions/c9495b058ab0_auto.py b/migrations/versions/c9495b058ab0_auto.py new file mode 100644 index 0000000..1e1bb15 --- /dev/null +++ b/migrations/versions/c9495b058ab0_auto.py @@ -0,0 +1,28 @@ +"""auto + +Revision ID: c9495b058ab0 +Revises: 373571dfe134 +Create Date: 2025-06-05 09:42:35.228096 + +""" +from alembic import op +import sqlalchemy as sa + + +# revision identifiers, used by Alembic. +revision = 'c9495b058ab0' +down_revision = '373571dfe134' +branch_labels = None +depends_on = None + + +def upgrade(): + # ### commands auto generated by Alembic - please adjust! ### + pass + # ### end Alembic commands ### + + +def downgrade(): + # ### commands auto generated by Alembic - please adjust! 
### + pass + # ### end Alembic commands ### diff --git a/migrations/versions/d8bfe4d4c083_auto.py b/migrations/versions/d8bfe4d4c083_auto.py new file mode 100644 index 0000000..299040c --- /dev/null +++ b/migrations/versions/d8bfe4d4c083_auto.py @@ -0,0 +1,28 @@ +"""auto + +Revision ID: d8bfe4d4c083 +Revises: c9495b058ab0 +Create Date: 2025-06-05 09:44:47.740029 + +""" +from alembic import op +import sqlalchemy as sa + + +# revision identifiers, used by Alembic. +revision = 'd8bfe4d4c083' +down_revision = 'c9495b058ab0' +branch_labels = None +depends_on = None + + +def upgrade(): + # ### commands auto generated by Alembic - please adjust! ### + pass + # ### end Alembic commands ### + + +def downgrade(): + # ### commands auto generated by Alembic - please adjust! ### + pass + # ### end Alembic commands ### diff --git a/plugins/auth/models.py b/plugins/auth/models.py index 6570644..431ac50 100644 --- a/plugins/auth/models.py +++ b/plugins/auth/models.py @@ -15,9 +15,21 @@ class User(db.Model, UserMixin): excluded_from_analytics = db.Column(db.Boolean, default=False) created_at = db.Column(db.DateTime, default=datetime.utcnow) - # Optional: relationship to submissions - submissions = db.relationship('Submission', backref='user', lazy=True) - + # Use back_populates, not backref + submitted_submissions = db.relationship( + "Submission", + foreign_keys="Submission.user_id", + back_populates="submitter", + lazy=True + ) + + reviewed_submissions = db.relationship( + "Submission", + foreign_keys="Submission.reviewed_by", + back_populates="reviewer", + lazy=True + ) + def set_password(self, password): self.password_hash = generate_password_hash(password) diff --git a/plugins/importer/routes.py b/plugins/importer/routes.py index b0b322a..aa08b21 100644 --- a/plugins/importer/routes.py +++ b/plugins/importer/routes.py @@ -4,6 +4,7 @@ import csv import io import difflib +from datetime import datetime from flask import Blueprint, request, render_template, redirect, flash, session, url_for from flask_login import login_required, current_user from flask_wtf.csrf import generate_csrf @@ -11,109 +12,109 @@ from flask_wtf.csrf import generate_csrf from app.neo4j_utils import get_neo4j_handler from plugins.plant.models import ( db, - Plant, PlantCommonName, PlantScientificName, PlantOwnershipLog + Plant, + PlantCommonName, + PlantScientificName, + PlantOwnershipLog ) -bp = Blueprint("importer", __name__, template_folder="templates", url_prefix="/import") +bp = Blueprint('importer', __name__, template_folder='templates', url_prefix='/import') -REQUIRED_HEADERS = {"uuid", "plant_type", "name"} +# ──────────────────────────────────────────────────────────────────────────────── +# Redirect “/import/” → “/import/upload” +# ──────────────────────────────────────────────────────────────────────────────── + +@bp.route("/", methods=["GET"]) +@login_required +def index(): + # When someone hits /import, send them to /import/upload + return redirect(url_for("importer.upload")) -@bp.route("/", methods=["GET", "POST"]) +# ──────────────────────────────────────────────────────────────────────────────── +# Required CSV headers for import +# ──────────────────────────────────────────────────────────────────────────────── +REQUIRED_HEADERS = {"uuid", "plant_type", "name", "scientific_name", "mother_uuid"} + + +@bp.route("/upload", methods=["GET", "POST"]) @login_required def upload(): if request.method == "POST": file = request.files.get("file") if not file: - flash("No file uploaded.", "error") + flash("No file selected", 
"error") return redirect(request.url) + # Decode as UTF-8-SIG to strip any BOM, then parse with csv.DictReader try: - decoded = file.read().decode("utf-8-sig") - stream = io.StringIO(decoded) + stream = io.StringIO(file.stream.read().decode("utf-8-sig")) reader = csv.DictReader(stream) - - headers = set(reader.fieldnames or []) - missing = REQUIRED_HEADERS - headers - if missing: - flash(f"Missing required CSV headers: {missing}", "error") - return redirect(request.url) - - session["pending_rows"] = [] - review_list = [] - - # Preload existing common/scientific names - all_common = {c.name.lower(): c for c in PlantCommonName.query.all()} - all_scientific = {s.name.lower(): s for s in PlantScientificName.query.all()} - - for row in reader: - uuid_raw = row.get("uuid", "") - uuid = uuid_raw.strip().strip('"') - - name_raw = row.get("name", "") - name = name_raw.strip() - - sci_raw = row.get("scientific_name", "") - sci_name = sci_raw.strip() - - plant_type = row.get("plant_type", "").strip() or "plant" - - mother_raw = row.get("mother_uuid", "") - mother_uuid = mother_raw.strip().strip('"') - - # If any required field is missing, skip - if not (uuid and name and plant_type): - continue - - # Try fuzzy‐matching scientific names if needed - suggested_match = None - original_sci = sci_name - name_lc = name.lower() - sci_lc = sci_name.lower() - - if sci_lc and sci_lc not in all_scientific: - close = difflib.get_close_matches(sci_lc, all_scientific.keys(), n=1, cutoff=0.85) - if close: - suggested_match = all_scientific[close[0]].name - - if not sci_lc and name_lc in all_common: - sci_obj = PlantScientificName.query.filter_by(common_id=all_common[name_lc].id).first() - if sci_obj: - sci_name = sci_obj.name - elif not sci_lc: - close_common = difflib.get_close_matches(name_lc, all_common.keys(), n=1, cutoff=0.85) - if close_common: - match_name = close_common[0] - sci_obj = PlantScientificName.query.filter_by(common_id=all_common[match_name].id).first() - if sci_obj: - suggested_match = sci_obj.name - sci_name = sci_obj.name - - session["pending_rows"].append({ - "uuid": uuid, - "name": name, - "sci_name": sci_name, - "original_sci_name": original_sci, - "plant_type": plant_type, - "mother_uuid": mother_uuid, - "suggested_scientific_name": suggested_match, - }) - - if suggested_match and suggested_match != original_sci: - review_list.append({ - "uuid": uuid, - "common_name": name, - "user_input": original_sci or "(blank)", - "suggested_name": suggested_match - }) - - session["review_list"] = review_list - return redirect(url_for("importer.review")) - - except Exception as e: - flash(f"Import failed: {e}", "error") + except Exception: + flash("Failed to read CSV file. 
Ensure it is valid UTF-8.", "error") return redirect(request.url) + headers = set(reader.fieldnames or []) + missing = REQUIRED_HEADERS - headers + if missing: + flash(f"Missing required CSV headers: {missing}", "error") + return redirect(request.url) + + # Prepare session storage for the rows under review + session["pending_rows"] = [] + review_list = [] + + # Preload existing common/scientific names (lowercased keys for fuzzy matching) + all_common = {c.name.lower(): c for c in PlantCommonName.query.all()} + all_scientific = {s.name.lower(): s for s in PlantScientificName.query.all()} + + for row in reader: + uuid_raw = row.get("uuid", "") + uuid = uuid_raw.strip().strip('"') + + name_raw = row.get("name", "") + name = name_raw.strip() + + sci_raw = row.get("scientific_name", "") + sci_name = sci_raw.strip() + + plant_type = row.get("plant_type", "").strip() or "plant" + + mother_raw = row.get("mother_uuid", "") + mother_uuid = mother_raw.strip().strip('"') + + # Skip any row where required fields are missing + if not (uuid and name and plant_type): + continue + + # ─── If the scientific name doesn’t match exactly, suggest a close match ───── + # Only suggest if the “closest key” differs from the raw input: + suggestions = difflib.get_close_matches( + sci_name.lower(), + list(all_scientific.keys()), + n=1, + cutoff=0.8 + ) + if suggestions and suggestions[0] != sci_name.lower(): + suggested = all_scientific[suggestions[0]].name + else: + suggested = None + + review_item = { + "uuid": uuid, + "name": name, + "sci_name": sci_name, + "suggested": suggested, + "plant_type": plant_type, + "mother_uuid": mother_uuid + } + review_list.append(review_item) + session["pending_rows"].append(review_item) + + session["review_list"] = review_list + return redirect(url_for("importer.review")) + + # GET → show upload form return render_template("importer/upload.html", csrf_token=generate_csrf()) @@ -127,107 +128,97 @@ def review(): neo = get_neo4j_handler() added = 0 - # ————————————————————————————————————————————— - # (1) CREATE MySQL records & MERGE every Neo4j node - # ————————————————————————————————————————————— + # Re-load preload maps to avoid NameError if used below + all_common = {c.name.lower(): c for c in PlantCommonName.query.all()} + all_scientific = {s.name.lower(): s for s in PlantScientificName.query.all()} + for row in rows: - uuid_raw = row["uuid"] - uuid = uuid_raw.strip().strip('"') + uuid = row.get("uuid") + name = row.get("name") + sci_name = row.get("sci_name") + suggested = row.get("suggested") + plant_type = row.get("plant_type") + mother_uuid = row.get("mother_uuid") - name_raw = row["name"] - name = name_raw.strip() + # Check if user clicked "confirm" for a suggested scientific name + accepted_key = f"confirm_{uuid}" + accepted = request.form.get(accepted_key) - sci_raw = row["sci_name"] - sci_name = sci_raw.strip() - - plant_type = row["plant_type"].strip() - - mother_raw = row["mother_uuid"] - mother_uuid = mother_raw.strip().strip('"') - - suggested = row.get("suggested_scientific_name") - - # ——— MySQL: PlantCommonName ——— + # ─── MySQL: PlantCommonName ──────────────────────────────────────────────── common = PlantCommonName.query.filter_by(name=name).first() if not common: common = PlantCommonName(name=name) db.session.add(common) db.session.flush() + all_common[common.name.lower()] = common + else: + all_common[common.name.lower()] = common - # ——— MySQL: PlantScientificName ——— - accepted = request.form.get(f"confirm_{uuid}") + # ─── MySQL: PlantScientificName 
─────────────────────────────────────────── sci_to_use = suggested if (suggested and accepted) else sci_name - scientific = PlantScientificName.query.filter_by(name=sci_to_use).first() if not scientific: - scientific = PlantScientificName(name=sci_to_use, common_id=common.id) + scientific = PlantScientificName( + name = sci_to_use, + common_id = common.id + ) db.session.add(scientific) db.session.flush() + all_scientific[scientific.name.lower()] = scientific + else: + all_scientific[scientific.name.lower()] = scientific - # ——— MySQL: Plant row ——— + # ─── Decide if this plant’s data is “verified” by the user ──────────────── + data_verified = False + if (not suggested) or (suggested and accepted): + data_verified = True + + # ─── MySQL: Plant record ───────────────────────────────────────────────── plant = Plant.query.filter_by(uuid=uuid).first() if not plant: plant = Plant( - uuid=uuid, - common_id=common.id, - scientific_id=scientific.id, - plant_type=plant_type, - owner_id=current_user.id, - is_verified=bool(accepted) + uuid = uuid, + common_id = common.id, + scientific_id = scientific.id, + plant_type = plant_type, + owner_id = current_user.id, + data_verified = data_verified ) db.session.add(plant) - db.session.flush() # so plant.id is available immediately - added += 1 + db.session.flush() # so plant.id is now available - # ——— MySQL: Create initial ownership log entry ——— log = PlantOwnershipLog( plant_id = plant.id, user_id = current_user.id, date_acquired = datetime.utcnow(), transferred = False, - is_verified = bool(accepted) + is_verified = data_verified ) db.session.add(log) + added += 1 + else: + # Skip duplicates if the same UUID already exists + pass - # ——— Neo4j: ensure a node exists for this plant UUID ——— + # ─── Neo4j: ensure the Plant node exists ───────────────────────────────── neo.create_plant_node(uuid, name) - # Commit MySQL so that all Plant/OwnershipLog rows exist - db.session.commit() - - - # ————————————————————————————————————————————— - # (2) CREATE Neo4j LINEAGE relationships (child → parent). 
(Unchanged) - # ————————————————————————————————————————————— - for row in rows: - child_raw = row.get("uuid", "") - child_uuid = child_raw.strip().strip('"') - - mother_raw = row.get("mother_uuid", "") - mother_uuid = mother_raw.strip().strip('"') - - print( - f"[DEBUG] row → child_raw={child_raw!r}, child_uuid={child_uuid!r}; " - f"mother_raw={mother_raw!r}, mother_uuid={mother_uuid!r}" - ) - + # ─── Neo4j: create a LINEAGE relationship if mother_uuid was provided ───── if mother_uuid: - neo.create_plant_node(mother_uuid, name="Unknown") - neo.create_lineage(child_uuid, mother_uuid) - else: - print(f"[DEBUG] Skipping LINEAGE creation for child {child_uuid!r} (no mother_uuid)") - - # (Optional) Check two known UUIDs - neo.debug_check_node("8b1059c8-8dd3-487a-af19-1eb548788e87") - neo.debug_check_node("2ee2e0e7-69de-4d8f-abfe-4ed973c3d760") + # Replace the old call with the correct method name: + neo.create_lineage(child_uuid=uuid, parent_uuid=mother_uuid) + # Commit all MySQL changes at once + db.session.commit() neo.close() - flash(f"{added} plants added (MySQL) + Neo4j nodes/relations created.", "success") + + flash(f"{added} plants added (MySQL) and Neo4j nodes/relationships created.", "success") session.pop("pending_rows", None) session.pop("review_list", None) return redirect(url_for("importer.upload")) + # GET → re-render the review page with the same review_list return render_template( "importer/review.html", review_list=review_list, diff --git a/plugins/importer/templates/importer/review.html b/plugins/importer/templates/importer/review.html index 45eb29c..3979317 100644 --- a/plugins/importer/templates/importer/review.html +++ b/plugins/importer/templates/importer/review.html @@ -1,17 +1,39 @@ {% extends "core_ui/base.html" %} -{% block title %}Review Matches{% endblock %} +{% block title %}Review Suggested Matches{% endblock %} + {% block content %}

🔍 Review Suggested Matches

-
-
-  {% if review_list %}
-

Confirm the suggested scientific name replacements below. Only confirmed matches will override user input.

-
+

+    Confirm the suggested scientific‐name replacements below.
+    Only checked boxes (“Confirm”) will override the raw user input.
+

+
+  {# Display flash messages (error, success, etc.) #}
+  {% with messages = get_flashed_messages(with_categories=true) %}
+    {% if messages %}
+      {% for category, message in messages %}
+
+      {% endfor %}
+    {% endif %}
+  {% endwith %}
+
+  {% if review_list and review_list|length > 0 %}
+
+    {# Hidden CSRF token #}
+
+
+
          Common Name
-         User Input
+         User Input (Scientific Name)
          Suggested Match
          Confirm
@@ -19,20 +41,32 @@
      {% for row in review_list %}
{{ row.common_name }}{{ row.user_input }}{{ row.suggested_name }}{{ row.name }}{{ row.sci_name }}{{ row.suggested or '-' }} - + {% if row.suggested %} + + {% else %} + — + {% endif %}
-  {% else %}
-

No matches found that need confirmation.

-  {% endif %}
-
-
+
+
+      Cancel
+
+  {% else %}
+
+      No rows to review. Upload another CSV?
+
+ {% endif %}
{% endblock %}
diff --git a/plugins/importer/templates/importer/upload.html b/plugins/importer/templates/importer/upload.html
index 3ff1139..95ced18 100644
--- a/plugins/importer/templates/importer/upload.html
+++ b/plugins/importer/templates/importer/upload.html
@@ -1,28 +1,45 @@
 {% extends "core_ui/base.html" %}
 {% block title %}CSV Import{% endblock %}
+
 {% block content %}

📤 Import Plant Data

+ + {# Display flash messages (error, success, etc.) #} {% with messages = get_flashed_messages(with_categories=true) %} {% if messages %} {% for category, message in messages %} - diff --git a/plugins/media/routes.py b/plugins/media/routes.py index dd684f4..6f9a40b 100644 --- a/plugins/media/routes.py +++ b/plugins/media/routes.py @@ -1,5 +1,78 @@ -from flask import Blueprint +from flask import Blueprint, render_template, request, redirect, url_for, flash, jsonify, send_from_directory, current_app +from flask_login import current_user, login_required +from werkzeug.utils import secure_filename +import os +from app import db +from .models import Media, ImageHeart, FeaturedImage +from plugins.plant.models import Plant -media_bp = Blueprint('media', __name__) +bp = Blueprint("media", __name__, template_folder="templates") -# Add routes here as needed; do NOT define models here. +UPLOAD_FOLDER = "static/uploads" +ALLOWED_EXTENSIONS = {"png", "jpg", "jpeg", "gif"} + +def allowed_file(filename): + return "." in filename and filename.rsplit(".", 1)[1].lower() in ALLOWED_EXTENSIONS + +@bp.route("/media/upload", methods=["GET", "POST"]) +@login_required +def upload_media(): + if request.method == "POST": + file = request.files.get("image") + caption = request.form.get("caption") + plant_id = request.form.get("plant_id") + + if file and allowed_file(file.filename): + filename = secure_filename(file.filename) + save_path = os.path.join(current_app.root_path, UPLOAD_FOLDER) + os.makedirs(save_path, exist_ok=True) + file.save(os.path.join(save_path, filename)) + + media = Media(file_url=f"{UPLOAD_FOLDER}/{filename}", caption=caption, plant_id=plant_id) + db.session.add(media) + db.session.commit() + + flash("Image uploaded successfully.", "success") + return redirect(url_for("media.upload_media")) + else: + flash("Invalid file or no file uploaded.", "danger") + + return render_template("media/upload.html") + +@bp.route("/media/files/") +def media_file(filename): + return send_from_directory(os.path.join(current_app.root_path, "static/uploads"), filename) + +@bp.route("/media/heart/", methods=["POST"]) +@login_required +def toggle_heart(image_id): + existing = ImageHeart.query.filter_by(user_id=current_user.id, submission_image_id=image_id).first() + if existing: + db.session.delete(existing) + db.session.commit() + return jsonify({"status": "unhearted"}) + else: + heart = ImageHeart(user_id=current_user.id, submission_image_id=image_id) + db.session.add(heart) + db.session.commit() + return jsonify({"status": "hearted"}) + +@bp.route("/media/feature/", methods=["POST"]) +@login_required +def set_featured_image(image_id): + image = Media.query.get_or_404(image_id) + plant = image.plant + if not plant: + flash("This image is not linked to a plant.", "danger") + return redirect(request.referrer or url_for("core_ui.home")) + + if current_user.id != plant.owner_id and current_user.role != "admin": + flash("Not authorized to set featured image.", "danger") + return redirect(request.referrer or url_for("core_ui.home")) + + FeaturedImage.query.filter_by(submission_image_id=image_id).delete() + featured = FeaturedImage(submission_image_id=image_id, is_featured=True) + db.session.add(featured) + db.session.commit() + flash("Image set as featured.", "success") + return redirect(request.referrer or url_for("core_ui.home")) diff --git a/plugins/plant/models.py b/plugins/plant/models.py index f0a94fa..14e7bec 100644 --- a/plugins/plant/models.py +++ b/plugins/plant/models.py @@ -2,18 +2,11 @@ from datetime import 
datetime import uuid as uuid_lib - -# Import the central SQLAlchemy instance, not a new one from app import db +# from plugins.auth.models import User -# If your User model lives in plugins/auth/models.py, import it here: -from plugins.auth.models import User -# ----------------------------- -# (We no longer need PlantLineage) -# ----------------------------- - -# Association table for tags (unchanged) +# Association table for Plant ↔ Tag (unchanged) plant_tags = db.Table( 'plant_tags', db.metadata, @@ -30,7 +23,6 @@ class Tag(db.Model): name = db.Column(db.String(128), unique=True, nullable=False) # … any other columns you had … - class PlantCommonName(db.Model): __tablename__ = 'plant_common_name' __table_args__ = {'extend_existing': True} @@ -46,7 +38,6 @@ class PlantCommonName(db.Model): cascade='all, delete-orphan' ) - class PlantScientificName(db.Model): __tablename__ = 'plant_scientific_name' __table_args__ = {'extend_existing': True} @@ -56,12 +47,8 @@ class PlantScientificName(db.Model): common_id = db.Column(db.Integer, db.ForeignKey('plant_common_name.id'), nullable=False) created_at = db.Column(db.DateTime, default=datetime.utcnow) - plants = db.relationship( - 'plugins.plant.models.Plant', - backref='scientific', - lazy='dynamic' - ) - + # We removed the “plants” relationship from here to avoid backref conflicts. + # If you need it, you can still do Plant.query.filter_by(scientific_id=). class PlantOwnershipLog(db.Model): __tablename__ = 'plant_ownership_log' @@ -72,11 +59,16 @@ class PlantOwnershipLog(db.Model): user_id = db.Column(db.Integer, db.ForeignKey('users.id'), nullable=False) date_acquired = db.Column(db.DateTime, default=datetime.utcnow) transferred = db.Column(db.Boolean, default=False, nullable=False) - graph_node_id = db.Column(db.String(255), nullable=True) # optional is_verified = db.Column(db.Boolean, default=False, nullable=False) - user = db.relationship('plugins.auth.models.User', backref='ownership_logs', lazy=True) + # Optional: if you ever want to store a pointer to the Neo4j node, you can re-add: + # graph_node_id = db.Column(db.String(255), nullable=True) + user = db.relationship( + 'plugins.auth.models.User', + backref='ownership_logs', + lazy=True + ) class Plant(db.Model): __tablename__ = 'plant' @@ -94,6 +86,10 @@ class Plant(db.Model): created_at = db.Column(db.DateTime, default=datetime.utcnow) updated_at = db.Column(db.DateTime, onupdate=datetime.utcnow) + # ─── NEW: Flag that indicates whether the common/scientific name pair was human-verified ───────────────── + data_verified = db.Column(db.Boolean, default=False, nullable=False) + + # Relationships updates = db.relationship( 'plugins.growlog.models.PlantUpdate', backref='plant', diff --git a/plugins/submission/models.py b/plugins/submission/models.py index fc83ee2..4f54289 100644 --- a/plugins/submission/models.py +++ b/plugins/submission/models.py @@ -1,39 +1,35 @@ from datetime import datetime from app import db from plugins.plant.models import Plant - +from plugins.auth.models import User class Submission(db.Model): - __tablename__ = 'submissions' - __table_args__ = {'extend_existing': True} + __tablename__ = "submissions" + __table_args__ = {"extend_existing": True} id = db.Column(db.Integer, primary_key=True) - user_id = db.Column(db.Integer, db.ForeignKey('users.id'), nullable=False) - plant_id = db.Column(db.Integer, db.ForeignKey('plant.id'), nullable=True) + user_id = db.Column(db.Integer, db.ForeignKey("users.id"), nullable=False) + submitted_at = db.Column(db.DateTime, 
default=datetime.utcnow) + plant_name = db.Column(db.String(100), nullable=False) + scientific_name = db.Column(db.String(120), nullable=True) + notes = db.Column(db.Text, nullable=True) + approved = db.Column(db.Boolean, default=None) + approved_at = db.Column(db.DateTime, nullable=True) + reviewed_by = db.Column(db.Integer, db.ForeignKey("users.id"), nullable=True) - common_name = db.Column(db.String(120), nullable=False) - scientific_name = db.Column(db.String(120)) - price = db.Column(db.Float, nullable=False) - source = db.Column(db.String(120)) - timestamp = db.Column(db.DateTime, default=datetime.utcnow) + # Explicit bidirectional relationships + submitter = db.relationship("User", foreign_keys=[user_id], back_populates="submitted_submissions") + reviewer = db.relationship("User", foreign_keys=[reviewed_by], back_populates="reviewed_submissions") - height = db.Column(db.Float) - width = db.Column(db.Float) - leaf_count = db.Column(db.Integer) - potting_mix = db.Column(db.String(255)) - container_size = db.Column(db.String(120)) - health_status = db.Column(db.String(50)) - notes = db.Column(db.Text) + images = db.relationship("SubmissionImage", backref="submission", lazy=True) - # Image references via SubmissionImage table - images = db.relationship('SubmissionImage', backref='submission', lazy=True) class SubmissionImage(db.Model): - __tablename__ = 'submission_images' - __table_args__ = {'extend_existing': True} + __tablename__ = "submission_images" + __table_args__ = {"extend_existing": True} id = db.Column(db.Integer, primary_key=True) - submission_id = db.Column(db.Integer, db.ForeignKey('submissions.id'), nullable=False) - file_path = db.Column(db.String(255), nullable=False) - is_visible = db.Column(db.Boolean, default=True) + submission_id = db.Column(db.Integer, db.ForeignKey("submissions.id"), nullable=False) + file_url = db.Column(db.String(256), nullable=False) + uploaded_at = db.Column(db.DateTime, default=datetime.utcnow) diff --git a/plugins/submission/routes.py b/plugins/submission/routes.py index e0937e2..566a629 100644 --- a/plugins/submission/routes.py +++ b/plugins/submission/routes.py @@ -1,10 +1,78 @@ -from flask import Blueprint, render_template -from .models import Submission +from flask import Blueprint, render_template, request, redirect, url_for, flash, jsonify, send_from_directory, current_app +from flask_login import current_user, login_required +from werkzeug.utils import secure_filename +import os from app import db +from .models import SubmissionImage, ImageHeart, FeaturedImage +from plugins.plant.models import Plant -bp = Blueprint('submission', __name__, url_prefix='/submission') +bp = Blueprint("submission", __name__, template_folder="templates") -@bp.route('/') -def index(): - submissions = Submission.query.order_by(Submission.timestamp.desc()).all() - return render_template('submission/index.html', submissions=submissions) +UPLOAD_FOLDER = "static/uploads" +ALLOWED_EXTENSIONS = {"png", "jpg", "jpeg", "gif"} + +def allowed_file(filename): + return "." 
in filename and filename.rsplit(".", 1)[1].lower() in ALLOWED_EXTENSIONS + +@bp.route("/submissions/upload", methods=["GET", "POST"]) +@login_required +def upload_submissions(): + if request.method == "POST": + file = request.files.get("image") + caption = request.form.get("caption") + plant_id = request.form.get("plant_id") + + if file and allowed_file(file.filename): + filename = secure_filename(file.filename) + save_path = os.path.join(current_app.root_path, UPLOAD_FOLDER) + os.makedirs(save_path, exist_ok=True) + file.save(os.path.join(save_path, filename)) + + submissions = SubmissionImage(file_url=f"{UPLOAD_FOLDER}/{filename}", caption=caption, plant_id=plant_id) + db.session.add(submissions) + db.session.commit() + + flash("Image uploaded successfully.", "success") + return redirect(url_for("submissions.upload_submissions")) + else: + flash("Invalid file or no file uploaded.", "danger") + + return render_template("submissions/upload.html") + +@bp.route("/submissions/files/") +def submissions_file(filename): + return send_from_directory(os.path.join(current_app.root_path, "static/uploads"), filename) + +@bp.route("/submissions/heart/", methods=["POST"]) +@login_required +def toggle_heart(image_id): + existing = ImageHeart.query.filter_by(user_id=current_user.id, submission_image_id=image_id).first() + if existing: + db.session.delete(existing) + db.session.commit() + return jsonify({"status": "unhearted"}) + else: + heart = ImageHeart(user_id=current_user.id, submission_image_id=image_id) + db.session.add(heart) + db.session.commit() + return jsonify({"status": "hearted"}) + +@bp.route("/submissions/feature/", methods=["POST"]) +@login_required +def set_featured_image(image_id): + image = SubmissionImage.query.get_or_404(image_id) + plant = image.plant + if not plant: + flash("This image is not linked to a plant.", "danger") + return redirect(request.referrer or url_for("core_ui.home")) + + if current_user.id != plant.owner_id and current_user.role != "admin": + flash("Not authorized to set featured image.", "danger") + return redirect(request.referrer or url_for("core_ui.home")) + + FeaturedImage.query.filter_by(submission_image_id=image_id).delete() + featured = FeaturedImage(submission_image_id=image_id, is_featured=True) + db.session.add(featured) + db.session.commit() + flash("Image set as featured.", "success") + return redirect(request.referrer or url_for("core_ui.home")) \ No newline at end of file diff --git a/plugins/transfer/__init__.py b/plugins/transfer/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/plugins/transfer/models.py b/plugins/transfer/models.py new file mode 100644 index 0000000..96c26eb --- /dev/null +++ b/plugins/transfer/models.py @@ -0,0 +1,41 @@ +from datetime import datetime +from app import db + +class TransferRequest(db.Model): + __tablename__ = 'transfer_request' + __table_args__ = {'extend_existing': True} + + id = db.Column(db.Integer, primary_key=True, autoincrement=True) + plant_id = db.Column(db.Integer, db.ForeignKey('plant.id'), nullable=False) + seller_id = db.Column(db.Integer, db.ForeignKey('users.id'), nullable=False) + buyer_id = db.Column(db.Integer, db.ForeignKey('users.id'), nullable=False) + status = db.Column( + db.String(20), + nullable=False, + default='pending' + ) + created_at = db.Column(db.DateTime, default=datetime.utcnow) + updated_at = db.Column(db.DateTime, onupdate=datetime.utcnow) + seller_message = db.Column(db.String(512), nullable=True) + buyer_message = db.Column(db.String(512), nullable=True) + + 
diff --git a/plugins/transfer/__init__.py b/plugins/transfer/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/plugins/transfer/models.py b/plugins/transfer/models.py
new file mode 100644
index 0000000..96c26eb
--- /dev/null
+++ b/plugins/transfer/models.py
@@ -0,0 +1,41 @@
+from datetime import datetime
+from app import db
+
+class TransferRequest(db.Model):
+    __tablename__ = 'transfer_request'
+    __table_args__ = {'extend_existing': True}
+
+    id = db.Column(db.Integer, primary_key=True, autoincrement=True)
+    plant_id = db.Column(db.Integer, db.ForeignKey('plant.id'), nullable=False)
+    seller_id = db.Column(db.Integer, db.ForeignKey('users.id'), nullable=False)
+    buyer_id = db.Column(db.Integer, db.ForeignKey('users.id'), nullable=False)
+    status = db.Column(
+        db.String(20),
+        nullable=False,
+        default='pending'
+    )
+    created_at = db.Column(db.DateTime, default=datetime.utcnow)
+    updated_at = db.Column(db.DateTime, onupdate=datetime.utcnow)
+    seller_message = db.Column(db.String(512), nullable=True)
+    buyer_message = db.Column(db.String(512), nullable=True)
+
+    plant = db.relationship(
+        'plugins.plant.models.Plant',
+        backref=db.backref('transfer_requests', lazy='dynamic'),
+        lazy=True
+    )
+    seller = db.relationship(
+        'plugins.auth.models.User',
+        foreign_keys=[seller_id],
+        backref='outgoing_transfers',
+        lazy=True
+    )
+    buyer = db.relationship(
+        'plugins.auth.models.User',
+        foreign_keys=[buyer_id],
+        backref='incoming_transfers',
+        lazy=True
+    )
+
+    def __repr__(self):
+        return f"<TransferRequest id={self.id} plant_id={self.plant_id} status={self.status}>"
diff --git a/plugins/transfer/plugin.json b/plugins/transfer/plugin.json
new file mode 100644
index 0000000..13ee77b
--- /dev/null
+++ b/plugins/transfer/plugin.json
@@ -0,0 +1,6 @@
+{
+  "name": "transfer",
+  "version": "1.0.0",
+  "description": "Handles plant transfer requests between users",
+  "entry_point": ""
+}
diff --git a/plugins/transfer/routes.py b/plugins/transfer/routes.py
new file mode 100644
index 0000000..f0c17a4
--- /dev/null
+++ b/plugins/transfer/routes.py
@@ -0,0 +1,152 @@
+from datetime import datetime
+from flask import Blueprint, request, redirect, url_for, flash, render_template
+from flask_login import login_required, current_user
+
+from plugins.plant.models import db, Plant, PlantOwnershipLog
+from plugins.transfer.models import TransferRequest
+from plugins.auth.models import User
+
+bp = Blueprint('transfer', __name__, template_folder='templates', url_prefix='/transfer')
+
+@bp.route('/request/<int:plant_id>', methods=['GET', 'POST'])
+@login_required
+def request_transfer(plant_id):
+    plant = Plant.query.get_or_404(plant_id)
+
+    if plant.owner_id == current_user.id:
+        seller = current_user
+        if request.method == 'POST':
+            buyer_id = request.form.get('buyer_id', type=int)
+            buyer = User.query.get(buyer_id)
+            if not buyer or buyer.id == seller.id:
+                flash("Please select a valid buyer.", "error")
+                return redirect(request.url)
+
+            tr = TransferRequest(
+                plant_id=plant.id,
+                seller_id=seller.id,
+                buyer_id=buyer.id,
+                status='pending',
+                seller_message=request.form.get('seller_message', '').strip()
+            )
+            db.session.add(tr)
+            db.session.commit()
+            flash("Transfer request sent to buyer. Waiting for their approval.", "info")
+            return redirect(url_for('plant.view', plant_id=plant.id))
+
+        all_users = User.query.filter(User.id != seller.id).all()
+        return render_template(
+            'transfer/request_transfer.html',
+            plant=plant,
+            all_users=all_users
+        )
+
+    else:
+        buyer = current_user
+        if request.method == 'POST':
+            seller_id = request.form.get('seller_id', type=int)
+            seller = User.query.get(seller_id)
+            if not seller or seller.id != plant.owner_id:
+                flash("Please select the correct seller (current owner).", "error")
+                return redirect(request.url)
+
+            tr = TransferRequest(
+                plant_id=plant.id,
+                seller_id=seller.id,
+                buyer_id=buyer.id,
+                status='pending',
+                buyer_message=request.form.get('buyer_message', '').strip()
+            )
+            db.session.add(tr)
+            db.session.commit()
+            flash("Transfer request sent to seller. Waiting for their approval.", "info")
+            return redirect(url_for('plant.view', plant_id=plant.id))
+
+        return render_template(
+            'transfer/request_transfer.html',
+            plant=plant,
+            all_users=[User.query.get(plant.owner_id)]
+        )
+
+@bp.route('/incoming', methods=['GET'])
+@login_required
+def incoming_requests():
+    pending = TransferRequest.query.filter_by(
+        buyer_id=current_user.id,
+        status='pending'
+    ).all()
+    return render_template('transfer/incoming.html', pending=pending)
+
+@bp.route('/approve/<int:request_id>', methods=['POST'])
+@login_required
+def approve_request(request_id):
+    tr = TransferRequest.query.get_or_404(request_id)
+    if current_user.id not in (tr.seller_id, tr.buyer_id):
+        flash("You’re not authorized to approve this transfer.", "error")
+        return redirect(url_for('transfer.incoming_requests'))
+
+    if current_user.id == tr.buyer_id:
+        tr.status = 'buyer_approved'
+        tr.buyer_message = request.form.get('message', tr.buyer_message)
+    else:
+        tr.status = 'seller_approved'
+        tr.seller_message = request.form.get('message', tr.seller_message)
+
+    tr.updated_at = datetime.utcnow()
+    db.session.commit()
+    flash("You have approved the transfer. Waiting on the other party.", "info")
+    return redirect(url_for('transfer.incoming_requests'))
+
+@bp.route('/finalize/<int:request_id>', methods=['POST'])
+@login_required
+def finalize_request(request_id):
+    tr = TransferRequest.query.get_or_404(request_id)
+
+    buyer_approved = (
+        TransferRequest.query
+        .filter_by(id=tr.id, buyer_id=tr.buyer_id, status='buyer_approved')
+        .first() is not None
+    )
+    seller_approved = (
+        TransferRequest.query
+        .filter_by(id=tr.id, seller_id=tr.seller_id, status='seller_approved')
+        .first() is not None
+    )
+
+    if not (buyer_approved and seller_approved):
+        flash("Both parties must approve before finalizing.", "error")
+        return redirect(url_for('transfer.incoming_requests'))
+
+    new_log = PlantOwnershipLog(
+        plant_id=tr.plant_id,
+        user_id=tr.buyer_id,
+        date_acquired=datetime.utcnow(),
+        transferred=True,
+        is_verified=True
+    )
+    db.session.add(new_log)
+
+    plant = Plant.query.get(tr.plant_id)
+    plant.owner_id = tr.buyer_id
+
+    tr.status = 'complete'
+    tr.updated_at = datetime.utcnow()
+
+    db.session.commit()
+    flash("Transfer finalized—ownership updated.", "success")
+    return redirect(url_for('plant.view', plant_id=tr.plant_id))
+
+@bp.route('/reject/<int:request_id>', methods=['POST'])
+@login_required
+def reject_request(request_id):
+    tr = TransferRequest.query.get_or_404(request_id)
+
+    if current_user.id not in (tr.seller_id, tr.buyer_id):
+        flash("You’re not authorized to reject this transfer.", "error")
+        return redirect(url_for('transfer.incoming_requests'))
+
+    tr.status = 'rejected'
+    tr.updated_at = datetime.utcnow()
+    db.session.commit()
+    flash("Transfer request has been rejected.", "warning")
+    return redirect(url_for('transfer.incoming_requests'))
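One detail worth flagging in `finalize_request`: `TransferRequest.status` is a single string column, so a given row can hold `'buyer_approved'` or `'seller_approved'` but never both at once, and the `buyer_approved and seller_approved` check can therefore only pass if approvals are recorded some other way. A minimal sketch of the intended precondition with each party's approval tracked separately; the names below are illustrative and do not appear in the diff:

```python
from dataclasses import dataclass

@dataclass
class ApprovalState:
    """Illustrative stand-in for per-party approval flags on a transfer."""
    buyer_approved: bool = False
    seller_approved: bool = False

    def can_finalize(self) -> bool:
        # Ownership should only change hands once both parties have approved.
        return self.buyer_approved and self.seller_approved

state = ApprovalState(buyer_approved=True)
assert not state.can_finalize()   # still waiting on the seller
state.seller_approved = True
assert state.can_finalize()       # both sides approved; safe to finalize
```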
diff --git a/plugins/transfer/templates/transfer/incoming.html b/plugins/transfer/templates/transfer/incoming.html
new file mode 100644
index 0000000..18e6da4
--- /dev/null
+++ b/plugins/transfer/templates/transfer/incoming.html
@@ -0,0 +1,25 @@
+{% extends 'core_ui/base.html' %}
+
+{% block content %}
+<h2>Incoming Transfer Requests</h2>
+
+{% if pending %}
+  <ul>
+  {% for tr in pending %}
+    <li>
+      Plant: {{ tr.plant.custom_slug or tr.plant.uuid }} |
+      From: {{ tr.seller.username }} |
+      <form method="post" action="{{ url_for('transfer.approve_request', request_id=tr.id) }}">
+        {% csrf_token %}
+        <button type="submit">Approve</button>
+      </form>
+      <form method="post" action="{{ url_for('transfer.reject_request', request_id=tr.id) }}">
+        {% csrf_token %}
+        <button type="submit">Reject</button>
+      </form>
+    </li>
+  {% endfor %}
+  </ul>
+{% else %}
+  <p>No pending requests.</p>
+{% endif %}
+{% endblock %}
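Note that `{% csrf_token %}` in the template above is Django template syntax; it is not built into Jinja2, so under stock Flask it would raise a template error unless a custom extension supplies that tag. If the project uses Flask-WTF, the usual approach is to enable `CSRFProtect` and render a hidden `csrf_token()` field instead. A sketch, assuming Flask-WTF is installed:

```python
from flask import Flask
from flask_wtf.csrf import CSRFProtect

app = Flask(__name__)
app.config["SECRET_KEY"] = "change-me"  # required so CSRF tokens can be signed
csrf = CSRFProtect(app)

# In a Jinja template the form would then carry:
#   <input type="hidden" name="csrf_token" value="{{ csrf_token() }}">
```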
diff --git a/plugins/transfer/templates/transfer/request_transfer.html b/plugins/transfer/templates/transfer/request_transfer.html
new file mode 100644
index 0000000..97ffe1b
--- /dev/null
+++ b/plugins/transfer/templates/transfer/request_transfer.html
@@ -0,0 +1,28 @@
+{% extends 'core_ui/base.html' %}
+
+{% block content %}
+<h2>Request Transfer: {{ plant.custom_slug or plant.uuid }}</h2>
+
+<form method="post">
+  {% csrf_token %}
+  {% if plant.owner_id == current_user.id %}
+    <label for="buyer_id">Buyer (user ID)</label>
+    <input type="number" name="buyer_id" id="buyer_id" required>
+    <br>
+    <label for="seller_message">Message to buyer</label>
+    <textarea name="seller_message" id="seller_message"></textarea>
+  {% else %}
+    <label for="seller_id">Seller (current owner, user ID)</label>
+    <input type="number" name="seller_id" id="seller_id" required>
+    <br>
+    <label for="buyer_message">Message to seller</label>
+    <textarea name="buyer_message" id="buyer_message"></textarea>
+  {% endif %}
+  <button type="submit">Send Request</button>
+</form>
+{% endblock %}
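Finally, `plugin.json` leaves `entry_point` empty, so this diff does not show how the `transfer` blueprint is actually loaded. A minimal sketch of registering it by hand in an application factory; `create_app` and the manual registration are assumptions, not part of the diff:

```python
from flask import Flask

def create_app():
    app = Flask(__name__)
    # Import inside the factory to avoid circular imports at module load time.
    from plugins.transfer.routes import bp as transfer_bp
    app.register_blueprint(transfer_bp)  # served under /transfer via the blueprint's url_prefix
    return app
```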