Internal audit - ozymandias edition :D
parent 00c2cf6256, commit 9496c9061c
@@ -63,6 +63,16 @@ static/models/

# Downloaded by `make build` when content/photography/ exists.
static/leaflet/

+# Vendored Mozilla PDF.js viewer (~7 MB extracted, pinned in
+# tools/download-pdfjs.sh). Used by Filters/Links.hs (rewrites .pdf links)
+# and Filters/EmbedPdf.hs ({{pdf:...}} directives).
+# Downloaded by `make build` when static/papers/ exists.
+static/pdfjs/
+
+# Auto-generated PDF first-page thumbnails (foo.thumb.png alongside foo.pdf).
+# Produced by the pdf-thumbs Make target via pdftoppm.
+static/papers/*.thumb.png
+
# Generated WebP companions (produced by tools/convert-images.sh at build time).
# To intentionally commit a WebP, use: git add -f path/to/file.webp
static/**/*.webp
@@ -0,0 +1,21 @@
MIT License

Copyright (c) 2026 Levi Neuwirth

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
Makefile (13 lines changed)
@@ -1,6 +1,6 @@
-.PHONY: build deploy sign download-model download-leaflet convert-images pdf-thumbs watch clean dev
+.PHONY: build deploy sign download-model download-leaflet download-pdfjs convert-images pdf-thumbs watch clean dev

-# Source .env for GITHUB_TOKEN and GITHUB_REPO if it exists.
+# Source .env for VPS_USER / VPS_HOST / VPS_PATH (consumed by `make deploy`).
+# .env format: KEY=value (one per line, no `export` prefix, no quotes needed).
-include .env
export
@@ -9,6 +9,7 @@ build:
	@date +%s > data/build-start.txt
	@./tools/convert-images.sh
	@$(MAKE) -s pdf-thumbs
+	@if [ -d static/papers ]; then ./tools/download-pdfjs.sh; fi
	@if [ -d content/photography ]; then ./tools/download-leaflet.sh; fi
# Photography pipeline: when content/photography/ exists, generate
# per-photo EXIF + palette sidecars and per-image dimension sidecars
@@ -48,6 +49,14 @@ download-model:

download-leaflet:
	@./tools/download-leaflet.sh

+# Vendor Mozilla's prebuilt PDF.js viewer into static/pdfjs/.
+# Used by build/Filters/Links.hs (rewrites .pdf links) and build/Filters/EmbedPdf.hs
+# ({{pdf:...}} directives) — both target /pdfjs/web/viewer.html. Runs automatically
+# as part of `build` when static/papers/ exists (skips when already extracted).
+# Files are gitignored; sha256-verified against tools/pdfjs-checksums.sha256.
+download-pdfjs:
+	@./tools/download-pdfjs.sh
+
# Convert JPEG/PNG images to WebP companions (also runs automatically in build).
# Requires cwebp: pacman -S libwebp / apt install webp
convert-images:
README.md (13 lines changed)
@@ -16,6 +16,7 @@ A full-featured static site framework built with [Hakyll](https://jaspervdj.be/h
- **Semantic search** — optional embedding pipeline (sentence-transformers + FAISS) for "similar links."
- **Settings** — dark mode, text size, focus mode, reduce motion.
- **Wikilinks** — `[[Page Name]]` and `[[Page Name|display text]]` syntax.
+- **PDF embeds** — drop a PDF into `static/papers/`, then `{{pdf:/papers/foo.pdf}}` (or `#5` for a starting page) renders it inline via a vendored PDF.js viewer; `[Foo](/papers/foo.pdf)` links auto-rewrite to the same viewer. First-page thumbnails are auto-generated when `pdftoppm` (poppler) is available.
- **Atom feeds** — site-wide and per-section (e.g., music-only, photography-only).
- **Library** — configurable portal taxonomy that groups content by tag hierarchy.
- **Version history** — git-derived stability heuristic with manual history annotations.
@@ -35,8 +36,16 @@ $EDITOR site.yaml
make dev
```

`make dev` builds with drafts visible and starts a local server on `:8000`.
For production: `make build` (one-shot build into `_site/`).

### Build commands

| Command | What it does |
|:--------------|:---------------------------------------------------------------------------------------------------------------|
| `make dev` | Fast iteration build with drafts visible (`SITE_ENV=dev`), then `python3 -m http.server :8000`. Skips pagefind, image conversion, EXIF/palette/dimension extraction, embeddings, and signing — search and similar-links are inactive. |
| `make watch` | Same draft-visible mode as `dev`, but uses Hakyll's `watch` server with auto-rebuild on source changes. Same skip list as `dev`. |
| `make build` | Full production build into `_site/`: WebP conversion, PDF thumbnails, photography sidecars (when `content/photography/` exists), Hakyll, pagefind, embeddings. |
| `make sign` | Detach-sign every `.html` in `_site/` with the GPG key configured in `site.yaml`'s `gpg-fingerprint`. Requires `tools/preset-signing-passphrase.sh` to have cached the passphrase. |
| `make deploy` | `clean` + `build` + `sign` + rsync to `$VPS_USER@$VPS_HOST:$VPS_PATH/` + `git push`. VPS vars come from `.env` (gitignored). |
| `make clean` | Hakyll clean (removes `_site/` and `_cache/`). |

## Prerequisites
@@ -32,12 +32,10 @@ import Data.Ord (comparing)
import Data.Maybe (fromMaybe)
import qualified Data.Map.Strict as Map
import Data.Map.Strict (Map)
import qualified Data.ByteString as BS
import qualified Data.Text as T
import qualified Data.Text.Lazy as TL
import qualified Data.Text.Lazy.Encoding as TLE
import qualified Data.Text.Encoding as TE
import qualified Data.Text.Encoding.Error as TE
import qualified Data.Aeson as Aeson
import Data.Aeson ((.=))
import Text.Pandoc.Class (runPure)
@@ -50,6 +48,7 @@ import Hakyll
import Compilers (readerOpts, writerOpts)
import Filters (preprocessSource)
import qualified Patterns as P
+import Utils (normaliseUrl)

-- ---------------------------------------------------------------------------
-- Link-with-context entry (intermediate, saved by the "links" pass)
@@ -182,44 +181,6 @@ linksCompiler = do
      (extractLinksWithContext (itemBody pandocItem))
  makeItem . TL.unpack . TLE.decodeUtf8 . Aeson.encode $ entries

--- ---------------------------------------------------------------------------
--- URL normalisation
--- ---------------------------------------------------------------------------
-
--- | Normalise an internal URL as a map key: strip query string, fragment,
--- and trailing @.html@; ensure a leading slash; percent-decode the path
--- so that @\/essays\/caf%C3%A9@ and @\/essays\/café@ collide on the same
--- key.
-normaliseUrl :: String -> String
-normaliseUrl url =
-  let t = T.pack url
-      t1 = fst (T.breakOn "?" (fst (T.breakOn "#" t)))
-      t2 = if T.isPrefixOf "/" t1 then t1 else "/" `T.append` t1
-      t3 = fromMaybe t2 (T.stripSuffix ".html" t2)
-  in percentDecode (T.unpack t3)
-
--- | Decode percent-escapes (@%XX@) into raw bytes, then re-interpret the
--- resulting bytestring as UTF-8. Invalid escapes are passed through
--- verbatim so this is safe to call on already-decoded input.
-percentDecode :: String -> String
-percentDecode = T.unpack . TE.decodeUtf8With lenientDecode . pack . go
-  where
-    go [] = []
-    go ('%':a:b:rest)
-      | Just hi <- hexDigit a
-      , Just lo <- hexDigit b
-      = fromIntegral (hi * 16 + lo) : go rest
-    go (c:rest) = fromIntegral (fromEnum c) : go rest
-
-    hexDigit c
-      | c >= '0' && c <= '9' = Just (fromEnum c - fromEnum '0')
-      | c >= 'a' && c <= 'f' = Just (fromEnum c - fromEnum 'a' + 10)
-      | c >= 'A' && c <= 'F' = Just (fromEnum c - fromEnum 'A' + 10)
-      | otherwise = Nothing
-
-    pack = BS.pack
-    lenientDecode = TE.lenientDecode
-
-- ---------------------------------------------------------------------------
-- Content patterns (must match the rules in Site.hs — sourced from
-- Patterns.allContent so additions to the canonical list automatically
@@ -13,14 +13,14 @@ module Compilers
  ) where

import Hakyll
-import Text.Pandoc.Definition (Pandoc (..), Block (..),
-                               Inline (..))
+import Text.Pandoc.Definition (Pandoc (..), Block (..))
import Text.Pandoc.Options (ReaderOptions (..), WriterOptions (..),
                            HTMLMathMethod (..))
import Text.Pandoc.Extensions (enableExtension, Extension (..))
import qualified Data.Text as T
import Data.Maybe (fromMaybe)
import System.FilePath (takeDirectory)
+import Inlines (stringify)
import Utils (wordCount, readingTime, escapeHtml)
import Filters (applyAll, preprocessSource)
import qualified Citations
@@ -49,33 +49,6 @@ writerOpts = defaultHakyllWriterOptions
  , writerTableOfContents = False
  }

--- ---------------------------------------------------------------------------
--- Inline stringification (local, avoids depending on Text.Pandoc.Shared)
--- ---------------------------------------------------------------------------
-
-stringify :: [Inline] -> T.Text
-stringify = T.concat . map inlineToText
-  where
-    inlineToText (Str t) = t
-    inlineToText Space = " "
-    inlineToText SoftBreak = " "
-    inlineToText LineBreak = " "
-    inlineToText (Emph ils) = stringify ils
-    inlineToText (Strong ils) = stringify ils
-    inlineToText (Strikeout ils) = stringify ils
-    inlineToText (Superscript ils) = stringify ils
-    inlineToText (Subscript ils) = stringify ils
-    inlineToText (SmallCaps ils) = stringify ils
-    inlineToText (Quoted _ ils) = stringify ils
-    inlineToText (Cite _ ils) = stringify ils
-    inlineToText (Code _ t) = t
-    inlineToText (RawInline _ t) = t
-    inlineToText (Link _ ils _) = stringify ils
-    inlineToText (Image _ ils _) = stringify ils
-    inlineToText (Note _) = ""
-    inlineToText (Span _ ils) = stringify ils
-    inlineToText _ = ""
-
-- ---------------------------------------------------------------------------
-- TOC extraction
-- ---------------------------------------------------------------------------
@@ -21,6 +21,7 @@ import qualified Data.Aeson as Aeson
import qualified Data.Aeson.Key as AK
import qualified Data.Aeson.KeyMap as KM
import qualified Data.Vector as V
+import Data.Char (toLower)
import Data.List (intercalate, isPrefixOf)
import Data.Maybe (fromMaybe)
import qualified Data.Scientific as Sci
@@ -718,7 +719,7 @@ exifBackedField key = field key $ \item -> do
-- to this lookup only when the author hasn't provided one.
canonicalLicenseUrl :: String -> Maybe String
canonicalLicenseUrl raw =
-  case unwords (words (map (\c -> if c == '_' then ' ' else toLowerC c) raw)) of
+  case unwords (words (map (\c -> if c == '_' then ' ' else toLower c) raw)) of
    "cc by 4.0" -> Just "https://creativecommons.org/licenses/by/4.0/"
    "cc by-sa 4.0" -> Just "https://creativecommons.org/licenses/by-sa/4.0/"
    "cc by-nc 4.0" -> Just "https://creativecommons.org/licenses/by-nc/4.0/"
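The case scrutinee above canonicalises the raw metadata string before matching: underscores become spaces, everything is lowercased, and whitespace runs collapse. A standalone sketch of just that key function (the name `licenseKey` is mine, not the module's):

```haskell
import Data.Char (toLower)

-- Canonicalisation key used by canonicalLicenseUrl's case scrutinee:
-- "CC_BY_4.0", "cc  by 4.0", and "CC BY 4.0" all land on the same branch.
licenseKey :: String -> String
licenseKey raw =
  unwords (words (map (\c -> if c == '_' then ' ' else toLower c) raw))

main :: IO ()
main = mapM_ (putStrLn . licenseKey) ["CC_BY_4.0", "  cc   by 4.0 "]
-- both lines print "cc by 4.0"
```

The `unwords . words` pass is what makes stray spacing harmless: `words` splits on any whitespace run, so the rebuilt string is always single-spaced.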
@@ -729,10 +730,6 @@ canonicalLicenseUrl raw =
    "cc0 1.0" -> Just "https://creativecommons.org/publicdomain/zero/1.0/"
    "public domain" -> Just "https://creativecommons.org/publicdomain/mark/1.0/"
    _ -> Nothing
-  where
-    toLowerC c
-      | c >= 'A' && c <= 'Z' = toEnum (fromEnum c + 32)
-      | otherwise = c

-- | Context for photography pages and photo cards.
--
@@ -41,6 +41,7 @@ import qualified Text.Pandoc as Pandoc
import Text.Pandoc.Walk (walkM)
import System.Directory (doesFileExist)
import System.FilePath (replaceExtension, takeExtension, (</>))
+import Inlines (stringify)
import qualified Utils as U

-- | Apply image attribute injection and WebP wrapping to the entire document.
@@ -363,31 +364,6 @@ addAttr k v (i, cs, kvs)
  | any ((== k) . fst) kvs = (i, cs, kvs)
  | otherwise = (i, cs, (k, v) : kvs)

--- | Plain-text content of a list of inlines (for alt text).
-stringify :: [Inline] -> Text
-stringify = T.concat . map go
-  where
-    go (Str t) = t
-    go Space = " "
-    go SoftBreak = " "
-    go LineBreak = " "
-    go (Emph ils) = stringify ils
-    go (Strong ils) = stringify ils
-    go (Strikeout ils) = stringify ils
-    go (Superscript ils) = stringify ils
-    go (Subscript ils) = stringify ils
-    go (SmallCaps ils) = stringify ils
-    go (Underline ils) = stringify ils
-    go (Quoted _ ils) = stringify ils
-    go (Cite _ ils) = stringify ils
-    go (Code _ t) = t
-    go (Math _ t) = t
-    go (RawInline _ _) = ""
-    go (Link _ ils _) = stringify ils
-    go (Image _ ils _) = stringify ils
-    go (Span _ ils) = stringify ils
-    go (Note _) = ""
-
-- | HTML-escape a text value for use in attribute values.
-- Defers to the canonical 'Utils.escapeHtmlText'.
esc :: Text -> Text
@@ -0,0 +1,56 @@
{-# LANGUAGE GHC2021 #-}
{-# LANGUAGE OverloadedStrings #-}
-- | Shared Pandoc inline utilities.
--
-- 'stringify' collapses a list of 'Inline' nodes to plain 'Text', for use
-- as alt text, plain-text excerpts for word-counting, comparison keys, etc.
-- Keeping a single implementation prevents 'Compilers', 'Filters.Images',
-- and any future caller from drifting on which inline node types they
-- handle.
--
-- Mirrors the shape of @Text.Pandoc.Shared.stringify@ but is local so we
-- don't need a @Text.Pandoc.Shared@ import from every Pandoc-handling
-- module. It is a lossy conversion: structural elements (notes, raw HTML,
-- images-without-alt) yield empty strings rather than placeholder text.
module Inlines (stringify) where

import Data.Text (Text)
import qualified Data.Text as T
import Text.Pandoc.Definition

-- | Render a list of 'Inline' nodes to a plain 'Text' value.
--
-- The fallthrough cases are deliberately silent (@""@):
--
-- * 'Note' contents would balloon the result with footnote bodies.
-- * 'RawInline' is format-specific (e.g. raw HTML) and would leak markup
--   into contexts that expect plain text.
-- * 'Image' alt-only fallback would otherwise render the alt text twice
--   when the image is itself nested inside a link.
--
-- All recursing inline-container nodes (Emph, Strong, Link, Span, …) walk
-- their children, so formatting and inline links contribute their
-- visible text exactly once.
stringify :: [Inline] -> Text
stringify = T.concat . map go
  where
    go (Str t) = t
    go Space = " "
    go SoftBreak = " "
    go LineBreak = " "
    go (Emph ils) = stringify ils
    go (Strong ils) = stringify ils
    go (Strikeout ils) = stringify ils
    go (Superscript ils) = stringify ils
    go (Subscript ils) = stringify ils
    go (SmallCaps ils) = stringify ils
    go (Underline ils) = stringify ils
    go (Quoted _ ils) = stringify ils
    go (Cite _ ils) = stringify ils
    go (Code _ t) = t
    go (Math _ t) = t
    go (Link _ ils _) = stringify ils
    go (Image _ ils _) = stringify ils
    go (Span _ ils) = stringify ils
    go (RawInline _ _) = ""
    go (Note _) = ""
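To illustrate the silent-fallthrough rules in isolation, here is a reduced, self-contained model of the flattening — a cut-down local `Inline` type stands in for pandoc-types' `Text.Pandoc.Definition`, so the sketch runs without any Pandoc dependency and uses `String` instead of `Text`:

```haskell
-- Simplified model of the stringify above: containers recurse into their
-- children, Note bodies and raw markup vanish. The reduced Inline type
-- here is a stand-in, not the real pandoc-types definition.
data Inline
  = Str String
  | Space
  | Emph [Inline]
  | Link [Inline] String   -- children, target URL
  | Note [Inline]          -- footnote body
  deriving Show

stringify :: [Inline] -> String
stringify = concatMap go
  where
    go (Str t)      = t
    go Space        = " "
    go (Emph ils)   = stringify ils   -- formatting contributes its text once
    go (Link ils _) = stringify ils   -- link text kept, target dropped
    go (Note _)     = ""              -- footnote bodies deliberately silent

main :: IO ()
main = putStrLn (stringify
  [Str "see", Space, Link [Emph [Str "this"], Space, Str "essay"] "/e", Note [Str "fn"]])
-- prints "see this essay"
```

Note how the `Link` case walks its children, so emphasised link text appears exactly once, while the trailing footnote contributes nothing.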
@@ -20,8 +20,25 @@
-- * Contact sheet at @/photography/contact-sheet/@.
-- * Atom feed at @/photography/feed.xml@.
--
--- See @PHOTOGRAPHY.md@ in the upstream levineuwirth.org repo for the
--- full design rationale.
+-- Design rationale (recorded in this module's comments rather than a
+-- separate design doc):
+--
+-- * Content model — hybrid: flat singles + directory entries +
+--   series directories. Series detection is structural ('siblingIds'
+--   in 'photographyRules'); no @series: true@ frontmatter flag.
+-- * Metadata — frontmatter wins; auto-extracted EXIF + palette
+--   sidecars (from @tools/extract-exif.py@ /
+--   @tools/extract-palette.py@) fill in any gaps. See
+--   'Contexts.photographyCtx'.
+-- * Privacy — every coordinate is rounded to the precision the
+--   author declares in @geo-precision:@ (default @city@) BEFORE it
+--   leaves this build step; @hidden@ omits the entry entirely.
+--   See 'roundCoord' / 'buildPin'.
+-- * Vendoring — Leaflet ships with the site (gated on the
+--   photography directory existing) so the map page makes no
+--   third-party requests at view time.
+-- * Originals — only ≤2400px web-optimized JPEGs are committed;
+--   RAWs / TIFFs / PSDs are refused via @.gitignore@ globs.
module Photography
  ( photographyRules
  ) where
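The Privacy bullet describes coordinate rounding but this hunk does not show the real `roundCoord`. A hypothetical sketch of the idea — the precision-to-decimal-places mapping below is an assumption for illustration, not the module's actual values:

```haskell
-- Hypothetical sketch of geo-precision rounding. The real roundCoord
-- lives in Photography.hs and is not shown in this diff; the decimal
-- counts here (and the constructor names beyond city/hidden) are assumed.
data GeoPrecision = Exact | City | Region | Hidden

decimals :: GeoPrecision -> Maybe Int
decimals Exact  = Just 5   -- ~1 m   (assumed)
decimals City   = Just 1   -- ~10 km (assumed)
decimals Region = Just 0   -- ~100 km (assumed)
decimals Hidden = Nothing  -- pin omitted entirely

-- Round a coordinate to the declared precision; Nothing means "no pin".
roundCoord :: GeoPrecision -> Double -> Maybe Double
roundCoord p x = do
  n <- decimals p
  let f = 10 ^ n :: Double
  pure (fromIntegral (round (x * f) :: Integer) / f)

main :: IO ()
main = do
  print (roundCoord City 40.44127)    -- Just 40.4
  print (roundCoord Hidden 40.44127)  -- Nothing
```

The key property is the ordering stated in the comment: rounding happens before the coordinate leaves the build step, so the precise value never reaches the generated site.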
@@ -18,14 +18,13 @@
module SimilarLinks (similarLinksField) where

import Data.Maybe (fromMaybe)
import qualified Data.ByteString as BS
import qualified Data.Map.Strict as Map
import Data.Map.Strict (Map)
import qualified Data.Text as T
import qualified Data.Text.Encoding as TE
import qualified Data.Text.Encoding.Error as TE
import qualified Data.Aeson as Aeson
import Hakyll
import Utils (percentDecode)

-- ---------------------------------------------------------------------------
-- JSON schema
@@ -73,6 +72,12 @@ similarLinksField = field "similar-links" $ \item -> do
-- ---------------------------------------------------------------------------
-- URL normalisation (mirrors embed.py's URL derivation)
-- ---------------------------------------------------------------------------
+--
+-- Distinct from 'Utils.normaliseUrl' (used by Backlinks/Stats): that one
+-- strips @.html@ unconditionally, producing keys like @"/blog/index"@.
+-- 'embed.py' instead emits @"/blog/"@ for directory-style URLs, so we
+-- strip @"index.html"@ separately first to preserve the trailing slash.
+-- The percent-decoding step is shared (imported from 'Utils').

normaliseUrl :: String -> String
normaliseUrl url =
@@ -87,28 +92,6 @@ normaliseUrl url =
      t4 = fromMaybe t3 (T.stripSuffix ".html" t3)
  in percentDecode (T.unpack t4)

--- | Percent-decode @%XX@ escapes (UTF-8) so percent-encoded paths
--- collide with their decoded form on map lookup. Mirrors
--- 'Backlinks.percentDecode'; the two implementations are intentionally
--- duplicated because they apply different normalisations *before*
--- decoding (Backlinks strips @.html@ unconditionally; SimilarLinks
--- preserves the trailing-slash form for index pages).
-percentDecode :: String -> String
-percentDecode = T.unpack . TE.decodeUtf8With TE.lenientDecode . BS.pack . go
-  where
-    go [] = []
-    go ('%':a:b:rest)
-      | Just hi <- hexDigit a
-      , Just lo <- hexDigit b
-      = fromIntegral (hi * 16 + lo) : go rest
-    go (c:rest) = fromIntegral (fromEnum c) : go rest
-
-    hexDigit c
-      | c >= '0' && c <= '9' = Just (fromEnum c - fromEnum '0')
-      | c >= 'a' && c <= 'f' = Just (fromEnum c - fromEnum 'a' + 10)
-      | c >= 'A' && c <= 'F' = Just (fromEnum c - fromEnum 'A' + 10)
-      | otherwise = Nothing
-
-- ---------------------------------------------------------------------------
-- HTML rendering
-- ---------------------------------------------------------------------------
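The trailing-slash-preserving variant can be sketched standalone. The hunk truncates the full body, so the `index.html`-first strip below is reconstructed from the comment and may differ in detail from the module's actual code; percent-decoding is elided:

```haskell
{-# LANGUAGE OverloadedStrings #-}
-- Sketch of the SimilarLinks normalisation described above: strip a
-- trailing "index.html" BEFORE the generic ".html" strip, so
-- directory-style URLs keep their trailing slash (matching embed.py's
-- keys). Reconstructed from the hunk's comment; details are assumed.
import Data.Maybe (fromMaybe)
import qualified Data.Text as T

normaliseUrlSL :: String -> String
normaliseUrlSL url =
  let t  = T.pack url
      t1 = fst (T.breakOn "?" (fst (T.breakOn "#" t)))
      t2 = if T.isPrefixOf "/" t1 then t1 else "/" `T.append` t1
      t3 = fromMaybe t2 (T.stripSuffix "index.html" t2)  -- "/blog/" survives
      t4 = fromMaybe t3 (T.stripSuffix ".html" t3)
  in T.unpack t4  -- percent-decoding elided in this sketch

main :: IO ()
main = mapM_ (putStrLn . normaliseUrlSL) ["/blog/index.html", "/essays/foo.html"]
-- "/blog/"       (vs. the Utils variant's "/blog/index")
-- "/essays/foo"
```

The contrast with `Utils.normaliseUrl` is entirely in the `t3` step: without it, `"/blog/index.html"` would normalise to `"/blog/index"` and miss the `"/blog/"` key that `embed.py` emits.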
@@ -5,7 +5,7 @@ module Site (rules) where
import Control.Monad (filterM, when)
import Data.List (isPrefixOf)
import Data.Maybe (fromMaybe)
-import System.Directory (doesDirectoryExist)
+import System.Directory (doesDirectoryExist, doesFileExist)
import System.Environment (lookupEnv)
import System.FilePath (takeDirectory, takeFileName, replaceExtension)
import qualified Data.Aeson as Aeson
@@ -134,7 +134,12 @@ rules = do
  match "data/similar-links.json" $ compile getResourceBody

  -- Commonplace YAML — compiled as a raw string so it can be loaded
-  -- with dependency tracking by the commonplace page compiler.
+  -- with dependency tracking by the commonplace page compiler. The
+  -- page rule below is gated on this file existing; without that
+  -- gate, a content/commonplace.md created without its data/ yaml
+  -- companion would crash the build with an unhelpful "no item" error.
+  hasCommonplace <- preprocess $ doesFileExist "data/commonplace.yaml"
+  when hasCommonplace $
    match "data/commonplace.yaml" $ compile getResourceBody

  -- ---------------------------------------------------------------------------
@@ -148,9 +153,13 @@ rules = do
    >>= relativizeUrls

  -- ---------------------------------------------------------------------------
-  -- Commonplace book
+  -- Commonplace book — opt-in. Active only when BOTH the page source
+  -- (content/commonplace.md) AND the data file (data/commonplace.yaml)
+  -- exist. Either alone is treated as "not enabled" — without the yaml,
+  -- 'Commonplace.loadCommonplace' would 'load' a nonexistent item and
+  -- crash the build; without the page, the data is unused.
  -- ---------------------------------------------------------------------------
-  match "content/commonplace.md" $ do
+  when hasCommonplace $ match "content/commonplace.md" $ do
    route $ constRoute "commonplace.html"
    compile $ pageCompiler
      >>= loadAndApplyTemplate "templates/commonplace.html" commonplaceCtx
@@ -164,10 +173,19 @@ rules = do
      >>= loadAndApplyTemplate "templates/default.html" essayCtx
      >>= relativizeUrls

-  match ("content/*.md"
-         .&&. complement "content/index.md"
-         .&&. complement "content/commonplace.md"
-         .&&. complement "content/colophon.md") $ do
+  -- All other top-level standalone pages. The commonplace complement is
+  -- conditional: when the commonplace book is enabled, the dedicated
+  -- rule above owns commonplace.md and we exclude it here; when the
+  -- yaml is missing, we drop the complement so a user-authored
+  -- commonplace.md still renders as a regular page rather than being
+  -- silently dropped.
+  let standalonePagesBase = "content/*.md"
+        .&&. complement "content/index.md"
+        .&&. complement "content/colophon.md"
+      standalonePagesPattern = if hasCommonplace
+        then standalonePagesBase .&&. complement "content/commonplace.md"
+        else standalonePagesBase
+  match standalonePagesPattern $ do
    route $ gsubRoute "content/" (const "")
      `composeRoutes` setExtension "html"
    compile $ pageCompiler
@@ -38,7 +38,7 @@ import qualified Text.Blaze.Internal as BI
import Hakyll
import Contexts (siteCtx, authorLinksField)
import qualified Patterns as P
-import Utils (readingTime)
+import Utils (readingTime, normaliseUrl)

-- ---------------------------------------------------------------------------
-- Types
@@ -152,12 +152,6 @@ stripHtmlTags = go
    skipApos (_:rs) = skipApos rs
    skipApos [] = []

--- | Normalise a page URL for backlink map lookup (strip trailing .html).
-normUrl :: String -> String
-normUrl u
-  | ".html" `isSuffixOf` u = take (length u - 5) u
-  | otherwise = u
-
pad2 :: (Show a, Integral a) => a -> String
pad2 n = if n < 10 then "0" ++ show n else show n
@@ -814,10 +808,10 @@ statsRules tags = do
          blSet = Set.fromList (map fst blPairs)
          orphanCount = length
            [ p | p <- allPIs
-            , not (Set.member (normUrl (piUrl p)) blSet) ]
+            , not (Set.member (normaliseUrl (piUrl p)) blSet) ]
          mostLinked = listToMaybe (sortBy (comparing (Down . snd)) blPairs)
          mostLinkedInfo = mostLinked >>= \(url, ct) ->
-            let mTitle = piTitle <$> find (\p -> normUrl (piUrl p) == url) allPIs
+            let mTitle = piTitle <$> find (\p -> normaliseUrl (piUrl p) == url) allPIs
            in Just (url, ct, fromMaybe url mTitle)

      -- ----------------------------------------------------------------
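The orphan count above reduces to a set-membership filter: a page is an orphan when its normalised URL never appears as a backlink-map key. A minimal model with illustrative URLs (the `piUrl`/`allPIs` machinery and normalisation are elided):

```haskell
-- Minimal model of the orphan count in statsRules: pages whose
-- (already-normalised) URL is absent from the backlink key set.
-- The URLs here are illustrative, not from the site.
import qualified Data.Set as Set

orphanCount :: [String] -> Set.Set String -> Int
orphanCount pages blSet =
  length [ p | p <- pages, not (Set.member p blSet) ]

main :: IO ()
main = print (orphanCount ["/a", "/b", "/c"] (Set.fromList ["/a"]))
-- prints 2
```

This is also why the `normUrl` → `normaliseUrl` swap matters: both the keys in the set and the probe values must be produced by the same normaliser, or every page looks like an orphan.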
@@ -42,25 +42,43 @@ expandTag t =
  let segs = wordsBy (== '/') t
  in [ intercalate "/" (take n segs) | n <- [1 .. length segs] ]

--- | Top-level tags that own a section URL outside the tag system, and
--- therefore must NOT be created as tag pages — doing so would
--- collide with a section landing route. The literal @"photography"@
--- is the only one currently affected: every photo's @tags:@ list
--- begins with the bare @"photography"@ portal tag (per the section's
--- convention), and 'tagIdentifier' would route that to
--- @"photography/index.html"@ — already owned by
--- @photographyLandingRules@.
---
--- Sub-tags (@photography/landscape@, @photography/film@, …) are
--- unaffected; they keep their tag pages because no section landing
--- claims those URLs.
---
--- Other portal tags (@music@, @poetry@, @fiction@, …) don't appear
--- here because their content types don't currently feed
--- 'tagIndexable', so the top-level tag never enters the tag system.
--- Add to this set if that ever changes.
+-- | Top-level tags whose @<tag>/index.html@ route would collide with a
+-- section landing page created by another rule. 'tagIdentifier' routes
+-- every tag to @"<tag>/index.html"@, so any directory-style URL owned
+-- elsewhere in the site is reserved.
+--
+-- The reserved set:
+--
+-- * @"photography"@ — owned by 'Photography.photographyLandingRules'.
+--   Every photo's @tags:@ list begins with the bare @"photography"@
+--   portal tag; without this exclusion, the tag system would clobber
+--   the section landing.
+-- * @"blog"@ — owned by 'Pagination.blogPaginateRules'
+--   (@/blog/index.html@ + @/blog/page/N/@).
+-- * @"essays"@ — owned by the @essays/index.html@ create rule
+--   in 'Site.rules'.
+-- * @"music"@ — owned by the music catalog at @/music/index.html@.
+-- * @"authors"@ — owned by 'Authors.applyAuthorRules'
+--   (every author lives at @/authors/<slug>/@).
+-- * @"build"@ — owned by 'Stats.statsRules' (@/build/index.html@).
+-- * @"stats"@ — owned by 'Stats.statsRules' (@/stats/index.html@).
+--
+-- Sub-tags like @photography/landscape@ or @blog/announcements@ are
+-- unaffected; their routes don't collide with any section landing.
+--
+-- If you add a new section that owns a single top-level URL segment,
+-- add the segment here so a content tag of the same name doesn't
+-- silently shadow it.
sectionOwnedTopLevelTags :: [String]
-sectionOwnedTopLevelTags = ["photography"]
+sectionOwnedTopLevelTags =
+  [ "photography"
+  , "blog"
+  , "essays"
+  , "music"
+  , "authors"
+  , "build"
+  , "stats"
+  ]

-- | All expanded tags for an item (reads the "tags" metadata field).
-- Filters out any 'sectionOwnedTopLevelTags' to prevent route
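A standalone check of `expandTag`'s prefix expansion — a tiny local splitter stands in for `wordsBy` (which comes from the split package), so the sketch runs with only base:

```haskell
-- expandTag expands a hierarchical tag into all of its prefixes, so
-- "a/b/c" contributes pages for "a", "a/b", and "a/b/c". splitOnSlash
-- is a local stand-in for Data.List.Split.wordsBy (== '/').
import Data.List (intercalate)

splitOnSlash :: String -> [String]
splitOnSlash s = case break (== '/') s of
  (a, [])     -> [a]
  (a, _:rest) -> a : splitOnSlash rest

expandTag :: String -> [String]
expandTag t =
  let segs = splitOnSlash t
  in [ intercalate "/" (take n segs) | n <- [1 .. length segs] ]

main :: IO ()
main = print (expandTag "photography/landscape/film")
-- ["photography","photography/landscape","photography/landscape/film"]
```

Filtering the result against `sectionOwnedTopLevelTags` then drops only the bare `"photography"` entry; the sub-tag pages survive, which is exactly the collision behaviour the comment above describes.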
@@ -16,10 +16,16 @@ module Utils
  , trim
  , authorSlugify
  , authorNameOf
+  , normaliseUrl
+  , percentDecode
  ) where

import Data.Char (isAlphaNum, isSpace, toLower)
+import Data.Maybe (fromMaybe)
+import qualified Data.ByteString as BS
import qualified Data.Text as T
+import qualified Data.Text.Encoding as TE
+import qualified Data.Text.Encoding.Error as TE

-- | Count the number of words in a string (split on whitespace).
wordCount :: String -> Int
@ -76,3 +82,52 @@ authorSlugify = map (\c -> if c == ' ' then '-' else c)
|
|||
-- which routes everything through @/authors/{slug}/@).
|
||||
authorNameOf :: String -> String
authorNameOf s = trim (takeWhile (/= '|') s)

-- ---------------------------------------------------------------------------
-- URL normalisation
-- ---------------------------------------------------------------------------

-- | Normalise an internal URL as a stable map key:
--
-- * strip everything from a @?@ or @#@ onward,
-- * ensure a leading @/@,
-- * strip a trailing @.html@ extension,
-- * percent-decode the path so @/essays/caf%C3%A9@ and
--   @/essays/café@ collide on the same key.
--
-- This is the canonical normaliser used by 'Backlinks' (writing keys into
-- @data/backlinks.json@) and 'Stats' (looking up those keys for orphan
-- counting and most-linked detection). Keeping a single implementation here
-- prevents the two surfaces from drifting on percent-encoding or fragment
-- handling.
--
-- 'SimilarLinks' uses a slightly different normalisation that preserves
-- trailing slashes on directory-style URLs (because @embed.py@ produces
-- keys like @/blog/@ rather than @/blog/index@); see its own
-- @normaliseUrl@ for that variant.
normaliseUrl :: String -> String
normaliseUrl url =
  let t  = T.pack url
      t1 = fst (T.breakOn "?" (fst (T.breakOn "#" t)))
      t2 = if T.isPrefixOf "/" t1 then t1 else "/" `T.append` t1
      t3 = fromMaybe t2 (T.stripSuffix ".html" t2)
  in percentDecode (T.unpack t3)

-- | Decode percent-escapes (@%XX@) into raw bytes, then re-interpret the
-- resulting bytestring as UTF-8. Invalid escapes are passed through
-- verbatim so this is safe to call on already-decoded input.
percentDecode :: String -> String
percentDecode = T.unpack . TE.decodeUtf8With TE.lenientDecode . BS.pack . go
  where
    go [] = []
    go ('%':a:b:rest)
      | Just hi <- hexDigit a
      , Just lo <- hexDigit b
      = fromIntegral (hi * 16 + lo) : go rest
    go (c:rest) = fromIntegral (fromEnum c) : go rest

    hexDigit c
      | c >= '0' && c <= '9' = Just (fromEnum c - fromEnum '0')
      | c >= 'a' && c <= 'f' = Just (fromEnum c - fromEnum 'a' + 10)
      | c >= 'A' && c <= 'F' = Just (fromEnum c - fromEnum 'A' + 10)
      | otherwise = Nothing

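The same normalisation rules are easy to mirror outside Haskell when checking key parity against the JSON that Backlinks emits. A rough Python sketch of those rules — the function name is mine, not part of the build, and `unquote` stands in for `percentDecode`:

```python
from urllib.parse import unquote

def normalise_url(url: str) -> str:
    """Mirror of normaliseUrl: strip query/fragment, force a leading
    slash, drop a trailing .html, then percent-decode."""
    # Strip everything from the first '#' or '?' onward.
    for sep in ("#", "?"):
        url = url.split(sep, 1)[0]
    if not url.startswith("/"):
        url = "/" + url
    if url.endswith(".html"):
        url = url[: -len(".html")]
    # unquote() passes invalid escapes through, like percentDecode.
    return unquote(url)
```
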
@@ -6,7 +6,7 @@ constraints: any.Glob ==0.10.2,
             any.Only ==0.1,
             any.QuickCheck ==2.15.0.1,
             any.StateVar ==1.2.2,
             any.aeson ==2.2.1.0,
             any.aeson ==2.2.2.0,
             any.aeson-pretty ==0.8.10,
             any.ansi-terminal ==1.1,
             any.ansi-terminal-types ==1.1,

@@ -40,6 +40,7 @@ constraints: any.Glob ==0.10.2,
             any.cassava ==0.5.4.1,
             any.cborg ==0.2.10.0,
             any.cereal ==0.5.8.3,
             any.character-ps ==0.1,
             any.citeproc ==0.8.1.1,
             any.colour ==2.3.7,
             any.commonmark ==0.2.6.1,

@@ -143,7 +144,7 @@ constraints: any.Glob ==0.10.2,
             any.safe ==0.3.21,
             any.safe-exceptions ==0.1.7.4,
             any.scientific ==0.3.8.1,
             any.semialign ==1.3.1,
             any.semialign ==1.3.1.1,
             any.semigroupoids ==6.0.2,
             any.serialise ==0.2.6.1,
             any.simple-sendfile ==0.2.32,

@@ -171,7 +172,7 @@ constraints: any.Glob ==0.10.2,
             any.text-short ==0.1.6.1,
             any.th-abstraction ==0.6.0.0,
             any.th-compat ==0.1.7,
             any.th-lift ==0.8.6,
             any.th-lift ==0.8.7,
             any.th-lift-instances ==0.1.20,
             any.these ==1.2.1,
             any.time ==1.12.2,

@@ -15,3 +15,17 @@ This site is built with Ozymandias, a static site framework using Hakyll, Pandoc
- **Score reader:** Inline SVG score rendering for music compositions.

Configuration lives in `site.yaml` at the project root.

## Living documents

Every essay carries a small *Stability* signal in its metadata block. It answers the question "how settled is this piece?" and is computed from the page's git history rather than declared by hand:

- **volatile** — solo commit, or less than two weeks old.
- **revising** — under six commits and under three months old.
- **fairly stable** — under sixteen commits, or under one year old.
- **stable** — under thirty-one commits, or under two years old.
- **established** — anything beyond.

The classification is intentionally conservative: a fast burst of revisions early in a piece's life reads as *volatile* until enough calendar time has passed to demonstrate that the text has settled. The *Last reviewed* date in the page footer is the most recent commit touching the file.

You can pin a page's stability by adding its source path to `IGNORE.txt` in the project root; the file is cleared after every successful build, so pins are one-shot. Authors may also override the auto-calculation with `stability:` and `last-reviewed:` values in frontmatter.
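The threshold ladder above can be read as a cascading classifier over (commit count, age), checked from most to least volatile. A sketch of that reading in Python — the function name and exact day counts are illustrative, not the build's implementation:

```python
from datetime import timedelta

def stability(commits: int, age: timedelta) -> str:
    """Classify a page from its git history, testing the most
    volatile bucket first, per the ladder in the README."""
    if commits <= 1 or age < timedelta(weeks=2):
        return "volatile"
    if commits < 6 and age < timedelta(days=90):
        return "revising"
    if commits < 16 or age < timedelta(days=365):
        return "fairly stable"
    if commits < 31 or age < timedelta(days=730):
        return "stable"
    return "established"
```

Note how the first rule captures the "conservative" behaviour described above: a page with many commits but little calendar age still reads as volatile.
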
@@ -10,5 +10,6 @@ Some places to start:
- The [feature tour](/essays/feature-tour.html) demonstrates typography, sidenotes, math, code, citations, and the epistemic profile.
- The [library](/library.html) groups all content by portal.
- Browse [poetry](/poetry/), [fiction](/fiction/), [blog posts](/blog/), or [compositions](/music/).
- Feeling lucky? Jump to a <a href="#" data-random>random page</a> (handled by `static/js/random.js`, which `default.html` only loads on the home page; any element with `data-random` or the id `#random-page-btn` opts in).

Edit `site.yaml` to set your name, URL, navigation links, and portal taxonomy.

@@ -4,8 +4,8 @@ version: 0.1.0.0
synopsis: Static site builder for the Ozymandias template
license: MIT
license-file: LICENSE
author: Your Name
maintainer: you@example.com
author: Levi Neuwirth
maintainer: ln@levineuwirth.org
build-type: Simple

executable site

@@ -42,11 +42,12 @@ executable site
      Filters.Images
      Filters.Score
      Filters.Viz
      Inlines
      Utils
  build-depends:
      base >= 4.18 && < 5,
      hakyll >= 4.16 && < 4.17,
      pandoc >= 3.1 && < 3.7,
      pandoc >= 3.1 && < 3.8,
      pandoc-types >= 1.23 && < 1.24,
      text >= 2.0 && < 2.2,
      containers >= 0.6 && < 0.8,

@@ -1,86 +0,0 @@
/* citations.js — hover tooltip for inline citation markers.
   On hover of a .cite-marker, reads the matching bibliography entry from
   the DOM and shows it in a floating tooltip. On click, follows the href
   to jump to the bibliography section. Phase 3 popups.js can supersede this. */

(function () {
  'use strict';

  let activeTooltip = null;
  let hideTimer = null;

  function makeTooltip(html) {
    const el = document.createElement('div');
    el.className = 'cite-tooltip';
    el.innerHTML = html;
    el.addEventListener('mouseenter', () => clearTimeout(hideTimer));
    el.addEventListener('mouseleave', scheduleHide);
    return el;
  }

  function positionTooltip(tooltip, anchor) {
    document.body.appendChild(tooltip);
    const aRect = anchor.getBoundingClientRect();
    const tRect = tooltip.getBoundingClientRect();

    let left = aRect.left + window.scrollX;
    let top = aRect.top + window.scrollY - tRect.height - 10;

    // Keep horizontally within viewport with margin
    const maxLeft = window.innerWidth - tRect.width - 12;
    left = Math.max(8, Math.min(left, maxLeft));

    // Flip below anchor if not enough room above
    if (top < window.scrollY + 8) {
      top = aRect.bottom + window.scrollY + 10;
    }

    tooltip.style.left = left + 'px';
    tooltip.style.top = top + 'px';
  }

  function scheduleHide() {
    hideTimer = setTimeout(() => {
      if (activeTooltip) {
        activeTooltip.remove();
        activeTooltip = null;
      }
    }, 180);
  }

  function getRefHtml(refEl) {
    // Strip the [N] number span, return the remaining innerHTML
    const clone = refEl.cloneNode(true);
    const num = clone.querySelector('.ref-num');
    if (num) num.remove();
    return clone.innerHTML.trim();
  }

  function init() {
    document.querySelectorAll('.cite-marker').forEach(marker => {
      const link = marker.querySelector('a.cite-link');
      if (!link) return;

      const href = link.getAttribute('href');
      if (!href || !href.startsWith('#')) return;

      const refEl = document.getElementById(href.slice(1));
      if (!refEl) return;

      marker.addEventListener('mouseenter', () => {
        clearTimeout(hideTimer);
        if (activeTooltip) { activeTooltip.remove(); }
        activeTooltip = makeTooltip(getRefHtml(refEl));
        positionTooltip(activeTooltip, marker);
      });

      marker.addEventListener('mouseleave', scheduleHide);
    });
  }

  if (document.readyState === 'loading') {
    document.addEventListener('DOMContentLoaded', init);
  } else {
    init();
  }
})();

@@ -1,5 +1,5 @@
<!DOCTYPE html>
<html lang="en">
<html lang="$site-language$">
<head>
  $partial("templates/partials/head.html")$
</head>

@@ -32,6 +32,7 @@ $if(photography-map)$<script src="/leaflet/leaflet.markercluster.js" defer></scr
$if(photography-map)$<script src="/js/photography-map.js" defer></script>$endif$
$for(page-scripts)$<script src="/$script-src$" defer></script>$endfor$
$if(math)$
<!-- KaTeX JS — see the rationale comment by the matching CSS load in head.html. -->
<script defer src="https://cdn.jsdelivr.net/npm/katex@0.16.11/dist/katex.min.js"></script>
<script defer src="/js/katex-bootstrap.js"></script>
$endif$

@@ -26,6 +26,12 @@ $if(photography-map)$<link rel="stylesheet" href="/leaflet/MarkerCluster.css">$e
$if(photography-map)$<link rel="stylesheet" href="/leaflet/MarkerCluster.Default.css">$endif$
<link rel="stylesheet" href="/css/print.css" media="print">
$if(math)$
<!-- KaTeX CSS+fonts (~280 KB) is loaded from jsDelivr rather than vendored.
     The framework vendors what is privacy/CSP-critical (Leaflet for the
     photography map, the ML model for client-side semantic search); KaTeX
     and Vega below are loaded only on pages that opt into them via
     frontmatter, so the third-party request is bounded and predictable.
     Swap the URL for a local /katex/ path if your CSP requires self-only. -->
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/katex@0.16.11/dist/katex.min.css">
$endif$
$if(search)$

@@ -34,6 +40,11 @@ $endif$
<script src="/js/utils.js"></script>
<script src="/js/theme.js"></script>
$if(viz)$
<!-- Vega + Vega-Lite + Vega-Embed (~1.5 MB total) is loaded from jsDelivr
     rather than vendored. Same rationale as KaTeX above: opt-in per page
     via `viz: true` frontmatter, so the request only happens on pages that
     actually render data visualisations. Swap for local /vega/ paths if
     your CSP requires self-only. -->
<link rel="stylesheet" href="/css/viz.css">
<script src="https://cdn.jsdelivr.net/npm/vega@5" defer></script>
<script src="https://cdn.jsdelivr.net/npm/vega-lite@5" defer></script>

@@ -10,7 +10,7 @@
  </div>
  $endif$
  <div class="meta-row meta-authors">
    <span class="meta-label">by</span>$if(poet)$$poet$$else$$for(author-links)$<a href="$author-url$">$author-name$</a>$sep$, $endfor$$endif$
    <span class="meta-label">by</span>$for(author-links)$<a href="$author-url$">$author-name$</a>$sep$, $endfor$
  </div>
  $if(affiliation-links)$
  <div class="meta-row meta-affiliation">

@@ -1,5 +1,5 @@
<!DOCTYPE html>
<html lang="en">
<html lang="$site-language$">
<head>
  <meta charset="UTF-8">
  <meta name="viewport" content="width=device-width, initial-scale=1.0">

@@ -67,6 +67,9 @@ verify_or_warn() {
        echo "leaflet: sha256 mismatch for $pin_key" >&2
        echo " expected $want" >&2
        echo " got $got" >&2
        # Remove the bad download so a retry doesn't see a stale file and
        # think it's already present. Matches download-model.sh's behavior.
        rm -f "$file"
        return 1
    fi
}

@@ -0,0 +1,74 @@
#!/usr/bin/env bash
# download-pdfjs.sh — Vendor Mozilla's prebuilt PDF.js viewer into static/pdfjs/.
#
# The Haskell link filter (build/Filters/Links.hs) rewrites every root-relative
# .pdf link to open through /pdfjs/web/viewer.html, so this viewer must be
# present in static/ for the site build to produce working PDF links.
#
# Run once before deploying. The extracted viewer is gitignored (~18 MB
# uncompressed); re-running is safe — the script skips when the viewer
# already exists.
#
# To bump the pinned version, set PDFJS_VERSION, re-run, then update
# tools/pdfjs-checksums.sha256 with the new archive SHA-256.

set -euo pipefail

REPO_ROOT="$(cd "$(dirname "$0")/.." && pwd)"
PDFJS_DIR="$REPO_ROOT/static/pdfjs"
CHECKSUMS="$REPO_ROOT/tools/pdfjs-checksums.sha256"

PDFJS_VERSION="${PDFJS_VERSION:-5.6.205}"
ARCHIVE="pdfjs-${PDFJS_VERSION}-dist.zip"
URL="https://github.com/mozilla/pdf.js/releases/download/v${PDFJS_VERSION}/${ARCHIVE}"

if [ -f "$PDFJS_DIR/web/viewer.html" ]; then
    echo "pdfjs: already present at $PDFJS_DIR/web/viewer.html (skipping)"
    exit 0
fi

command -v unzip >/dev/null 2>&1 || {
    echo "download-pdfjs: unzip not found — install it (pacman -S unzip / apt install unzip)" >&2
    exit 1
}

tmpdir=$(mktemp -d)
trap 'rm -rf "$tmpdir"' EXIT

echo "pdfjs: downloading $ARCHIVE"
curl -fsSL --progress-bar "$URL" -o "$tmpdir/$ARCHIVE"

if [ -f "$CHECKSUMS" ]; then
    want=$(awk -v p="$ARCHIVE" '$2 == p { print $1; exit }' "$CHECKSUMS")
    if [ -n "$want" ]; then
        got=$(sha256sum "$tmpdir/$ARCHIVE" | awk '{ print $1 }')
        if [ "$got" != "$want" ]; then
            echo "pdfjs: sha256 mismatch for $ARCHIVE" >&2
            echo " expected $want" >&2
            echo " got $got" >&2
            exit 1
        fi
        echo "pdfjs: sha256 verified"
    else
        echo "pdfjs: no pinned checksum for $ARCHIVE in $CHECKSUMS — skipping verification" >&2
    fi
else
    echo "pdfjs: $CHECKSUMS not found — skipping sha256 verification" >&2
fi

mkdir -p "$PDFJS_DIR"
echo "pdfjs: extracting to $PDFJS_DIR"
unzip -q -o "$tmpdir/$ARCHIVE" -d "$PDFJS_DIR"

# Strip artifacts that are never needed by site users. Saves ~11 MB on
# disk and in rsync; none are referenced by viewer.html at runtime.
#   *.map                             sourcemaps (devtools-only)
#   web/debugger.mjs, debugger.css    PDF.js developer panel
#   web/compressed.tracemonkey-*.pdf  demo PDF shipped as the viewer's default
echo "pdfjs: stripping unused artifacts"
find "$PDFJS_DIR" -type f -name '*.map' -delete
rm -f "$PDFJS_DIR/web/debugger.mjs" "$PDFJS_DIR/web/debugger.css"
rm -f "$PDFJS_DIR"/web/compressed.tracemonkey-*.pdf

echo "pdfjs: done. static/pdfjs/web/viewer.html is ready."
echo "       Run 'make build' to include it in _site/."

@@ -43,7 +43,23 @@ MIN_SCORE = 0.30  # similar-links: discard weak matches
MIN_PARA_CHARS = 80    # semantic: skip very short paragraphs
MAX_PARA_CHARS = 1000  # semantic: truncate before embedding

EXCLUDE_URLS = {"/search/", "/build/", "/404.html", "/feed.xml", "/music/feed.xml"}
# Pages that should not appear in similar-links suggestions or the semantic
# index. Search/build/stats are meta-pages with no prose; feeds are XML.
# Photography listing surfaces (map/contact-sheet/by-year) are index pages,
# not content. URLs must match exactly what `_url_from_path` produces (i.e.,
# directory-style URLs end with `/`, file-style URLs include the extension).
EXCLUDE_URLS = {
    "/search.html",
    "/build/",
    "/stats/",
    "/library.html",
    "/new.html",
    "/feed.xml",
    "/music/feed.xml",
    "/photography/feed.xml",
    "/photography/map/",
    "/photography/contact-sheet/",
}
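The exact-match requirement matters because the path-to-URL mapping distinguishes `index.html` files (directory-style URLs) from everything else (file-style URLs with the extension). A hypothetical sketch of a mapping with that shape — the real `_url_from_path` lives elsewhere in embed.py and may differ in detail:

```python
from pathlib import Path

SITE_DIR = Path("_site")  # assumed build output root

def url_from_path(html_path: Path) -> str:
    """Directory-style URL for index.html files, file-style URL
    (with extension) for everything else — the shape that the
    EXCLUDE_URLS set keys on."""
    rel = html_path.relative_to(SITE_DIR)
    if rel.name == "index.html":
        parent = rel.parent.as_posix()
        return "/" if parent == "." else f"/{parent}/"
    return f"/{rel.as_posix()}"
```
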
@@ -88,61 +104,66 @@ def _title(soup: BeautifulSoup, url: str) -> str:
    return re.split(r"\s+[—–-]\s+", raw)[0].strip()

# ---------------------------------------------------------------------------
# Page-level extraction (for similar-links)
# Extraction
# ---------------------------------------------------------------------------
#
# A single pass over each HTML file produces both:
#
#   * a page-level record (concatenated body text, for similar-links)
#   * a list of paragraph-level records (for the semantic index)
#
# Both surfaces want the same soup; an earlier version of this script
# parsed each file twice. The combined pass keeps BeautifulSoup work to
# one allocation per file.

def extract_one(html_path: Path) -> tuple[dict | None, list[dict]]:
    """Parse one HTML file and return (page-record-or-None, paragraph-list).

    Returns ``(None, [])`` when the URL is excluded, when the file has no
    ``#markdownBody`` (so it isn't a content page), or when the body text
    is too short to be meaningful.
    """
    url = _url_from_path(html_path)
    if url in EXCLUDE_URLS:
        return None, []

def extract_page(html_path: Path) -> dict | None:
    raw = html_path.read_text(encoding="utf-8", errors="replace")
    soup = BeautifulSoup(raw, "html.parser")
    url = _url_from_path(html_path)

    if url in EXCLUDE_URLS:
        return None
    body = soup.select_one("#markdownBody")
    if body is None:
        return None
        return None, []

    title = _title(soup, url)
    # _clean_soup mutates the tree, so it must run AFTER we've captured
    # the title (selectors like h1 may live inside #markdownBody on some
    # layouts) and BEFORE we read body text for both surfaces.
    _clean_soup(soup)

    # Page-level record.
    text = re.sub(r"\s+", " ", body.get_text(" ", strip=True)).strip()
    if len(text) < 100:
        return None

    return {"url": url, "title": title, "text": text}

# ---------------------------------------------------------------------------
# Paragraph-level extraction (for semantic search)
# ---------------------------------------------------------------------------

def extract_paragraphs(html_path: Path, url: str, title: str) -> list[dict]:
    raw = html_path.read_text(encoding="utf-8", errors="replace")
    soup = BeautifulSoup(raw, "html.parser")
    body = soup.select_one("#markdownBody")
    if body is None:
        return []

    _clean_soup(soup)

    paras = []
    heading = title  # track current section heading
    page = None if len(text) < 100 else {
        "url": url, "title": title, "text": text,
    }

    # Paragraph-level records — re-traverse the same (now-cleaned) body.
    paras: list[dict] = []
    heading = title
    for el in body.find_all(["h1", "h2", "h3", "h4", "p", "li", "blockquote"]):
        if el.name in ("h1", "h2", "h3", "h4"):
            heading = el.get_text(" ", strip=True)
            continue
        text = re.sub(r"\s+", " ", el.get_text(" ", strip=True)).strip()
        if len(text) < MIN_PARA_CHARS:
        para_text = re.sub(r"\s+", " ", el.get_text(" ", strip=True)).strip()
        if len(para_text) < MIN_PARA_CHARS:
            continue
        paras.append({
            "url": url,
            "title": title,
            "heading": heading,
            "excerpt": text[:200] + ("…" if len(text) > 200 else ""),
            "text": text[:MAX_PARA_CHARS],
            "excerpt": para_text[:200] + ("…" if len(para_text) > 200 else ""),
            "text": para_text[:MAX_PARA_CHARS],
        })

    return paras
    return page, paras

# ---------------------------------------------------------------------------
# Main

@@ -157,17 +178,17 @@ def main() -> int:
        print("embed.py: all outputs up to date — skipping")
        return 0

    # --- Extract pages + paragraphs in one pass ---
    # --- Extract pages + paragraphs in a single soup-per-file pass ---
    print("embed.py: extracting pages…")
    pages = []
    paragraphs = []

    for html in sorted(SITE_DIR.rglob("*.html")):
        page = extract_page(html)
        page, paras = extract_one(html)
        if page is None:
            continue
        pages.append(page)
        paragraphs.extend(extract_paragraphs(html, page["url"], page["title"]))
        paragraphs.extend(paras)

    if not pages:
        print("embed.py: no indexable pages found", file=sys.stderr)

@@ -35,6 +35,7 @@ images are logged and the rest of the walk continues.

from __future__ import annotations

import argparse
import json
import shutil
import subprocess

@@ -57,9 +58,15 @@ IMAGE_EXTS = {".jpg", ".jpeg", ".png"}
# Mapping from EXIF field names (as exposed by exiftool / Pillow) to the
# sidecar keys consumed by Hakyll. Hakyll's fields are deliberately
# lowercase-with-hyphens, matching the photographyCtx convention.
# Ordered for diff-readability: identification (camera/lens), geometry
# (width/height of the delivered JPEG — threaded through to <img> CLS
# attrs), exposure (composite string + components), then capture
# circumstance (date + GPS).
SIDECAR_KEYS = [
    "camera",
    "lens",
    "width",
    "height",
    "exposure",
    "shutter",
    "aperture",

@@ -67,12 +74,6 @@ SIDECAR_KEYS = [
    "focal-length",
    "captured",
    "geo",
    # Pixel dimensions of the delivered (resized, EXIF-stripped) JPEG.
    # Threaded through to the Hakyll photographyCtx and emitted as
    # width / height attrs on every <img> tag — prevents cumulative
    # layout shift while photos load.
    "width",
    "height",
]

@@ -390,10 +391,42 @@ def _read_one(image: Path) -> dict[str, Any]:
    return _read_exif_via_pillow(image)


def _process_one(image: Path, counters: dict[str, int]) -> None:
    """Extract EXIF for a single image, updating counters.

    Skips when the sidecar is fresher than the image; logs and counts
    failures so the caller can decide whether they're fatal.
    """
    if image.name.startswith(".") or image.name.endswith(".tmp"):
        return
    sidecar = _sidecar_path(image)
    if not _is_stale(image, sidecar):
        counters["skipped"] += 1
        return
    try:
        data = _read_one(image)
    except Exception as e:  # noqa: BLE001 — keep walking
        print(f"extract-exif: {image}: {e}", file=sys.stderr)
        counters["failed"] += 1
        return
    # Always write a sidecar — even if empty — so the consumer doesn't
    # need to branch on existence. An empty sidecar is the explicit
    # signal that "we tried; nothing to extract" (typical for film scans).
    _atomic_write_yaml(sidecar, data)
    counters["written"] += 1


def main() -> int:
    if not CONTENT_DIR.exists():
        print(f"extract-exif: {CONTENT_DIR} does not exist — skipping.", file=sys.stderr)
        return 0
    parser = argparse.ArgumentParser(
        description="Write EXIF sidecars for photography images.",
    )
    parser.add_argument(
        "--file",
        type=Path,
        help="Process a single image instead of walking content/photography/. "
             "Used by tools/import-photo.sh to avoid a full re-walk per import.",
    )
    args = parser.parse_args()

    using_exiftool = _exiftool_available()
    print(

@@ -402,39 +435,35 @@ def main() -> int:
        file=sys.stderr,
    )

    written = 0
    skipped = 0
    failed = 0
    counters = {"written": 0, "skipped": 0, "failed": 0}

    if args.file is not None:
        if not args.file.exists():
            print(f"extract-exif: --file {args.file} does not exist", file=sys.stderr)
            return 1
        if args.file.suffix.lower() not in IMAGE_EXTS:
            print(
                f"extract-exif: --file {args.file}: unsupported extension"
                f" (expected one of {sorted(IMAGE_EXTS)})",
                file=sys.stderr,
            )
            return 1
        _process_one(args.file, counters)
    else:
        if not CONTENT_DIR.exists():
            print(
                f"extract-exif: {CONTENT_DIR} does not exist — skipping.",
                file=sys.stderr,
            )
            return 0
        for image in sorted(CONTENT_DIR.rglob("*")):
            if image.suffix.lower() not in IMAGE_EXTS:
                continue
            # Skip the WebP companions (extension wouldn't match anyway, but
            # be explicit) and any tmp / hidden files.
            if image.name.startswith(".") or image.name.endswith(".tmp"):
                continue

            sidecar = _sidecar_path(image)
            if not _is_stale(image, sidecar):
                skipped += 1
                continue

            try:
                data = _read_one(image)
            except Exception as e:  # noqa: BLE001 — keep walking
                print(f"extract-exif: {image}: {e}", file=sys.stderr)
                failed += 1
                continue

            # Always write a sidecar — even if it's empty — so the consumer
            # doesn't need to branch on existence. An empty sidecar is the
            # explicit signal that "we tried; nothing to extract" (typical
            # for film scans).
            _atomic_write_yaml(sidecar, data)
            written += 1
            _process_one(image, counters)

    print(
        f"extract-exif: {written} written, {skipped} skipped, {failed} failed",
        f"extract-exif: {counters['written']} written, "
        f"{counters['skipped']} skipped, {counters['failed']} failed",
        file=sys.stderr,
    )
    return 0

@@ -23,6 +23,7 @@ a palette extraction error.

from __future__ import annotations

import argparse
import sys
from pathlib import Path
from typing import Any

@@ -35,8 +36,8 @@ CONTENT_DIR = REPO_ROOT / "content" / "photography"

IMAGE_EXTS = {".jpg", ".jpeg", ".png"}

# Number of swatches in the rendered strip. Five matches the design in
# PHOTOGRAPHY.md and the existing `photo-palette` CSS, which sets
# Number of swatches in the rendered strip. Five matches the existing
# `photo-palette` CSS in static/css/photography.css, which sets
# `display: flex; height: 0.75rem;` and divides the bar evenly. Bumping
# this requires a CSS revisit — the bar reads as a unified strip up to
# about 7 swatches; beyond that the bands become too narrow to perceive.

@@ -77,41 +78,65 @@ def _extract_palette(image: Path) -> list[str]:
    return [_hex(rgb) for rgb in palette[:N_SWATCHES]]


def _process_one(image: Path, counters: dict[str, int]) -> None:
    """Extract a palette for one image, updating counters."""
    if image.name.startswith(".") or image.name.endswith(".tmp"):
        return
    sidecar = _sidecar_path(image)
    if not _is_stale(image, sidecar):
        counters["skipped"] += 1
        return
    try:
        palette = _extract_palette(image)
    except Exception as e:  # noqa: BLE001 — keep walking
        print(f"extract-palette: {image}: {e}", file=sys.stderr)
        counters["failed"] += 1
        return
    _atomic_write_yaml(sidecar, {"palette": palette})
    counters["written"] += 1


def main() -> int:
    parser = argparse.ArgumentParser(
        description="Write 5-color palette sidecars for photography images.",
    )
    parser.add_argument(
        "--file",
        type=Path,
        help="Process a single image instead of walking content/photography/. "
             "Used by tools/import-photo.sh to avoid a full re-walk per import.",
    )
    args = parser.parse_args()

    counters = {"written": 0, "skipped": 0, "failed": 0}

    if args.file is not None:
        if not args.file.exists():
            print(f"extract-palette: --file {args.file} does not exist", file=sys.stderr)
            return 1
        if args.file.suffix.lower() not in IMAGE_EXTS:
            print(
                f"extract-palette: --file {args.file}: unsupported extension"
                f" (expected one of {sorted(IMAGE_EXTS)})",
                file=sys.stderr,
            )
            return 1
        _process_one(args.file, counters)
    else:
        if not CONTENT_DIR.exists():
            print(
                f"extract-palette: {CONTENT_DIR} does not exist — skipping.",
                file=sys.stderr,
            )
            return 0

        written = 0
        skipped = 0
        failed = 0

        for image in sorted(CONTENT_DIR.rglob("*")):
            if image.suffix.lower() not in IMAGE_EXTS:
                continue
            if image.name.startswith(".") or image.name.endswith(".tmp"):
                continue

            sidecar = _sidecar_path(image)
            if not _is_stale(image, sidecar):
                skipped += 1
                continue

            try:
                palette = _extract_palette(image)
            except Exception as e:  # noqa: BLE001 — keep walking
                print(f"extract-palette: {image}: {e}", file=sys.stderr)
                failed += 1
                continue

            _atomic_write_yaml(sidecar, {"palette": palette})
            written += 1
            _process_one(image, counters)

    print(
        f"extract-palette: {written} written, {skipped} skipped, {failed} failed",
        f"extract-palette: {counters['written']} written, "
        f"{counters['skipped']} skipped, {counters['failed']} failed",
        file=sys.stderr,
    )
    return 0

@@ -26,8 +26,8 @@
# * uv + .venv (Pillow + colorthief + pyyaml) for sidecar extraction
#
# Originals are NEVER copied into the repo verbatim — only the resized
# delivery JPEG. Per PHOTOGRAPHY.md, originals live outside source
# control (your local archive, NAS, or backup).
# delivery JPEG. RAW / TIFF / PSD originals live outside source control
# (your local archive, NAS, or backup) and are refused by .gitignore.

set -euo pipefail

@@ -126,7 +126,7 @@ chmod 644 "$TARGET"
# ---------------------------------------------------------------------------

echo "import-photo: extracting EXIF sidecar..."
( cd "$REPO_ROOT" && uv run python tools/extract-exif.py ) || true
( cd "$REPO_ROOT" && uv run python tools/extract-exif.py --file "$TARGET" ) || true

if [ ! -f "$EXIF_SIDECAR" ]; then
    # Empty sidecar so the consuming Hakyll field has something to read

@@ -144,11 +144,11 @@ echo "import-photo: stripping EXIF from delivered file..."
magick mogrify -strip "$TARGET"

# ---------------------------------------------------------------------------
# Step 4: extract palette (does its own walk; idempotent on already-done photos)
# Step 4: extract palette for the new photo only (no full re-walk)
# ---------------------------------------------------------------------------

echo "import-photo: extracting palette sidecar..."
( cd "$REPO_ROOT" && uv run python tools/extract-palette.py ) || true
( cd "$REPO_ROOT" && uv run python tools/extract-palette.py --file "$TARGET" ) || true

# ---------------------------------------------------------------------------
# Step 5: scaffold index.md

@@ -0,0 +1,5 @@
7135149f7cffa1a573466c6e4d8423ed73b62fd2332c575bf738a0d033f70df7 config.json
da0e79933b9ed51798a3ae27893d3c5fa4a201126cef75586296df9b4d2c62a0 tokenizer.json
9261e7d79b44c8195c1cada2b453e55b00aeb81e907a6664974b4d7776172ab3 tokenizer_config.json
b6d346be366a7d1d48332dbc9fdf3bf8960b5d879522b7799ddba59e76237ee3 special_tokens_map.json
afdb6f1a0e45b715d0bb9b11772f032c399babd23bfc31fed1c170afc848bdb1 onnx/model_quantized.onnx

@@ -0,0 +1 @@
0555ef47464e456125dcd0b9742cfa2aaed6d282e804e4dd1f1b99316b46ac00 pdfjs-5.6.205-dist.zip

@@ -0,0 +1,127 @@
#!/usr/bin/env bash
# preset-signing-passphrase.sh — Cache the signing subkey passphrase in the
# dedicated signing agent so that `make sign` can run without a prompt.
#
# Reads the master signing-key fingerprint from site.yaml's `gpg-fingerprint`
# field, derives the signing subkey's keygrip via `gpg --with-keygrip`, and
# hands the passphrase to gpg-preset-passphrase. Cached for the TTL configured
# in $GNUPGHOME/gpg-agent.conf (typically 24 h via max-cache-ttl).
#
# Run ONCE in an interactive terminal after system boot or after the agent
# cache expires.
#
# Usage:
#   ./tools/preset-signing-passphrase.sh
#
# Environment:
#   GNUPGHOME — defaults to $HOME/.gnupg-signing (matches sign-site.sh).

set -euo pipefail

REPO_ROOT="$(cd "$(dirname "$0")/.." && pwd)"
SITE_YAML="$REPO_ROOT/site.yaml"

GNUPGHOME="${GNUPGHOME:-$HOME/.gnupg-signing}"

# gpg-preset-passphrase ships in different paths depending on distro.
PRESET_CANDIDATES=(
    /usr/lib/gnupg/gpg-preset-passphrase
    /usr/libexec/gpg-preset-passphrase
    /usr/lib/gnupg2/gpg-preset-passphrase
)
GPG_PRESET=""
for candidate in "${PRESET_CANDIDATES[@]}"; do
    if [ -x "$candidate" ]; then
        GPG_PRESET="$candidate"
        break
    fi
done
if [ -z "$GPG_PRESET" ]; then
    echo "Error: gpg-preset-passphrase not found in any of:" >&2
    printf '  %s\n' "${PRESET_CANDIDATES[@]}" >&2
    echo "Install gnupg's utilities package (e.g. on Debian: apt install gnupg-agent)." >&2
    exit 1
fi

# ---------------------------------------------------------------------------
# Read the master fingerprint from site.yaml
# ---------------------------------------------------------------------------

if [ ! -f "$SITE_YAML" ]; then
    echo "Error: $SITE_YAML not found." >&2
    exit 1
fi

# Minimal parser: looks for a line of the form `gpg-fingerprint: "..."` or
# `gpg-fingerprint: ...`. Strips quotes and surrounding whitespace.
FINGERPRINT="$(awk -F: '
    /^[[:space:]]*gpg-fingerprint[[:space:]]*:/ {
        sub(/^[^:]*:[[:space:]]*/, "")
        gsub(/^["'"'"' ]+|["'"'"' ]+$/, "")
        print
        exit
    }
' "$SITE_YAML")"

if [ -z "$FINGERPRINT" ]; then
    echo "Error: gpg-fingerprint is empty in $SITE_YAML." >&2
    echo "       Set it to the 40-character master-key fingerprint of your signing key," >&2
    echo "       then re-run this script." >&2
    exit 1
fi

# ---------------------------------------------------------------------------
# Derive the signing subkey's keygrip
# ---------------------------------------------------------------------------
#
# `gpg --with-keygrip --list-secret-keys <fpr>` output groups each subkey
# with its capability flags `[...]` and a `Keygrip = ...` line. We want the
# keygrip of the first subkey whose capability set contains S (sign).

KEYGRIP="$(GNUPGHOME="$GNUPGHOME" gpg \
    --homedir "$GNUPGHOME" \
    --with-keygrip \
    --with-colons \
    --list-secret-keys "$FINGERPRINT" 2>/dev/null \
    | awk -F: '
        $1 == "ssb" && $12 ~ /s/ { want = 1; next }
        want && $1 == "grp" { print $10; exit }
    ')"

# Fallback: if no signing subkey was found, try the primary key (some users
# sign with the primary instead of a subkey).
if [ -z "$KEYGRIP" ]; then
    KEYGRIP="$(GNUPGHOME="$GNUPGHOME" gpg \
        --homedir "$GNUPGHOME" \
        --with-keygrip \
        --with-colons \
        --list-secret-keys "$FINGERPRINT" 2>/dev/null \
        | awk -F: '
            $1 == "sec" && $12 ~ /s/ { want = 1; next }
            want && $1 == "grp" { print $10; exit }
        ')"
fi

if [ -z "$KEYGRIP" ]; then
    echo "Error: could not derive a signing keygrip for $FINGERPRINT" >&2
    echo "       Verify the key exists in GNUPGHOME=$GNUPGHOME:" >&2
    echo "         GNUPGHOME=$GNUPGHOME gpg --list-secret-keys $FINGERPRINT" >&2
    exit 1
fi

# ---------------------------------------------------------------------------
# Ensure agent is running, then preset
# ---------------------------------------------------------------------------

GNUPGHOME="$GNUPGHOME" gpg-connect-agent --homedir "$GNUPGHOME" /bye >/dev/null 2>&1 || true

echo -n "Signing-key passphrase (will not echo): "
read -rs PASSPHRASE
echo

echo -n "$PASSPHRASE" | GNUPGHOME="$GNUPGHOME" \
    "$GPG_PRESET" --homedir "$GNUPGHOME" --preset "$KEYGRIP"

echo "Passphrase cached for keygrip $KEYGRIP."
echo "Verify with:"
echo "  GNUPGHOME=$GNUPGHOME gpg --homedir $GNUPGHOME --batch --detach-sign --armor --output /dev/null /dev/null"
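The keygrip derivation above filters gpg's machine-readable colon output. A minimal sketch of that awk filter against a hypothetical `--with-colons` excerpt (the key IDs and keygrips below are invented; in real output field 12 carries the capability letters and each `grp` record holds the preceding key's keygrip in field 10):

```shell
# Hypothetical --with-colons excerpt: a primary key (sec), an
# encryption-only subkey (ssb, capability "e"), and a signing subkey
# (ssb, capability "s"), each followed by its grp record.
COLONS='sec:u:4096:1:AAAAAAAAAAAAAAA1:::::::cESCA:::::::
grp:::::::::1111111111111111111111111111111111111111:
ssb:u:4096:1:AAAAAAAAAAAAAAA2:::::::e:::::::
grp:::::::::2222222222222222222222222222222222222222:
ssb:u:4096:1:AAAAAAAAAAAAAAA3:::::::s:::::::
grp:::::::::3333333333333333333333333333333333333333:'

# Same filter as the script: remember that the last ssb was
# signing-capable, then print the keygrip from the grp record after it.
KEYGRIP="$(printf '%s\n' "$COLONS" | awk -F: '
  $1 == "ssb" && $12 ~ /s/ { want = 1; next }
  want && $1 == "grp"      { print $10; exit }
')"
echo "$KEYGRIP"
```

The encryption subkey's `grp` record is skipped because `want` is only set after a signing-capable `ssb`, so the filter prints the third keygrip.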
@@ -9,9 +9,37 @@

set -euo pipefail

REPO_ROOT="$(cd "$(dirname "$0")/.." && pwd)"
SITE_YAML="$REPO_ROOT/site.yaml"

GNUPGHOME="${GNUPGHOME:-$HOME/.gnupg-signing}"
SITE_DIR="${1:-_site}"
SIGNING_KEY="C9A42A6FAD444FBE566FD738531BDC1CC2707066"

# Read the master signing-key fingerprint from site.yaml's `gpg-fingerprint`
# field. Single source of truth — same value the templates render in the
# footer's "sig" tooltip. Empty fingerprint means signing isn't configured;
# bail with a helpful message rather than calling gpg with an empty key.
if [ ! -f "$SITE_YAML" ]; then
    echo "Error: $SITE_YAML not found." >&2
    exit 1
fi

SIGNING_KEY="$(awk -F: '
    /^[[:space:]]*gpg-fingerprint[[:space:]]*:/ {
        sub(/^[^:]*:[[:space:]]*/, "")
        gsub(/^["'"'"' ]+|["'"'"' ]+$/, "")
        print
        exit
    }
' "$SITE_YAML")"

if [ -z "$SIGNING_KEY" ]; then
    echo "Error: gpg-fingerprint is empty in $SITE_YAML." >&2
    echo "       Set it to your master signing-key fingerprint, then retry 'make sign'." >&2
    echo "       (Leave it empty to skip signing entirely — and remove 'sign' from your" >&2
    echo "       deploy pipeline accordingly.)" >&2
    exit 1
fi

if [ ! -d "$SITE_DIR" ]; then
    echo "Error: site directory '$SITE_DIR' not found. Run 'make build' first." >&2
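Both signing scripts share the same minimal `gpg-fingerprint` parser. A quick sketch of its behavior against a hypothetical site.yaml fragment (the file name and surrounding keys are invented for illustration; the awk program is the one from the diff):

```shell
# Hypothetical config fragment; only the gpg-fingerprint line matters.
cat > /tmp/demo-site.yaml <<'EOF'
title: "Example Site"
gpg-fingerprint: "C9A42A6FAD444FBE566FD738531BDC1CC2707066"
theme: default
EOF

# Same awk program as sign-site.sh: match the key, strip everything up
# to and including the colon, then trim surrounding quotes and spaces.
FPR="$(awk -F: '
  /^[[:space:]]*gpg-fingerprint[[:space:]]*:/ {
    sub(/^[^:]*:[[:space:]]*/, "")
    gsub(/^["'"'"' ]+|["'"'"' ]+$/, "")
    print
    exit
  }
' /tmp/demo-site.yaml)"
echo "$FPR"
```

Note the `'"'"'` dance: it closes the single-quoted awk program, emits a literal apostrophe, and reopens it, so the character class awk sees is `["' ]`.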
@@ -1,16 +1,95 @@
#!/usr/bin/env bash
# Subsets and converts TTF system fonts to WOFF2 for web use.
# subset-fonts.sh — Subset and convert TTF system fonts to WOFF2 for web use.
#
# Requires: fonttools (pyftsubset) — `pip install fonttools brotli`
# Output:   static/fonts/
#
# ─── Font directories ────────────────────────────────────────────────────────
# Override any of these via the matching env var when your distro lays the
# fonts out differently. The script tries each candidate path in order and
# fails with a clear message if none resolve.
#
#   SPECTRAL_DIR  Spectral-{Regular,Italic,SemiBold,SemiBoldItalic,Bold,BoldItalic}.ttf
#   FIRA_DIR      FiraSans-{Regular,SemiBold}.ttf
#   JBM_DIR       JetBrainsMono-{Regular,Italic}.ttf
#
# Common locations:
#   Arch:         /usr/share/fonts/ttf-spectral, /usr/share/fonts/TTF
#   Debian:       /usr/share/fonts/truetype/spectral, /usr/share/fonts/truetype/firasans,
#                 /usr/share/fonts/truetype/jetbrains-mono
#   macOS (brew): /opt/homebrew/share/fonts, ~/Library/Fonts
#   Manual:       wherever you unzipped the upstream releases

set -euo pipefail

OUTDIR="$(dirname "$0")/../static/fonts"
mkdir -p "$OUTDIR"

SPECTRAL="/usr/share/fonts/ttf-spectral"
FIRA="/usr/share/fonts/TTF"
JBM="/usr/share/fonts/TTF"
# ---------------------------------------------------------------------------
# Locate font directories (env var override > candidate list > error)
# ---------------------------------------------------------------------------

# Resolve a font dir by checking, in order:
#   1. The override env var (passed by name), if non-empty
#   2. Each candidate path
# Sets the global RESOLVED_DIR on success; exits with a descriptive message
# on failure. Sets the global rather than echoing because `exit 1` inside a
# `$(...)` substitution only exits the subshell, not the script.
resolve_font_dir() {
    local label="$1"    # human-readable name, e.g. "Spectral"
    local env_var="$2"  # env var name, e.g. "SPECTRAL_DIR"
    shift 2
    local candidates=("$@")

    # Indirection: ${!env_var} reads the env var named in $env_var.
    if [ -n "${!env_var:-}" ]; then
        if [ ! -d "${!env_var}" ]; then
            echo "Error: $env_var=${!env_var} is not a directory." >&2
            exit 1
        fi
        RESOLVED_DIR="${!env_var}"
        return
    fi

    for dir in "${candidates[@]}"; do
        if [ -d "$dir" ]; then
            RESOLVED_DIR="$dir"
            return
        fi
    done

    echo "Error: $label fonts not found in any of:" >&2
    printf '  %s\n' "${candidates[@]}" >&2
    echo "Set $env_var to point at the directory containing the .ttf files." >&2
    exit 1
}

resolve_font_dir "Spectral" "SPECTRAL_DIR" \
    /usr/share/fonts/ttf-spectral \
    /usr/share/fonts/truetype/spectral \
    /opt/homebrew/share/fonts \
    "$HOME/Library/Fonts"
SPECTRAL="$RESOLVED_DIR"

resolve_font_dir "Fira Sans" "FIRA_DIR" \
    /usr/share/fonts/TTF \
    /usr/share/fonts/truetype/firasans \
    /usr/share/fonts/truetype/fira-sans \
    /opt/homebrew/share/fonts \
    "$HOME/Library/Fonts"
FIRA="$RESOLVED_DIR"

resolve_font_dir "JetBrains Mono" "JBM_DIR" \
    /usr/share/fonts/TTF \
    /usr/share/fonts/truetype/jetbrains-mono \
    /usr/share/fonts/truetype/jetbrainsmono \
    /opt/homebrew/share/fonts \
    "$HOME/Library/Fonts"
JBM="$RESOLVED_DIR"

# ---------------------------------------------------------------------------
# Subset configuration
# ---------------------------------------------------------------------------

# Unicode range: Latin, Latin Extended-A, General Punctuation, common symbols
UNICODES="U+0000-00FF,U+0131,U+0152-0153,U+02BB-02BC,U+02C6,U+02DA,U+02DC,U+2000-206F,U+2074,U+20AC,U+2122,U+2191,U+2193,U+2212,U+2215,U+FEFF,U+FFFD"
@@ -19,8 +98,18 @@ SPECTRAL_FEATURES="liga,dlig,smcp,c2sc,onum,lnum,pnum,tnum,frac,ordn,sups,subs,s
FIRA_FEATURES="smcp,liga,kern"
JBM_FEATURES="liga,kern,calt"

# ---------------------------------------------------------------------------
# Subset helper — fails fast if a source file is missing
# ---------------------------------------------------------------------------

subset() {
    local src="$1" dest="$2" features="$3"
    if [ ! -f "$src" ]; then
        echo "Error: source font not found: $src" >&2
        echo "       Adjust the directory env var (SPECTRAL_DIR / FIRA_DIR / JBM_DIR)" >&2
        echo "       or install the corresponding font package." >&2
        exit 1
    fi
    echo "  → $(basename "$dest")"
    pyftsubset "$src" \
        --output-file="$dest" \

@@ -31,7 +120,11 @@ subset() {
        --desubroutinize
}

echo "Spectral..."
# ---------------------------------------------------------------------------
# Run
# ---------------------------------------------------------------------------

echo "Spectral... ($SPECTRAL)"
subset "$SPECTRAL/Spectral-Regular.ttf" "$OUTDIR/spectral-regular.woff2" "$SPECTRAL_FEATURES"
subset "$SPECTRAL/Spectral-Italic.ttf" "$OUTDIR/spectral-italic.woff2" "$SPECTRAL_FEATURES"
subset "$SPECTRAL/Spectral-SemiBold.ttf" "$OUTDIR/spectral-semibold.woff2" "$SPECTRAL_FEATURES"

@@ -39,11 +132,11 @@ subset "$SPECTRAL/Spectral-SemiBoldItalic.ttf" "$OUTDIR/spectral-semibold-itali
subset "$SPECTRAL/Spectral-Bold.ttf" "$OUTDIR/spectral-bold.woff2" "$SPECTRAL_FEATURES"
subset "$SPECTRAL/Spectral-BoldItalic.ttf" "$OUTDIR/spectral-bold-italic.woff2" "$SPECTRAL_FEATURES"

echo "Fira Sans..."
echo "Fira Sans... ($FIRA)"
subset "$FIRA/FiraSans-Regular.ttf" "$OUTDIR/fira-sans-regular.woff2" "$FIRA_FEATURES"
subset "$FIRA/FiraSans-SemiBold.ttf" "$OUTDIR/fira-sans-semibold.woff2" "$FIRA_FEATURES"

echo "JetBrains Mono..."
echo "JetBrains Mono... ($JBM)"
subset "$JBM/JetBrainsMono-Regular.ttf" "$OUTDIR/jetbrains-mono-regular.woff2" "$JBM_FEATURES"
subset "$JBM/JetBrainsMono-Italic.ttf" "$OUTDIR/jetbrains-mono-italic.woff2" "$JBM_FEATURES"
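The `resolve_font_dir` helper above leans on bash variable indirection. A stripped-down sketch of the same override-then-candidates lookup (the function name, `DEMO_FONT_DIR`, and the candidate paths here are illustrative, not part of the script):

```shell
# Resolve a directory: an env override named in $1 wins; otherwise the
# first existing candidate path is used. Mirrors resolve_font_dir's shape.
resolve_dir() {
  local env_var="$1"; shift
  if [ -n "${!env_var:-}" ]; then  # ${!env_var}: read the variable whose NAME is in env_var
    RESOLVED_DIR="${!env_var}"
    return 0
  fi
  for dir in "$@"; do
    if [ -d "$dir" ]; then
      RESOLVED_DIR="$dir"
      return 0
    fi
  done
  return 1
}

# Candidates only: the first existing directory wins (/tmp exists, the
# first candidate does not).
resolve_dir "DEMO_FONT_DIR" /nonexistent/fonts /tmp
FIRST="$RESOLVED_DIR"

# The override wins even when a candidate would also have matched.
DEMO_FONT_DIR="/usr" resolve_dir "DEMO_FONT_DIR" /tmp
```

Setting the global instead of `echo`-ing matters for the real script: `exit 1` inside a `$(...)` substitution would only kill the subshell, which is exactly the pitfall the diff's comment calls out.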
265  uv.lock

@@ -135,6 +135,18 @@ wheels = [
    { url = "https://files.pythonhosted.org/packages/d1/d6/3965ed04c63042e047cb6a3e6ed1a63a35087b6a609aa3a15ed8ac56c221/colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6", size = 25335, upload-time = "2022-10-25T02:36:20.889Z" },
]

[[package]]
name = "colorthief"
version = "0.2.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "pillow" },
]
sdist = { url = "https://files.pythonhosted.org/packages/b6/b2/b55b741f7a7d1299d23e1c635f00f6c57ea4d2e9b76d09e1fc5ea3ca9921/colorthief-0.2.1.tar.gz", hash = "sha256:079cb0c95bdd669c4643e2f7494de13b0b6029d5cdbe2d74d5d3c3386bd57221", size = 6164, upload-time = "2017-02-09T07:13:15.7Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/56/18/be03b7058e65f9df479b14e7af4e73945ce311e07aaad45cf2536e14791a/colorthief-0.2.1-py2.py3-none-any.whl", hash = "sha256:b04fc8ce5cf9c888768745e29cb19b7b688d5711af6fba26e8057debabec56b9", size = 6134, upload-time = "2017-02-09T07:13:19.041Z" },
]

[[package]]
name = "contourpy"
version = "1.3.3"
@@ -486,32 +498,6 @@ wheels = [
    { url = "https://files.pythonhosted.org/packages/b5/91/53255615acd2a1eaca307ede3c90eb550bae9c94581f8c00081b6b1c8f44/kiwisolver-1.5.0-graalpy312-graalpy250_312_native-win_amd64.whl", hash = "sha256:1f1489f769582498610e015a8ef2d36f28f505ab3096d0e16b4858a9ec214f57", size = 75987, upload-time = "2026-03-09T13:15:39.65Z" },
]

[[package]]
name = "levineuwirth-tools"
version = "0.1.0"
source = { virtual = "." }
dependencies = [
    { name = "altair" },
    { name = "beautifulsoup4" },
    { name = "faiss-cpu" },
    { name = "matplotlib" },
    { name = "numpy" },
    { name = "sentence-transformers" },
    { name = "torch", version = "2.10.0", source = { registry = "https://download.pytorch.org/whl/cpu" }, marker = "sys_platform == 'darwin'" },
    { name = "torch", version = "2.10.0+cpu", source = { registry = "https://download.pytorch.org/whl/cpu" }, marker = "sys_platform != 'darwin'" },
]

[package.metadata]
requires-dist = [
    { name = "altair", specifier = ">=5.4,<6" },
    { name = "beautifulsoup4", specifier = ">=4.12,<5" },
    { name = "faiss-cpu", specifier = ">=1.9,<2" },
    { name = "matplotlib", specifier = ">=3.9,<4" },
    { name = "numpy", specifier = ">=2.0,<3" },
    { name = "sentence-transformers", specifier = ">=3.4,<4" },
    { name = "torch", specifier = ">=2.5,<3", index = "https://download.pytorch.org/whl/cpu" },
]

[[package]]
name = "markupsafe"
version = "3.0.3"
@@ -717,6 +703,38 @@ wheels = [
    { url = "https://files.pythonhosted.org/packages/1f/b6/7c0d4334c15983cec7f92a69e8ce9b1e6f31857e5ee3a413ac424e6bd63d/numpy-2.4.3-cp314-cp314t-win_arm64.whl", hash = "sha256:4d382735cecd7bcf090172489a525cd7d4087bc331f7df9f60ddc9a296cf208e", size = 10565454, upload-time = "2026-03-09T07:58:33.031Z" },
]

[[package]]
name = "ozymandias-tools"
version = "0.1.0"
source = { virtual = "." }
dependencies = [
    { name = "altair" },
    { name = "beautifulsoup4" },
    { name = "colorthief" },
    { name = "faiss-cpu" },
    { name = "matplotlib" },
    { name = "numpy" },
    { name = "pillow" },
    { name = "pyyaml" },
    { name = "sentence-transformers" },
    { name = "torch", version = "2.10.0", source = { registry = "https://download.pytorch.org/whl/cpu" }, marker = "sys_platform == 'darwin'" },
    { name = "torch", version = "2.10.0+cpu", source = { registry = "https://download.pytorch.org/whl/cpu" }, marker = "sys_platform != 'darwin'" },
]

[package.metadata]
requires-dist = [
    { name = "altair", specifier = ">=5.4,<6" },
    { name = "beautifulsoup4", specifier = ">=4.12,<5" },
    { name = "colorthief", specifier = ">=0.2,<1" },
    { name = "faiss-cpu", specifier = ">=1.9,<2" },
    { name = "matplotlib", specifier = ">=3.9,<4" },
    { name = "numpy", specifier = ">=2.0,<3" },
    { name = "pillow", specifier = ">=10.0,<12" },
    { name = "pyyaml", specifier = ">=6.0,<7" },
    { name = "sentence-transformers", specifier = ">=3.4,<4" },
    { name = "torch", specifier = ">=2.5,<3", index = "https://download.pytorch.org/whl/cpu" },
]

[[package]]
name = "packaging"
version = "26.0"
@ -728,71 +746,68 @@ wheels = [
|
|||
|
||||
[[package]]
|
||||
name = "pillow"
|
||||
version = "12.1.1"
|
||||
version = "11.3.0"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/1f/42/5c74462b4fd957fcd7b13b04fb3205ff8349236ea74c7c375766d6c82288/pillow-12.1.1.tar.gz", hash = "sha256:9ad8fa5937ab05218e2b6a4cff30295ad35afd2f83ac592e68c0d871bb0fdbc4", size = 46980264, upload-time = "2026-02-11T04:23:07.146Z" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/f3/0d/d0d6dea55cd152ce3d6767bb38a8fc10e33796ba4ba210cbab9354b6d238/pillow-11.3.0.tar.gz", hash = "sha256:3828ee7586cd0b2091b6209e5ad53e20d0649bbe87164a459d0676e035e8f523", size = 47113069, upload-time = "2025-07-01T09:16:30.666Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/07/d3/8df65da0d4df36b094351dce696f2989bec731d4f10e743b1c5f4da4d3bf/pillow-12.1.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:ab323b787d6e18b3d91a72fc99b1a2c28651e4358749842b8f8dfacd28ef2052", size = 5262803, upload-time = "2026-02-11T04:20:47.653Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/d6/71/5026395b290ff404b836e636f51d7297e6c83beceaa87c592718747e670f/pillow-12.1.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:adebb5bee0f0af4909c30db0d890c773d1a92ffe83da908e2e9e720f8edf3984", size = 4657601, upload-time = "2026-02-11T04:20:49.328Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/b1/2e/1001613d941c67442f745aff0f7cc66dd8df9a9c084eb497e6a543ee6f7e/pillow-12.1.1-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:bb66b7cc26f50977108790e2456b7921e773f23db5630261102233eb355a3b79", size = 6234995, upload-time = "2026-02-11T04:20:51.032Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/07/26/246ab11455b2549b9233dbd44d358d033a2f780fa9007b61a913c5b2d24e/pillow-12.1.1-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:aee2810642b2898bb187ced9b349e95d2a7272930796e022efaf12e99dccd293", size = 8045012, upload-time = "2026-02-11T04:20:52.882Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/b2/8b/07587069c27be7535ac1fe33874e32de118fbd34e2a73b7f83436a88368c/pillow-12.1.1-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:a0b1cd6232e2b618adcc54d9882e4e662a089d5768cd188f7c245b4c8c44a397", size = 6349638, upload-time = "2026-02-11T04:20:54.444Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/ff/79/6df7b2ee763d619cda2fb4fea498e5f79d984dae304d45a8999b80d6cf5c/pillow-12.1.1-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:7aac39bcf8d4770d089588a2e1dd111cbaa42df5a94be3114222057d68336bd0", size = 7041540, upload-time = "2026-02-11T04:20:55.97Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/2c/5e/2ba19e7e7236d7529f4d873bdaf317a318896bac289abebd4bb00ef247f0/pillow-12.1.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:ab174cd7d29a62dd139c44bf74b698039328f45cb03b4596c43473a46656b2f3", size = 6462613, upload-time = "2026-02-11T04:20:57.542Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/03/03/31216ec124bb5c3dacd74ce8efff4cc7f52643653bad4825f8f08c697743/pillow-12.1.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:339ffdcb7cbeaa08221cd401d517d4b1fe7a9ed5d400e4a8039719238620ca35", size = 7166745, upload-time = "2026-02-11T04:20:59.196Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/1f/e7/7c4552d80052337eb28653b617eafdef39adfb137c49dd7e831b8dc13bc5/pillow-12.1.1-cp312-cp312-win32.whl", hash = "sha256:5d1f9575a12bed9e9eedd9a4972834b08c97a352bd17955ccdebfeca5913fa0a", size = 6328823, upload-time = "2026-02-11T04:21:01.385Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/3d/17/688626d192d7261bbbf98846fc98995726bddc2c945344b65bec3a29d731/pillow-12.1.1-cp312-cp312-win_amd64.whl", hash = "sha256:21329ec8c96c6e979cd0dfd29406c40c1d52521a90544463057d2aaa937d66a6", size = 7033367, upload-time = "2026-02-11T04:21:03.536Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/ed/fe/a0ef1f73f939b0eca03ee2c108d0043a87468664770612602c63266a43c4/pillow-12.1.1-cp312-cp312-win_arm64.whl", hash = "sha256:af9a332e572978f0218686636610555ae3defd1633597be015ed50289a03c523", size = 2453811, upload-time = "2026-02-11T04:21:05.116Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/d5/11/6db24d4bd7685583caeae54b7009584e38da3c3d4488ed4cd25b439de486/pillow-12.1.1-cp313-cp313-ios_13_0_arm64_iphoneos.whl", hash = "sha256:d242e8ac078781f1de88bf823d70c1a9b3c7950a44cdf4b7c012e22ccbcd8e4e", size = 4062689, upload-time = "2026-02-11T04:21:06.804Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/33/c0/ce6d3b1fe190f0021203e0d9b5b99e57843e345f15f9ef22fcd43842fd21/pillow-12.1.1-cp313-cp313-ios_13_0_arm64_iphonesimulator.whl", hash = "sha256:02f84dfad02693676692746df05b89cf25597560db2857363a208e393429f5e9", size = 4138535, upload-time = "2026-02-11T04:21:08.452Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/a0/c6/d5eb6a4fb32a3f9c21a8c7613ec706534ea1cf9f4b3663e99f0d83f6fca8/pillow-12.1.1-cp313-cp313-ios_13_0_x86_64_iphonesimulator.whl", hash = "sha256:e65498daf4b583091ccbb2556c7000abf0f3349fcd57ef7adc9a84a394ed29f6", size = 3601364, upload-time = "2026-02-11T04:21:10.194Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/14/a1/16c4b823838ba4c9c52c0e6bbda903a3fe5a1bdbf1b8eb4fff7156f3e318/pillow-12.1.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:6c6db3b84c87d48d0088943bf33440e0c42370b99b1c2a7989216f7b42eede60", size = 5262561, upload-time = "2026-02-11T04:21:11.742Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/bb/ad/ad9dc98ff24f485008aa5cdedaf1a219876f6f6c42a4626c08bc4e80b120/pillow-12.1.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:8b7e5304e34942bf62e15184219a7b5ad4ff7f3bb5cca4d984f37df1a0e1aee2", size = 4657460, upload-time = "2026-02-11T04:21:13.786Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/9e/1b/f1a4ea9a895b5732152789326202a82464d5254759fbacae4deea3069334/pillow-12.1.1-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:18e5bddd742a44b7e6b1e773ab5db102bd7a94c32555ba656e76d319d19c3850", size = 6232698, upload-time = "2026-02-11T04:21:15.949Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/95/f4/86f51b8745070daf21fd2e5b1fe0eb35d4db9ca26e6d58366562fb56a743/pillow-12.1.1-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:fc44ef1f3de4f45b50ccf9136999d71abb99dca7706bc75d222ed350b9fd2289", size = 8041706, upload-time = "2026-02-11T04:21:17.723Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/29/9b/d6ecd956bb1266dd1045e995cce9b8d77759e740953a1c9aad9502a0461e/pillow-12.1.1-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5a8eb7ed8d4198bccbd07058416eeec51686b498e784eda166395a23eb99138e", size = 6346621, upload-time = "2026-02-11T04:21:19.547Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/71/24/538bff45bde96535d7d998c6fed1a751c75ac7c53c37c90dc2601b243893/pillow-12.1.1-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:47b94983da0c642de92ced1702c5b6c292a84bd3a8e1d1702ff923f183594717", size = 7038069, upload-time = "2026-02-11T04:21:21.378Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/94/0e/58cb1a6bc48f746bc4cb3adb8cabff73e2742c92b3bf7a220b7cf69b9177/pillow-12.1.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:518a48c2aab7ce596d3bf79d0e275661b846e86e4d0e7dec34712c30fe07f02a", size = 6460040, upload-time = "2026-02-11T04:21:23.148Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/6c/57/9045cb3ff11eeb6c1adce3b2d60d7d299d7b273a2e6c8381a524abfdc474/pillow-12.1.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:a550ae29b95c6dc13cf69e2c9dc5747f814c54eeb2e32d683e5e93af56caa029", size = 7164523, upload-time = "2026-02-11T04:21:25.01Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/73/f2/9be9cb99f2175f0d4dbadd6616ce1bf068ee54a28277ea1bf1fbf729c250/pillow-12.1.1-cp313-cp313-win32.whl", hash = "sha256:a003d7422449f6d1e3a34e3dd4110c22148336918ddbfc6a32581cd54b2e0b2b", size = 6332552, upload-time = "2026-02-11T04:21:27.238Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/3f/eb/b0834ad8b583d7d9d42b80becff092082a1c3c156bb582590fcc973f1c7c/pillow-12.1.1-cp313-cp313-win_amd64.whl", hash = "sha256:344cf1e3dab3be4b1fa08e449323d98a2a3f819ad20f4b22e77a0ede31f0faa1", size = 7040108, upload-time = "2026-02-11T04:21:29.462Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/d5/7d/fc09634e2aabdd0feabaff4a32f4a7d97789223e7c2042fd805ea4b4d2c2/pillow-12.1.1-cp313-cp313-win_arm64.whl", hash = "sha256:5c0dd1636633e7e6a0afe7bf6a51a14992b7f8e60de5789018ebbdfae55b040a", size = 2453712, upload-time = "2026-02-11T04:21:31.072Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/19/2a/b9d62794fc8a0dd14c1943df68347badbd5511103e0d04c035ffe5cf2255/pillow-12.1.1-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:0330d233c1a0ead844fc097a7d16c0abff4c12e856c0b325f231820fee1f39da", size = 5264880, upload-time = "2026-02-11T04:21:32.865Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/26/9d/e03d857d1347fa5ed9247e123fcd2a97b6220e15e9cb73ca0a8d91702c6e/pillow-12.1.1-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:5dae5f21afb91322f2ff791895ddd8889e5e947ff59f71b46041c8ce6db790bc", size = 4660616, upload-time = "2026-02-11T04:21:34.97Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/f7/ec/8a6d22afd02570d30954e043f09c32772bfe143ba9285e2fdb11284952cd/pillow-12.1.1-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:2e0c664be47252947d870ac0d327fea7e63985a08794758aa8af5b6cb6ec0c9c", size = 6269008, upload-time = "2026-02-11T04:21:36.623Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/3d/1d/6d875422c9f28a4a361f495a5f68d9de4a66941dc2c619103ca335fa6446/pillow-12.1.1-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:691ab2ac363b8217f7d31b3497108fb1f50faab2f75dfb03284ec2f217e87bf8", size = 8073226, upload-time = "2026-02-11T04:21:38.585Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/a1/cd/134b0b6ee5eda6dc09e25e24b40fdafe11a520bc725c1d0bbaa5e00bf95b/pillow-12.1.1-cp313-cp313t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:e9e8064fb1cc019296958595f6db671fba95209e3ceb0c4734c9baf97de04b20", size = 6380136, upload-time = "2026-02-11T04:21:40.562Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/7a/a9/7628f013f18f001c1b98d8fffe3452f306a70dc6aba7d931019e0492f45e/pillow-12.1.1-cp313-cp313t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:472a8d7ded663e6162dafdf20015c486a7009483ca671cece7a9279b512fcb13", size = 7067129, upload-time = "2026-02-11T04:21:42.521Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/1e/f8/66ab30a2193b277785601e82ee2d49f68ea575d9637e5e234faaa98efa4c/pillow-12.1.1-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:89b54027a766529136a06cfebeecb3a04900397a3590fd252160b888479517bf", size = 6491807, upload-time = "2026-02-11T04:21:44.22Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/da/0b/a877a6627dc8318fdb84e357c5e1a758c0941ab1ddffdafd231983788579/pillow-12.1.1-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:86172b0831b82ce4f7877f280055892b31179e1576aa00d0df3bb1bbf8c3e524", size = 7190954, upload-time = "2026-02-11T04:21:46.114Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/83/43/6f732ff85743cf746b1361b91665d9f5155e1483817f693f8d57ea93147f/pillow-12.1.1-cp313-cp313t-win32.whl", hash = "sha256:44ce27545b6efcf0fdbdceb31c9a5bdea9333e664cda58a7e674bb74608b3986", size = 6336441, upload-time = "2026-02-11T04:21:48.22Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/3b/44/e865ef3986611bb75bfabdf94a590016ea327833f434558801122979cd0e/pillow-12.1.1-cp313-cp313t-win_amd64.whl", hash = "sha256:a285e3eb7a5a45a2ff504e31f4a8d1b12ef62e84e5411c6804a42197c1cf586c", size = 7045383, upload-time = "2026-02-11T04:21:50.015Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/a8/c6/f4fb24268d0c6908b9f04143697ea18b0379490cb74ba9e8d41b898bd005/pillow-12.1.1-cp313-cp313t-win_arm64.whl", hash = "sha256:cc7d296b5ea4d29e6570dabeaed58d31c3fea35a633a69679fb03d7664f43fb3", size = 2456104, upload-time = "2026-02-11T04:21:51.633Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/03/d0/bebb3ffbf31c5a8e97241476c4cf8b9828954693ce6744b4a2326af3e16b/pillow-12.1.1-cp314-cp314-ios_13_0_arm64_iphoneos.whl", hash = "sha256:417423db963cb4be8bac3fc1204fe61610f6abeed1580a7a2cbb2fbda20f12af", size = 4062652, upload-time = "2026-02-11T04:21:53.19Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/2d/c0/0e16fb0addda4851445c28f8350d8c512f09de27bbb0d6d0bbf8b6709605/pillow-12.1.1-cp314-cp314-ios_13_0_arm64_iphonesimulator.whl", hash = "sha256:b957b71c6b2387610f556a7eb0828afbe40b4a98036fc0d2acfa5a44a0c2036f", size = 4138823, upload-time = "2026-02-11T04:22:03.088Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/6b/fb/6170ec655d6f6bb6630a013dd7cf7bc218423d7b5fa9071bf63dc32175ae/pillow-12.1.1-cp314-cp314-ios_13_0_x86_64_iphonesimulator.whl", hash = "sha256:097690ba1f2efdeb165a20469d59d8bb03c55fb6621eb2041a060ae8ea3e9642", size = 3601143, upload-time = "2026-02-11T04:22:04.909Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/59/04/dc5c3f297510ba9a6837cbb318b87dd2b8f73eb41a43cc63767f65cb599c/pillow-12.1.1-cp314-cp314-macosx_10_15_x86_64.whl", hash = "sha256:2815a87ab27848db0321fb78c7f0b2c8649dee134b7f2b80c6a45c6831d75ccd", size = 5266254, upload-time = "2026-02-11T04:22:07.656Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/05/30/5db1236b0d6313f03ebf97f5e17cda9ca060f524b2fcc875149a8360b21c/pillow-12.1.1-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:f7ed2c6543bad5a7d5530eb9e78c53132f93dfa44a28492db88b41cdab885202", size = 4657499, upload-time = "2026-02-11T04:22:09.613Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/6f/18/008d2ca0eb612e81968e8be0bbae5051efba24d52debf930126d7eaacbba/pillow-12.1.1-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:652a2c9ccfb556235b2b501a3a7cf3742148cd22e04b5625c5fe057ea3e3191f", size = 6232137, upload-time = "2026-02-11T04:22:11.434Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/70/f1/f14d5b8eeb4b2cd62b9f9f847eb6605f103df89ef619ac68f92f748614ea/pillow-12.1.1-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:d6e4571eedf43af33d0fc233a382a76e849badbccdf1ac438841308652a08e1f", size = 8042721, upload-time = "2026-02-11T04:22:13.321Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/5a/d6/17824509146e4babbdabf04d8171491fa9d776f7061ff6e727522df9bd03/pillow-12.1.1-cp314-cp314-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b574c51cf7d5d62e9be37ba446224b59a2da26dc4c1bb2ecbe936a4fb1a7cb7f", size = 6347798, upload-time = "2026-02-11T04:22:15.449Z" },
{ url = "https://files.pythonhosted.org/packages/d1/ee/c85a38a9ab92037a75615aba572c85ea51e605265036e00c5b67dfafbfe2/pillow-12.1.1-cp314-cp314-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:a37691702ed687799de29a518d63d4682d9016932db66d4e90c345831b02fb4e", size = 7039315, upload-time = "2026-02-11T04:22:17.24Z" },
{ url = "https://files.pythonhosted.org/packages/ec/f3/bc8ccc6e08a148290d7523bde4d9a0d6c981db34631390dc6e6ec34cacf6/pillow-12.1.1-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:f95c00d5d6700b2b890479664a06e754974848afaae5e21beb4d83c106923fd0", size = 6462360, upload-time = "2026-02-11T04:22:19.111Z" },
{ url = "https://files.pythonhosted.org/packages/f6/ab/69a42656adb1d0665ab051eec58a41f169ad295cf81ad45406963105408f/pillow-12.1.1-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:559b38da23606e68681337ad74622c4dbba02254fc9cb4488a305dd5975c7eeb", size = 7165438, upload-time = "2026-02-11T04:22:21.041Z" },
{ url = "https://files.pythonhosted.org/packages/02/46/81f7aa8941873f0f01d4b55cc543b0a3d03ec2ee30d617a0448bf6bd6dec/pillow-12.1.1-cp314-cp314-win32.whl", hash = "sha256:03edcc34d688572014ff223c125a3f77fb08091e4607e7745002fc214070b35f", size = 6431503, upload-time = "2026-02-11T04:22:22.833Z" },
{ url = "https://files.pythonhosted.org/packages/40/72/4c245f7d1044b67affc7f134a09ea619d4895333d35322b775b928180044/pillow-12.1.1-cp314-cp314-win_amd64.whl", hash = "sha256:50480dcd74fa63b8e78235957d302d98d98d82ccbfac4c7e12108ba9ecbdba15", size = 7176748, upload-time = "2026-02-11T04:22:24.64Z" },
{ url = "https://files.pythonhosted.org/packages/e4/ad/8a87bdbe038c5c698736e3348af5c2194ffb872ea52f11894c95f9305435/pillow-12.1.1-cp314-cp314-win_arm64.whl", hash = "sha256:5cb1785d97b0c3d1d1a16bc1d710c4a0049daefc4935f3a8f31f827f4d3d2e7f", size = 2544314, upload-time = "2026-02-11T04:22:26.685Z" },
{ url = "https://files.pythonhosted.org/packages/6c/9d/efd18493f9de13b87ede7c47e69184b9e859e4427225ea962e32e56a49bc/pillow-12.1.1-cp314-cp314t-macosx_10_15_x86_64.whl", hash = "sha256:1f90cff8aa76835cba5769f0b3121a22bd4eb9e6884cfe338216e557a9a548b8", size = 5268612, upload-time = "2026-02-11T04:22:29.884Z" },
{ url = "https://files.pythonhosted.org/packages/f8/f1/4f42eb2b388eb2ffc660dcb7f7b556c1015c53ebd5f7f754965ef997585b/pillow-12.1.1-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:1f1be78ce9466a7ee64bfda57bdba0f7cc499d9794d518b854816c41bf0aa4e9", size = 4660567, upload-time = "2026-02-11T04:22:31.799Z" },
{ url = "https://files.pythonhosted.org/packages/01/54/df6ef130fa43e4b82e32624a7b821a2be1c5653a5fdad8469687a7db4e00/pillow-12.1.1-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:42fc1f4677106188ad9a55562bbade416f8b55456f522430fadab3cef7cd4e60", size = 6269951, upload-time = "2026-02-11T04:22:33.921Z" },
{ url = "https://files.pythonhosted.org/packages/a9/48/618752d06cc44bb4aae8ce0cd4e6426871929ed7b46215638088270d9b34/pillow-12.1.1-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:98edb152429ab62a1818039744d8fbb3ccab98a7c29fc3d5fcef158f3f1f68b7", size = 8074769, upload-time = "2026-02-11T04:22:35.877Z" },
{ url = "https://files.pythonhosted.org/packages/c3/bd/f1d71eb39a72fa088d938655afba3e00b38018d052752f435838961127d8/pillow-12.1.1-cp314-cp314t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:d470ab1178551dd17fdba0fef463359c41aaa613cdcd7ff8373f54be629f9f8f", size = 6381358, upload-time = "2026-02-11T04:22:37.698Z" },
{ url = "https://files.pythonhosted.org/packages/64/ef/c784e20b96674ed36a5af839305f55616f8b4f8aa8eeccf8531a6e312243/pillow-12.1.1-cp314-cp314t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:6408a7b064595afcab0a49393a413732a35788f2a5092fdc6266952ed67de586", size = 7068558, upload-time = "2026-02-11T04:22:39.597Z" },
{ url = "https://files.pythonhosted.org/packages/73/cb/8059688b74422ae61278202c4e1ad992e8a2e7375227be0a21c6b87ca8d5/pillow-12.1.1-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:5d8c41325b382c07799a3682c1c258469ea2ff97103c53717b7893862d0c98ce", size = 6493028, upload-time = "2026-02-11T04:22:42.73Z" },
{ url = "https://files.pythonhosted.org/packages/c6/da/e3c008ed7d2dd1f905b15949325934510b9d1931e5df999bb15972756818/pillow-12.1.1-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:c7697918b5be27424e9ce568193efd13d925c4481dd364e43f5dff72d33e10f8", size = 7191940, upload-time = "2026-02-11T04:22:44.543Z" },
{ url = "https://files.pythonhosted.org/packages/01/4a/9202e8d11714c1fc5951f2e1ef362f2d7fbc595e1f6717971d5dd750e969/pillow-12.1.1-cp314-cp314t-win32.whl", hash = "sha256:d2912fd8114fc5545aa3a4b5576512f64c55a03f3ebcca4c10194d593d43ea36", size = 6438736, upload-time = "2026-02-11T04:22:46.347Z" },
{ url = "https://files.pythonhosted.org/packages/f3/ca/cbce2327eb9885476b3957b2e82eb12c866a8b16ad77392864ad601022ce/pillow-12.1.1-cp314-cp314t-win_amd64.whl", hash = "sha256:4ceb838d4bd9dab43e06c363cab2eebf63846d6a4aeaea283bbdfd8f1a8ed58b", size = 7182894, upload-time = "2026-02-11T04:22:48.114Z" },
{ url = "https://files.pythonhosted.org/packages/ec/d2/de599c95ba0a973b94410477f8bf0b6f0b5e67360eb89bcb1ad365258beb/pillow-12.1.1-cp314-cp314t-win_arm64.whl", hash = "sha256:7b03048319bfc6170e93bd60728a1af51d3dd7704935feb228c4d4faab35d334", size = 2546446, upload-time = "2026-02-11T04:22:50.342Z" },
]
[[package]]
dependencies = [
{ name = "typing-extensions", marker = "sys_platform == 'darwin'" },
]
wheels = [
{ url = "https://download-r2.pytorch.org/whl/cpu/torch-2.10.0-1-cp312-none-macosx_11_0_arm64.whl", hash = "sha256:7fbbf409143a4fe0812a40c0b46a436030a7e1d14fe8c5234dfbe44df47f617e", upload-time = "2026-02-06T16:27:14Z" },
{ url = "https://download-r2.pytorch.org/whl/cpu/torch-2.10.0-1-cp313-none-macosx_11_0_arm64.whl", hash = "sha256:b39cafff7229699f9d6e172cac74d85fd71b568268e439e08d9c540e54732a3e", upload-time = "2026-02-06T16:27:17Z" },
{ url = "https://download-r2.pytorch.org/whl/cpu/torch-2.10.0-2-cp312-none-macosx_11_0_arm64.whl", hash = "sha256:358bd7125cbec6e692d60618a5eec7f55a51b29e3652a849fd42af021d818023", upload-time = "2026-02-10T19:55:42Z" },
{ url = "https://download-r2.pytorch.org/whl/cpu/torch-2.10.0-2-cp313-none-macosx_11_0_arm64.whl", hash = "sha256:470de4176007c2700735e003a830828a88d27129032a3add07291da07e2a94e8", upload-time = "2026-02-10T19:55:43Z" },
{ url = "https://download-r2.pytorch.org/whl/cpu/torch-2.10.0-cp312-none-macosx_11_0_arm64.whl", hash = "sha256:45a1c5057629444aeb1c452c18298fa7f30f2f7aeadd4dc41f9d340980294407", upload-time = "2026-01-23T15:09:55Z" },
{ url = "https://download-r2.pytorch.org/whl/cpu/torch-2.10.0-cp313-cp313t-macosx_14_0_arm64.whl", hash = "sha256:339e05502b6c839db40e88720cb700f5a3b50cda332284873e851772d41b2c1e", upload-time = "2026-01-23T15:09:57Z" },
{ url = "https://download-r2.pytorch.org/whl/cpu/torch-2.10.0-cp313-none-macosx_11_0_arm64.whl", hash = "sha256:840351da59cedb7bcbc51981880050813c19ef6b898a7fecf73a3afc71aff3fe", upload-time = "2026-01-23T15:09:59Z" },
{ url = "https://download-r2.pytorch.org/whl/cpu/torch-2.10.0-cp314-cp314-macosx_14_0_arm64.whl", hash = "sha256:c88b1129fd4e14f0f882963c6728315caae35d2f47374d17edeed1edc7697497", upload-time = "2026-01-23T15:09:59Z" },
{ url = "https://download-r2.pytorch.org/whl/cpu/torch-2.10.0-cp314-cp314t-macosx_14_0_arm64.whl", hash = "sha256:f4bea7dc451267c028593751612ad559299589304e68df54ae7672427893ff2c", upload-time = "2026-01-23T15:10:01Z" },
]
[[package]]
dependencies = [
{ name = "typing-extensions", marker = "sys_platform != 'darwin'" },
]
wheels = [
{ url = "https://download-r2.pytorch.org/whl/cpu/torch-2.10.0%2Bcpu-cp312-cp312-linux_aarch64.whl", hash = "sha256:8de5a36371b775e2d4881ed12cc7f2de400b1ad3d728aa74a281f649f87c9b8c", upload-time = "2026-01-23T15:10:22Z" },
{ url = "https://download-r2.pytorch.org/whl/cpu/torch-2.10.0%2Bcpu-cp312-cp312-linux_s390x.whl", hash = "sha256:9accc30b56cb6756d4a9d04fcb8ebc0bb68c7d55c1ed31a8657397d316d31596", upload-time = "2026-01-23T15:10:24Z" },
{ url = "https://download-r2.pytorch.org/whl/cpu/torch-2.10.0%2Bcpu-cp312-cp312-manylinux_2_28_aarch64.whl", hash = "sha256:179451716487f8cb09b56459667fa1f5c4c0946c1e75fbeae77cfc40a5768d87", upload-time = "2026-01-23T15:10:25Z" },
{ url = "https://download-r2.pytorch.org/whl/cpu/torch-2.10.0%2Bcpu-cp312-cp312-manylinux_2_28_x86_64.whl", hash = "sha256:ee40b8a4b4b2cf0670c6fd4f35a7ef23871af956fecb238fbf5da15a72650b1d", upload-time = "2026-01-23T15:10:27Z" },
{ url = "https://download-r2.pytorch.org/whl/cpu/torch-2.10.0%2Bcpu-cp312-cp312-win_amd64.whl", hash = "sha256:21cb5436978ef47c823b7a813ff0f8c2892e266cfe0f1d944879b5fba81bf4e1", upload-time = "2026-01-23T15:10:30Z" },
{ url = "https://download-r2.pytorch.org/whl/cpu/torch-2.10.0%2Bcpu-cp312-cp312-win_arm64.whl", hash = "sha256:3eaa727e6a73affa61564d86b9d03191df45c8650d0666bd3d57c8597ef61e78", upload-time = "2026-01-23T15:10:31Z" },
{ url = "https://download-r2.pytorch.org/whl/cpu/torch-2.10.0%2Bcpu-cp313-cp313-linux_aarch64.whl", hash = "sha256:fd215f3d0f681905c5b56b0630a3d666900a37fcc3ca5b937f95275c66f9fd9c", upload-time = "2026-01-23T15:10:34Z" },
{ url = "https://download-r2.pytorch.org/whl/cpu/torch-2.10.0%2Bcpu-cp313-cp313-linux_s390x.whl", hash = "sha256:170a0623108055be5199370335cf9b41ba6875b3cb6f086db4aee583331a4899", upload-time = "2026-01-23T15:10:35Z" },
{ url = "https://download-r2.pytorch.org/whl/cpu/torch-2.10.0%2Bcpu-cp313-cp313-manylinux_2_28_aarch64.whl", hash = "sha256:e51994492cdb76edce29da88de3672a3022f9ef0ffd90345436948d4992be2c7", upload-time = "2026-01-23T15:10:37Z" },
{ url = "https://download-r2.pytorch.org/whl/cpu/torch-2.10.0%2Bcpu-cp313-cp313-manylinux_2_28_x86_64.whl", hash = "sha256:8d316e5bf121f1eab1147e49ad0511a9d92e4c45cc357d1ab0bee440da71a095", upload-time = "2026-01-23T15:10:38Z" },
{ url = "https://download-r2.pytorch.org/whl/cpu/torch-2.10.0%2Bcpu-cp313-cp313-win_amd64.whl", hash = "sha256:b719da5af01b59126ac13eefd6ba3dd12d002dc0e8e79b8b365e55267a8189d3", upload-time = "2026-01-23T15:10:41Z" },
{ url = "https://download-r2.pytorch.org/whl/cpu/torch-2.10.0%2Bcpu-cp313-cp313-win_arm64.whl", hash = "sha256:b67d91326e4ed9eccbd6b7d84ed7ffa43f93103aa3f0b24145f3001f3b11b714", upload-time = "2026-01-23T15:10:42Z" },
{ url = "https://download-r2.pytorch.org/whl/cpu/torch-2.10.0%2Bcpu-cp313-cp313t-linux_aarch64.whl", hash = "sha256:5af75e5f49de21b0bdf7672bc27139bd285f9e8dbcabe2d617a2eb656514ac36", upload-time = "2026-01-23T15:10:44Z" },
{ url = "https://download-r2.pytorch.org/whl/cpu/torch-2.10.0%2Bcpu-cp313-cp313t-linux_s390x.whl", hash = "sha256:ba51ef01a510baf8fff576174f702c47e1aa54389a9f1fba323bb1a5003ff0bf", upload-time = "2026-01-23T15:10:48Z" },
{ url = "https://download-r2.pytorch.org/whl/cpu/torch-2.10.0%2Bcpu-cp313-cp313t-manylinux_2_28_aarch64.whl", hash = "sha256:0fedcb1a77e8f2aaf7bfd21591bf6d1e0b207473268c9be16b17cb7783253969", upload-time = "2026-01-23T15:10:48Z" },
{ url = "https://download-r2.pytorch.org/whl/cpu/torch-2.10.0%2Bcpu-cp313-cp313t-manylinux_2_28_x86_64.whl", hash = "sha256:106dd1930cb30a4a337366ba3f9b25318ebf940f51fd46f789281dd9e736bdc4", upload-time = "2026-01-23T15:10:50Z" },
{ url = "https://download-r2.pytorch.org/whl/cpu/torch-2.10.0%2Bcpu-cp313-cp313t-win_amd64.whl", hash = "sha256:eb1bde1ce198f05c8770017de27e001d404499cf552aaaa014569eff56ca25c0", upload-time = "2026-01-23T15:10:50Z" },
{ url = "https://download-r2.pytorch.org/whl/cpu/torch-2.10.0%2Bcpu-cp314-cp314-linux_aarch64.whl", hash = "sha256:ea2bcc9d1fca66974a71d4bf9a502539283f35d61fcab5a799b4e120846f1e02", upload-time = "2026-01-23T15:10:53Z" },
{ url = "https://download-r2.pytorch.org/whl/cpu/torch-2.10.0%2Bcpu-cp314-cp314-linux_s390x.whl", hash = "sha256:f8294fd2fc6dd8f4435a891a0122307a043b14b21f0dac1bca63c85bfb59e586", upload-time = "2026-01-23T15:10:55Z" },
{ url = "https://download-r2.pytorch.org/whl/cpu/torch-2.10.0%2Bcpu-cp314-cp314-manylinux_2_28_aarch64.whl", hash = "sha256:a28fdbcfa2fbacffec81300f24dd1bed2b0ccfdbed107a823cff12bc1db070f6", upload-time = "2026-01-23T15:10:56Z" },
{ url = "https://download-r2.pytorch.org/whl/cpu/torch-2.10.0%2Bcpu-cp314-cp314-manylinux_2_28_x86_64.whl", hash = "sha256:aada8afc068add586464b2a55adb7cc9091eec55caf5320447204741cb6a0604", upload-time = "2026-01-23T15:10:58Z" },
{ url = "https://download-r2.pytorch.org/whl/cpu/torch-2.10.0%2Bcpu-cp314-cp314-win_amd64.whl", hash = "sha256:2adc71fe471e98a608723bfc837f7e1929885ebb912c693597711e139c1cda41", upload-time = "2026-01-23T15:11:01Z" },
{ url = "https://download-r2.pytorch.org/whl/cpu/torch-2.10.0%2Bcpu-cp314-cp314t-linux_aarch64.whl", hash = "sha256:9412bd37b70f5ebd1205242c4ba4cabae35a605947f2b30806d5c9b467936db9", upload-time = "2026-01-23T15:11:03Z" },
{ url = "https://download-r2.pytorch.org/whl/cpu/torch-2.10.0%2Bcpu-cp314-cp314t-linux_s390x.whl", hash = "sha256:e71c476517c33e7db69825a9ff46c7f47a723ec4dac5b2481cff4246d1c632be", upload-time = "2026-01-23T15:11:04Z" },
{ url = "https://download-r2.pytorch.org/whl/cpu/torch-2.10.0%2Bcpu-cp314-cp314t-manylinux_2_28_aarch64.whl", hash = "sha256:23882f8d882460aca809882fc42f5e343bf07585274f929ced00177d1be1eb67", upload-time = "2026-01-23T15:11:07Z" },
{ url = "https://download-r2.pytorch.org/whl/cpu/torch-2.10.0%2Bcpu-cp314-cp314t-manylinux_2_28_x86_64.whl", hash = "sha256:4fcd8b4cc2ae20f2b7749fb275349c55432393868778c2d50a08e81d5ee5591e", upload-time = "2026-01-23T15:11:07Z" },
{ url = "https://download-r2.pytorch.org/whl/cpu/torch-2.10.0%2Bcpu-cp314-cp314t-win_amd64.whl", hash = "sha256:ffc8da9a1341092d6a90cb5b1c1a33cd61abf0fb43f0cd88443c27fa372c26ae", upload-time = "2026-01-23T15:11:10Z" },
]
[[package]]