Mirror of https://github.com/sissbruecker/linkding.git, synced 2025-08-15 14:39:24 +02:00

Compare commits (14 commits):

- e83d519cab
- 6355d8dff1
- 227cfdb063
- 2d4da099c7
- a9512b2333
- 47e944e6c5
- 6c7ce91d53
- 87020de917
- a130daa0f0
- d7c68c2818
- 1daad2c86c
- 251def2583
- 560769f068
- dc9799cc53
@@ -1,34 +1,22 @@
-# Remove project files, data, tmp files, build files
-/.env
-/.idea
-/data
-/node_modules
-/tmp
-/docs
-/static
-/scripts
-/build
-/out
-/.git
-/.devcontainer
-
-/.dockerignore
-/.gitignore
-/.gitattributes
-/Dockerfile
-/docker-compose.yml
-/*.sh
-/*.iml
-/*.patch
-/*.md
-/*.js
-/*.log
-/*.pid
-
-# Whitelist files needed in build or prod image
-!/rollup.config.js
-!/bootstrap.sh
+# Ignore everything
+*
+
+# Include files required for build or at runtime
+!/bookmarks
+!/siteroot
 !/background-tasks-wrapper.sh
+!/bootstrap.sh
+!/LICENSE.txt
+!/manage.py
+!/package.json
+!/package-lock.json
+!/requirements.prod.txt
+!/requirements.txt
+!/rollup.config.js
+!/supervisord.conf
+!/uwsgi.ini
+!/version.txt
 
-# Remove development settings
+# Remove dev settings
 /siteroot/settings/dev.py
CHANGELOG.md (57 lines changed)

@@ -1,5 +1,62 @@
 # Changelog
 
+## v1.23.0 (24/11/2023)
+
+### What's Changed
+* Add Alpine based Docker image (experimental) by @sissbruecker in https://github.com/sissbruecker/linkding/pull/570
+* Add backup CLI command by @sissbruecker in https://github.com/sissbruecker/linkding/pull/571
+* Update browser extension links by @OPerepadia in https://github.com/sissbruecker/linkding/pull/574
+* Include archived bookmarks in export by @sissbruecker in https://github.com/sissbruecker/linkding/pull/579
+
+### New Contributors
+* @OPerepadia made their first contribution in https://github.com/sissbruecker/linkding/pull/574
+
+**Full Changelog**: https://github.com/sissbruecker/linkding/compare/v1.22.3...v1.23.0
+
+---
+
+## v1.22.3 (04/11/2023)
+
+### What's Changed
+* Fix RSS feed not handling None values by @vitormarcal in https://github.com/sissbruecker/linkding/pull/569
+* Bump django from 4.1.10 to 4.1.13 by @dependabot in https://github.com/sissbruecker/linkding/pull/567
+
+### New Contributors
+* @vitormarcal made their first contribution in https://github.com/sissbruecker/linkding/pull/569
+
+**Full Changelog**: https://github.com/sissbruecker/linkding/compare/v1.22.2...v1.22.3
+
+---
+
+## v1.22.2 (27/10/2023)
+
+### What's Changed
+* Fix search options not opening on iOS by @sissbruecker in https://github.com/sissbruecker/linkding/pull/549
+* Bump urllib3 from 1.26.11 to 1.26.17 by @dependabot in https://github.com/sissbruecker/linkding/pull/542
+* Add iOS shortcut to community section by @andrewdolphin in https://github.com/sissbruecker/linkding/pull/550
+* Disable editing of search preferences in user admin by @sissbruecker in https://github.com/sissbruecker/linkding/pull/555
+* Add feed2linkding to community section by @Strubbl in https://github.com/sissbruecker/linkding/pull/544
+* Sanitize RSS feed to remove control characters by @sissbruecker in https://github.com/sissbruecker/linkding/pull/565
+* Bump urllib3 from 1.26.17 to 1.26.18 by @dependabot in https://github.com/sissbruecker/linkding/pull/560
+
+### New Contributors
+* @andrewdolphin made their first contribution in https://github.com/sissbruecker/linkding/pull/550
+* @Strubbl made their first contribution in https://github.com/sissbruecker/linkding/pull/544
+
+**Full Changelog**: https://github.com/sissbruecker/linkding/compare/v1.22.1...v1.22.2
+
+---
+
+## v1.22.1 (06/10/2023)
+
+### What's Changed
+* Fix memory leak with SQLite by @sissbruecker in https://github.com/sissbruecker/linkding/pull/548
+
+**Full Changelog**: https://github.com/sissbruecker/linkding/compare/v1.22.0...v1.22.1
+
+---
+
 ## v1.22.0 (01/10/2023)
 
 ### What's Changed
README.md (24 lines changed)

@@ -40,7 +40,7 @@ The name comes from:
 - Automatically provides titles, descriptions and icons of bookmarked websites
 - Automatically creates snapshots of bookmarked websites on [the Internet Archive Wayback Machine](https://archive.org/web/)
 - Import and export bookmarks in Netscape HTML format
-- Extensions for [Firefox](https://addons.mozilla.org/de/firefox/addon/linkding-extension/) and [Chrome](https://chrome.google.com/webstore/detail/linkding-extension/beakmhbijpdhipnjhnclmhgjlddhidpe), as well as a bookmarklet
+- Extensions for [Firefox](https://addons.mozilla.org/firefox/addon/linkding-extension/) and [Chrome](https://chrome.google.com/webstore/detail/linkding-extension/beakmhbijpdhipnjhnclmhgjlddhidpe), as well as a bookmarklet
 - Light and dark themes
 - REST API for developing 3rd party apps
 - Admin panel for user self-service and raw data access
@@ -58,9 +58,27 @@ The name comes from:
 linkding is designed to be run with container solutions like [Docker](https://docs.docker.com/get-started/).
 The Docker image is compatible with ARM platforms, so it can be run on a Raspberry Pi.
 
-By default, linkding uses SQLite as a database.
+linkding uses an SQLite database by default.
 Alternatively linkding supports PostgreSQL, see the [database options](docs/Options.md#LD_DB_ENGINE) for more information.
 
+<details>
+
+<summary>🧪 Alpine-based image</summary>
+
+The default Docker image (`latest` tag) is based on a slim variant of Debian Linux.
+Alternatively, there is an image based on Alpine Linux (`latest-alpine` tag) which has a smaller size, resulting in a smaller download and less disk space required.
+The Alpine image is currently about 45 MB in compressed size, compared to about 130 MB for the Debian image.
+
+To use it, replace the `latest` tag with `latest-alpine`, either in the CLI command below when using Docker, or in the `docker-compose.yml` file when using docker-compose.
+
+> [!WARNING]
+> The image is currently considered experimental in order to gather feedback and iron out any issues.
+> Only use it if you are comfortable running experimental software or want to help out with testing.
+> While there should be no issues with creating new installations, there might be issues when migrating existing installations.
+> If you plan to migrate your existing installation, make sure to create proper [backups](https://github.com/sissbruecker/linkding/blob/master/docs/backup.md) first.
+
+</details>
+
 ### Using Docker
 
 To install linkding using Docker you can just run the [latest image](https://hub.docker.com/repository/docker/sissbruecker/linkding) from Docker Hub:
@@ -180,7 +198,7 @@ Self-hosting web applications still requires a lot of technical know-how and com
 ## Browser Extension
 
 linkding comes with an official browser extension that allows to quickly add bookmarks, and search bookmarks through the browser's address bar. You can get the extension here:
-- [Mozilla Addon Store](https://addons.mozilla.org/de/firefox/addon/linkding-extension/)
+- [Mozilla Addon Store](https://addons.mozilla.org/firefox/addon/linkding-extension/)
 - [Chrome Web Store](https://chrome.google.com/webstore/detail/linkding-extension/beakmhbijpdhipnjhnclmhgjlddhidpe)
 
 The extension is open-source as well, and can be found [here](https://github.com/sissbruecker/linkding-extension).
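The Alpine note above refers to the CLI command further down in the README, which is not part of this excerpt. As a rough sketch, running the experimental image would look like the following; the host data folder is a placeholder, and the port mapping follows the standard linkding setup:

```shell
# Sketch: run linkding from the experimental Alpine-based image.
# Replace {host-data-folder} with a folder on the host that should hold the linkding data.
docker run --name linkding -p 9090:9090 \
  -v {host-data-folder}:/etc/linkding/data \
  -d sissbruecker/linkding:latest-alpine
```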
@@ -16,6 +16,8 @@ class FeedContext:
 
 
 def sanitize(text: str):
+    if not text:
+        return ''
     # remove control characters
     valid_chars = ['\n', '\r', '\t']
     return ''.join(ch for ch in text if ch in valid_chars or unicodedata.category(ch)[0] != 'C')
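The sanitize change above strips control characters and now also handles None titles and descriptions before they reach the RSS feed. One quick way to spot-check the generated feed is to fetch it over HTTP; the feed token below is a placeholder, and the /feeds/<token>/all route is an assumption based on the feed tests in this compare:

```shell
# Sketch: fetch the "all bookmarks" RSS feed and inspect the first lines of XML.
# <feed-token> is a placeholder for the token shown in the integrations settings.
curl -s http://localhost:9090/feeds/<feed-token>/all | head -n 20
```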
bookmarks/management/commands/backup.py (new file, 26 lines)

@@ -0,0 +1,26 @@
+import sqlite3
+import os
+
+from django.core.management.base import BaseCommand
+
+
+class Command(BaseCommand):
+    help = "Creates a backup of the linkding database"
+
+    def add_arguments(self, parser):
+        parser.add_argument('destination', type=str, help='Backup file destination')
+
+    def handle(self, *args, **options):
+        destination = options['destination']
+
+        def progress(status, remaining, total):
+            self.stdout.write(f'Copied {total-remaining} of {total} pages...')
+
+        source_db = sqlite3.connect(os.path.join('data', 'db.sqlite3'))
+        backup_db = sqlite3.connect(destination)
+        with backup_db:
+            source_db.backup(backup_db, pages=50, progress=progress)
+        backup_db.close()
+        source_db.close()
+
+        self.stdout.write(self.style.SUCCESS(f'Backup created at {destination}'))
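The command connects to data/db.sqlite3 relative to the working directory, so it is meant to be run from the linkding application folder (or inside the container, as the updated backup documentation later in this compare shows). A minimal invocation, with an example destination file name:

```shell
# Sketch: create an online backup of the SQLite database via the new management command.
# Run from the linkding application folder; the destination file name is an example.
python manage.py backup backup.sqlite3
```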
@@ -33,7 +33,10 @@ def append_bookmark(doc: BookmarkDocument, bookmark: Bookmark):
     desc = html.escape(bookmark.resolved_description or '')
     if bookmark.notes:
         desc += f'[linkding-notes]{html.escape(bookmark.notes)}[/linkding-notes]'
-    tags = ','.join(bookmark.tag_names)
+    tag_names = bookmark.tag_names
+    if bookmark.is_archived:
+        tag_names.append('linkding:archived')
+    tags = ','.join(tag_names)
     toread = '1' if bookmark.unread else '0'
     private = '0' if bookmark.shared else '1'
     added = int(bookmark.date_added.timestamp())
@@ -5,7 +5,7 @@ from typing import List
 from django.contrib.auth.models import User
 from django.utils import timezone
 
-from bookmarks.models import Bookmark, Tag, parse_tag_string
+from bookmarks.models import Bookmark, Tag
 from bookmarks.services import tasks
 from bookmarks.services.parser import parse, NetscapeBookmark
 from bookmarks.utils import parse_timestamp
@@ -93,8 +93,7 @@ def _create_missing_tags(netscape_bookmarks: List[NetscapeBookmark], user: User)
     tags_to_create = []
 
     for netscape_bookmark in netscape_bookmarks:
-        tag_names = parse_tag_string(netscape_bookmark.tag_string)
-        for tag_name in tag_names:
+        for tag_name in netscape_bookmark.tag_names:
             tag = tag_cache.get(tag_name)
             if not tag:
                 tag = Tag(name=tag_name, owner=user)
@@ -194,8 +193,7 @@ def _import_batch(netscape_bookmarks: List[NetscapeBookmark],
             continue
 
         # Get tag models by string, schedule inserts for bookmark -> tag associations
-        tag_names = parse_tag_string(netscape_bookmark.tag_string)
-        tags = tag_cache.get_all(tag_names)
+        tags = tag_cache.get_all(netscape_bookmark.tag_names)
         for tag in tags:
             relationships.append(BookmarkToTagRelationShip(bookmark=bookmark, tag=tag))
 
@@ -219,3 +217,5 @@ def _copy_bookmark_data(netscape_bookmark: NetscapeBookmark, bookmark: Bookmark,
     bookmark.notes = netscape_bookmark.notes
     if options.map_private_flag and not netscape_bookmark.private:
         bookmark.shared = True
+    if netscape_bookmark.archived:
+        bookmark.is_archived = True
@@ -2,6 +2,8 @@ from dataclasses import dataclass
 from html.parser import HTMLParser
 from typing import Dict, List
 
+from bookmarks.models import parse_tag_string
+
 
 @dataclass
 class NetscapeBookmark:
@@ -10,9 +12,10 @@ class NetscapeBookmark:
     description: str
     notes: str
     date_added: str
-    tag_string: str
+    tag_names: List[str]
     to_read: bool
     private: bool
+    archived: bool
 
 
 class BookmarkParser(HTMLParser):
@@ -56,16 +59,24 @@ class BookmarkParser(HTMLParser):
 
     def handle_start_a(self, attrs: Dict[str, str]):
         vars(self).update(attrs)
+        tag_names = parse_tag_string(self.tags)
+        archived = 'linkding:archived' in self.tags
+        try:
+            tag_names.remove('linkding:archived')
+        except ValueError:
+            pass
+
         self.bookmark = NetscapeBookmark(
             href=self.href,
             title='',
             description='',
             notes='',
             date_added=self.add_date,
-            tag_string=self.tags,
+            tag_names=tag_names,
             to_read=self.toread == '1',
             # Mark as private by default, also when attribute is not specified
             private=self.private != '0',
+            archived=archived,
         )
 
     def handle_a_data(self, data):
@@ -95,7 +95,7 @@
       props: {
         name: 'q',
         placeholder: 'Search for words or #tags',
-        value: '{{ search.q|safe }}',
+        value: input.value,
         tags: uniqueTags,
         mode: '{{ mode }}',
         linkTarget: '{{ request.user_profile.bookmark_link_target }}',
@@ -9,7 +9,7 @@
   <h2>Browser Extension</h2>
   <p>The browser extension allows you to quickly add new bookmarks without leaving the page that you are on. The extension is available in the official extension stores for:</p>
   <ul>
-    <li><a href="https://addons.mozilla.org/de/firefox/addon/linkding-extension/" target="_blank">Firefox</a></li>
+    <li><a href="https://addons.mozilla.org/firefox/addon/linkding-extension/" target="_blank">Firefox</a></li>
     <li><a href="https://chrome.google.com/webstore/detail/linkding-extension/beakmhbijpdhipnjhnclmhgjlddhidpe" target="_blank">Chrome</a></li>
   </ul>
   <p>The extension is <a href="https://github.com/sissbruecker/linkding-extension" target="_blank">open source</a> as well, which enables you to build and manually load it into any browser that supports Chrome extensions.</p>
@@ -422,3 +422,31 @@ class BookmarkArchivedViewTestCase(TestCase, BookmarkFactoryMixin, HtmlTestMixin
 
         self.assertEqual(actions_form.attrs['action'],
                          '/bookmarks/archived/action?q=%23foo&return_url=%2Fbookmarks%2Farchived%3Fq%3D%2523foo')
+
+    def test_encode_search_params(self):
+        bookmark = self.setup_bookmark(description='alert(\'xss\')', is_archived=True)
+
+        url = reverse('bookmarks:archived') + '?q=alert(%27xss%27)'
+        response = self.client.get(url)
+        self.assertNotContains(response, 'alert(\'xss\')')
+        self.assertContains(response, bookmark.url)
+
+        url = reverse('bookmarks:archived') + '?sort=alert(%27xss%27)'
+        response = self.client.get(url)
+        self.assertNotContains(response, 'alert(\'xss\')')
+
+        url = reverse('bookmarks:archived') + '?unread=alert(%27xss%27)'
+        response = self.client.get(url)
+        self.assertNotContains(response, 'alert(\'xss\')')
+
+        url = reverse('bookmarks:archived') + '?shared=alert(%27xss%27)'
+        response = self.client.get(url)
+        self.assertNotContains(response, 'alert(\'xss\')')
+
+        url = reverse('bookmarks:archived') + '?user=alert(%27xss%27)'
+        response = self.client.get(url)
+        self.assertNotContains(response, 'alert(\'xss\')')
+
+        url = reverse('bookmarks:archived') + '?page=alert(%27xss%27)'
+        response = self.client.get(url)
+        self.assertNotContains(response, 'alert(\'xss\')')
@@ -418,3 +418,31 @@ class BookmarkIndexViewTestCase(TestCase, BookmarkFactoryMixin, HtmlTestMixin):
 
         self.assertEqual(actions_form.attrs['action'],
                          '/bookmarks/action?q=%23foo&return_url=%2Fbookmarks%3Fq%3D%2523foo')
+
+    def test_encode_search_params(self):
+        bookmark = self.setup_bookmark(description='alert(\'xss\')')
+
+        url = reverse('bookmarks:index') + '?q=alert(%27xss%27)'
+        response = self.client.get(url)
+        self.assertNotContains(response, 'alert(\'xss\')')
+        self.assertContains(response, bookmark.url)
+
+        url = reverse('bookmarks:index') + '?sort=alert(%27xss%27)'
+        response = self.client.get(url)
+        self.assertNotContains(response, 'alert(\'xss\')')
+
+        url = reverse('bookmarks:index') + '?unread=alert(%27xss%27)'
+        response = self.client.get(url)
+        self.assertNotContains(response, 'alert(\'xss\')')
+
+        url = reverse('bookmarks:index') + '?shared=alert(%27xss%27)'
+        response = self.client.get(url)
+        self.assertNotContains(response, 'alert(\'xss\')')
+
+        url = reverse('bookmarks:index') + '?user=alert(%27xss%27)'
+        response = self.client.get(url)
+        self.assertNotContains(response, 'alert(\'xss\')')
+
+        url = reverse('bookmarks:index') + '?page=alert(%27xss%27)'
+        response = self.client.get(url)
+        self.assertNotContains(response, 'alert(\'xss\')')
@@ -500,3 +500,35 @@ class BookmarkSharedViewTestCase(TestCase, BookmarkFactoryMixin, HtmlTestMixin):
 
         self.assertEqual(actions_form.attrs['action'],
                          '/bookmarks/shared/action?q=%23foo&return_url=%2Fbookmarks%2Fshared%3Fq%3D%2523foo')
+
+    def test_encode_search_params(self):
+        self.authenticate()
+        user = self.get_or_create_test_user()
+        user.profile.enable_sharing = True
+        user.profile.save()
+        bookmark = self.setup_bookmark(description='alert(\'xss\')', shared=True)
+
+        url = reverse('bookmarks:shared') + '?q=alert(%27xss%27)'
+        response = self.client.get(url)
+        self.assertNotContains(response, 'alert(\'xss\')')
+        self.assertContains(response, bookmark.url)
+
+        url = reverse('bookmarks:shared') + '?sort=alert(%27xss%27)'
+        response = self.client.get(url)
+        self.assertNotContains(response, 'alert(\'xss\')')
+
+        url = reverse('bookmarks:shared') + '?unread=alert(%27xss%27)'
+        response = self.client.get(url)
+        self.assertNotContains(response, 'alert(\'xss\')')
+
+        url = reverse('bookmarks:shared') + '?shared=alert(%27xss%27)'
+        response = self.client.get(url)
+        self.assertNotContains(response, 'alert(\'xss\')')
+
+        url = reverse('bookmarks:shared') + '?user=alert(%27xss%27)'
+        response = self.client.get(url)
+        self.assertNotContains(response, 'alert(\'xss\')')
+
+        url = reverse('bookmarks:shared') + '?page=alert(%27xss%27)'
+        response = self.client.get(url)
+        self.assertNotContains(response, 'alert(\'xss\')')
@@ -22,6 +22,9 @@ class ExporterTestCase(TestCase, BookmarkFactoryMixin):
                                 description='Example description', notes='Example notes'),
             self.setup_bookmark(url='https://example.com/6', title='Title 6', added=added, shared=True,
                                 notes='Example notes'),
+            self.setup_bookmark(url='https://example.com/7', title='Title 7', added=added, is_archived=True),
+            self.setup_bookmark(url='https://example.com/8', title='Title 8', added=added,
+                                tags=[self.setup_tag(name='tag4'), self.setup_tag(name='tag5')], is_archived=True),
         ]
         html = exporter.export_netscape_html(bookmarks)
 
@@ -35,6 +38,8 @@ class ExporterTestCase(TestCase, BookmarkFactoryMixin):
             '<DD>Example description[linkding-notes]Example notes[/linkding-notes]',
             f'<DT><A HREF="https://example.com/6" ADD_DATE="{timestamp}" PRIVATE="0" TOREAD="0" TAGS="">Title 6</A>',
             '<DD>[linkding-notes]Example notes[/linkding-notes]',
+            f'<DT><A HREF="https://example.com/7" ADD_DATE="{timestamp}" PRIVATE="1" TOREAD="0" TAGS="linkding:archived">Title 7</A>',
+            f'<DT><A HREF="https://example.com/8" ADD_DATE="{timestamp}" PRIVATE="1" TOREAD="0" TAGS="tag4,tag5,linkding:archived">Title 8</A>',
         ]
         self.assertIn('\n\r'.join(lines), html)
 
@@ -7,6 +7,8 @@ from django.urls import reverse
 
 from bookmarks.tests.helpers import BookmarkFactoryMixin
 from bookmarks.models import FeedToken, User
+from bookmarks.feeds import sanitize
+
 
 
 def rfc2822_date(date):
@@ -112,6 +114,9 @@ class FeedsTestCase(TestCase, BookmarkFactoryMixin):
         self.assertContains(response, f'<title>test\n\r\ttitle</title>', count=1)
         self.assertContains(response, f'<description>test\n\r\tdescription</description>', count=1)
 
+    def test_sanitize_with_none_text(self):
+        self.assertEqual('', sanitize(None))
+
     def test_unread_returns_404_for_unknown_feed_token(self):
         response = self.client.get(reverse('bookmarks:feeds.unread', args=['foo']))
 
@@ -295,6 +295,27 @@ class ImporterTestCase(TestCase, BookmarkFactoryMixin, ImportTestMixin):
         self.assertEqual(bookmark2.shared, False)
         self.assertEqual(bookmark3.shared, True)
 
+    def test_archived_state(self):
+        test_html = self.render_html(tags_html='''
+            <DT><A HREF="https://example.com/1" ADD_DATE="1" TAGS="tag1,tag2,linkding:archived">Example title 1</A>
+            <DD>Example description 1</DD>
+            <DT><A HREF="https://example.com/2" ADD_DATE="1" PRIVATE="1" TAGS="tag1,tag2">Example title 2</A>
+            <DD>Example description 2</DD>
+            <DT><A HREF="https://example.com/3" ADD_DATE="1" PRIVATE="0">Example title 3</A>
+            <DD>Example description 3</DD>
+        ''')
+        import_netscape_html(test_html, self.get_or_create_test_user(), ImportOptions())
+
+        self.assertEqual(Bookmark.objects.count(), 3)
+        self.assertEqual(Bookmark.objects.all()[0].is_archived, True)
+        self.assertEqual(Bookmark.objects.all()[1].is_archived, False)
+        self.assertEqual(Bookmark.objects.all()[2].is_archived, False)
+
+        tags = Tag.objects.all()
+        self.assertEqual(len(tags), 2)
+        self.assertEqual(tags[0].name, 'tag1')
+        self.assertEqual(tags[1].name, 'tag2')
+
     def test_notes(self):
         # initial notes
         test_html = self.render_html(tags_html='''
@@ -2,6 +2,7 @@ from typing import List
 
 from django.test import TestCase
 
+from bookmarks.models import parse_tag_string
 from bookmarks.services.parser import NetscapeBookmark
 from bookmarks.services.parser import parse
 from bookmarks.tests.helpers import ImportTestMixin, BookmarkHtmlTag
@@ -16,7 +17,7 @@ class ParserTestCase(TestCase, ImportTestMixin):
         self.assertEqual(bookmark.title, html_tag.title)
         self.assertEqual(bookmark.date_added, html_tag.add_date)
         self.assertEqual(bookmark.description, html_tag.description)
-        self.assertEqual(bookmark.tag_string, html_tag.tags)
+        self.assertEqual(bookmark.tag_names, parse_tag_string(html_tag.tags))
         self.assertEqual(bookmark.to_read, html_tag.to_read)
         self.assertEqual(bookmark.private, html_tag.private)
 
@@ -3,6 +3,7 @@ from unittest.mock import patch
 from django.test import TestCase
 from django.urls import reverse
 
+from bookmarks.models import Bookmark
 from bookmarks.tests.helpers import BookmarkFactoryMixin
 
 
@@ -20,6 +21,9 @@ class SettingsExportViewTestCase(TestCase, BookmarkFactoryMixin):
         self.setup_bookmark(tags=[self.setup_tag()])
         self.setup_bookmark(tags=[self.setup_tag()])
         self.setup_bookmark(tags=[self.setup_tag()])
+        self.setup_bookmark(tags=[self.setup_tag()], is_archived=True)
+        self.setup_bookmark(tags=[self.setup_tag()], is_archived=True)
+        self.setup_bookmark(tags=[self.setup_tag()], is_archived=True)
 
         response = self.client.get(
             reverse('bookmarks:settings.export'),
@@ -30,6 +34,35 @@ class SettingsExportViewTestCase(TestCase, BookmarkFactoryMixin):
         self.assertEqual(response['content-type'], 'text/plain; charset=UTF-8')
         self.assertEqual(response['Content-Disposition'], 'attachment; filename="bookmarks.html"')
 
+        for bookmark in Bookmark.objects.all():
+            self.assertContains(response, bookmark.url)
+
+    def test_should_only_export_user_bookmarks(self):
+        other_user = self.setup_user()
+        owned_bookmarks = [
+            self.setup_bookmark(tags=[self.setup_tag()]),
+            self.setup_bookmark(tags=[self.setup_tag()]),
+            self.setup_bookmark(tags=[self.setup_tag()]),
+        ]
+        non_owned_bookmarks = [
+            self.setup_bookmark(tags=[self.setup_tag()], user=other_user),
+            self.setup_bookmark(tags=[self.setup_tag()], user=other_user),
+            self.setup_bookmark(tags=[self.setup_tag()], user=other_user),
+        ]
+
+        response = self.client.get(
+            reverse('bookmarks:settings.export'),
+            follow=True
+        )
+
+        text = response.content.decode('utf-8')
+
+        for bookmark in owned_bookmarks:
+            self.assertIn(bookmark.url, text)
+
+        for bookmark in non_owned_bookmarks:
+            self.assertNotIn(bookmark.url, text)
+
     def test_should_check_authentication(self):
         self.client.logout()
         response = self.client.get(reverse('bookmarks:settings.export'), follow=True)
@@ -12,8 +12,7 @@ from django.shortcuts import render
 from django.urls import reverse
 from rest_framework.authtoken.models import Token
 
-from bookmarks.models import BookmarkSearch, UserProfileForm, FeedToken
-from bookmarks.queries import query_bookmarks
+from bookmarks.models import Bookmark, BookmarkSearch, UserProfileForm, FeedToken
 from bookmarks.services import exporter, tasks
 from bookmarks.services import importer
 from bookmarks.utils import app_version
@@ -136,7 +135,7 @@ def bookmark_import(request):
 def bookmark_export(request):
     # noinspection PyBroadException
     try:
-        bookmarks = list(query_bookmarks(request.user, request.user_profile, BookmarkSearch()))
+        bookmarks = Bookmark.objects.filter(owner=request.user)
         # Prefetch tags to prevent n+1 queries
         prefetch_related_objects(bookmarks, 'tags')
         file_content = exporter.export_netscape_html(bookmarks)
docker/alpine.Dockerfile (new file, 89 lines)

@@ -0,0 +1,89 @@
+FROM node:18.18.0-alpine AS node-build
+WORKDIR /etc/linkding
+# install build dependencies
+COPY rollup.config.js package.json package-lock.json ./
+RUN npm install
+# copy files needed for JS build
+COPY bookmarks/frontend ./bookmarks/frontend
+# run build
+RUN npm run build
+
+
+FROM python:3.10.13-alpine3.18 AS python-base
+RUN apk update && apk add alpine-sdk linux-headers libpq-dev pkgconfig icu-dev sqlite-dev
+WORKDIR /etc/linkding
+
+
+FROM python-base AS python-build
+# install build dependencies
+COPY requirements.txt requirements.txt
+# remove playwright from requirements as there is not always a distro and it's not needed for the build
+RUN sed -i '/playwright/d' requirements.txt
+RUN pip install -U pip && pip install -Ur requirements.txt
+# copy files needed for Django build
+COPY . .
+COPY --from=node-build /etc/linkding .
+# run Django part of the build
+RUN python manage.py compilescss && \
+    python manage.py collectstatic --ignore=*.scss && \
+    python manage.py compilescss --delete-files
+
+
+FROM python-base AS prod-deps
+COPY requirements.prod.txt ./requirements.txt
+RUN mkdir /opt/venv && \
+    python -m venv --upgrade-deps --copies /opt/venv && \
+    /opt/venv/bin/pip install --upgrade pip wheel && \
+    /opt/venv/bin/pip install -Ur requirements.txt
+
+
+FROM python-base AS compile-icu
+# Defines SQLite version
+# Since this is only needed for downloading the header files this probably
+# doesn't need to be up-to-date, assuming the SQLite APIs used by the ICU
+# extension do not change
+ARG SQLITE_RELEASE_YEAR=2023
+ARG SQLITE_RELEASE=3430000
+
+# Compile the ICU extension needed for case-insensitive search and ordering
+# with SQLite. This does:
+# - Download SQLite amalgamation for header files
+# - Download ICU extension source file
+# - Compile ICU extension
+RUN wget https://www.sqlite.org/${SQLITE_RELEASE_YEAR}/sqlite-amalgamation-${SQLITE_RELEASE}.zip && \
+    unzip sqlite-amalgamation-${SQLITE_RELEASE}.zip && \
+    cp sqlite-amalgamation-${SQLITE_RELEASE}/sqlite3.h ./sqlite3.h && \
+    cp sqlite-amalgamation-${SQLITE_RELEASE}/sqlite3ext.h ./sqlite3ext.h && \
+    wget https://www.sqlite.org/src/raw/ext/icu/icu.c?name=91c021c7e3e8bbba286960810fa303295c622e323567b2e6def4ce58e4466e60 -O icu.c && \
+    gcc -fPIC -shared icu.c `pkg-config --libs --cflags icu-uc icu-io` -o libicu.so
+
+
+FROM python:3.10.13-alpine3.18 AS final
+# install runtime dependencies
+RUN apk update && apk add bash curl icu libpq mailcap
+# create www-data user and group
+RUN set -x ; \
+    addgroup -g 82 -S www-data ; \
+    adduser -u 82 -D -S -G www-data www-data && exit 0 ; exit 1
+WORKDIR /etc/linkding
+# copy prod dependencies
+COPY --from=prod-deps /opt/venv /opt/venv
+# copy output from build stage
+COPY --from=python-build /etc/linkding/static static/
+# copy compiled icu extension
+COPY --from=compile-icu /etc/linkding/libicu.so libicu.so
+# copy application code
+COPY . .
+# Expose uwsgi server at port 9090
+EXPOSE 9090
+# Activate virtual env
+ENV VIRTUAL_ENV /opt/venv
+ENV PATH /opt/venv/bin:$PATH
+# Allow running containers as an an arbitrary user in the root group, to support deployment scenarios like OpenShift, Podman
+RUN chmod g+w . && \
+    chmod +x ./bootstrap.sh
+
+HEALTHCHECK --interval=30s --retries=3 --timeout=1s \
+    CMD curl -f http://localhost:${LD_SERVER_PORT:-9090}/${LD_CONTEXT_PATH}health || exit 1
+
+CMD ["./bootstrap.sh"]
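For local testing, the new image can be built straight from this Dockerfile; a sketch with an arbitrary tag name (the updated build scripts later in this compare do essentially the same):

```shell
# Sketch: build the experimental Alpine-based image from the repository root.
docker build -f docker/alpine.Dockerfile -t sissbruecker/linkding:local-alpine .
```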
@@ -1,11 +1,11 @@
 FROM node:18.18.0-alpine AS node-build
 WORKDIR /etc/linkding
 # install build dependencies
-COPY package.json package-lock.json ./
-RUN npm install -g npm && \
-    npm install
-# compile JS components
-COPY . .
+COPY rollup.config.js package.json package-lock.json ./
+RUN npm install
+# copy files needed for JS build
+COPY bookmarks/frontend ./bookmarks/frontend
+# run build
 RUN npm run build
 
 
@@ -17,9 +17,13 @@ WORKDIR /etc/linkding
 FROM python-base AS python-build
 # install build dependencies
 COPY requirements.txt requirements.txt
+# remove playwright from requirements as there is not always a distro and it's not needed for the build
+RUN sed -i '/playwright/d' requirements.txt
 RUN pip install -U pip && pip install -Ur requirements.txt
-# run Django part of the build
+# copy files needed for Django build
+COPY . .
 COPY --from=node-build /etc/linkding .
+# run Django part of the build
 RUN python manage.py compilescss && \
     python manage.py collectstatic --ignore=*.scss && \
     python manage.py compilescss --delete-files
@@ -1,52 +1,82 @@
 # Backups
 
-This page describes some options on how to create backups.
-
-## What to backup
-
-Linkding stores all data in a SQLite database, so all you need to backup are the contents of that database.
-
-The location of the database file is `data/db.sqlite3` in the application folder.
-If you are using Docker then the full path in the Docker container is `/etc/linkding/data/db.sqlite`.
-As described in the installation docs, you should mount the `/etc/linkding/data` folder to a folder on your host system, from which you then can execute the backup.
-
-Below, we describe several methods to create a backup of the database:
-
-- Manual backup using the export function from the UI
-- Create a copy of the SQLite database with the SQLite backup function
-- Create a plain textfile with the contents of the SQLite database with the SQLite dump function
-
-Choose the method that fits you best.
-
-## Exporting from the UI
-
-The least technical option is to use the bookmark export in the UI.
-Go to the settings page and open the *Data* tab.
-Then click on the *Download* button to download an HTML file containing all your bookmarks.
-You can backup this file on a drive, or an online file host.
-
-## Using the SQLite backup function
-
-Requires [SQLite](https://www.sqlite.org/index.html) to be installed on your host system.
-
-With this method you create a new SQLite database, which is a copy of your linkding database.
-This method uses the backup command in the [Command Line Shell For SQLite](https://sqlite.org/cli.html).
+Linkding stores all data in the application's data folder.
+The full path to that folder in the Docker container is `/etc/linkding/data`.
+As described in the installation docs, you should mount the `/etc/linkding/data` folder to a folder on your host system.
+
+The data folder contains the following contents:
+- `db.sqlite3` - the SQLite database
+- `favicons` - folder that contains downloaded favicons
+
+The following sections explain how to back up the individual contents.
+
+## Database
+
+This section describes several methods on how to back up the contents of the SQLite database.
+
+> [!WARNING]
+> While the SQLite database is just a single file, it is not recommended to just copy that file.
+> This method is not transaction safe and may result in a [corrupted database](https://www.sqlite.org/howtocorrupt.html).
+> Use one of the backup methods described below.
+
+### Using the backup command
+
+linkding includes a CLI command for creating a backup copy of the database.
+
+To create a backup, execute the following command:
 ```shell
-sqlite3 db.sqlite3 ".backup 'backup.sqlite3'"
+docker exec -it linkding python manage.py backup backup.sqlite3
 ```
-After you have created the backup database `backup.sqlite` you have to move it to another system, for example with rsync.
+This creates a `backup.sqlite3` file in the Docker container.
 
-## Using the SQLite dump function
+To copy the backup file to your host system, execute the following command:
+```shell
+docker cp linkding:/etc/linkding/backup.sqlite3 backup.sqlite3
+```
+This copies the backup file from the Docker container to the current folder on your host system.
+Now you can move that file to your backup location.
+
+To restore the backup, just copy the backup file to the data folder of your new installation and rename it to `db.sqlite3`. Then start the Docker container.
+
+### Using the SQLite dump function
 
 Requires [SQLite](https://www.sqlite.org/index.html) to be installed on your host system.
 
 With this method you create a plain text file with the SQL statements to recreate the SQLite database.
+To create a backup, execute the following command in the data folder:
 ```shell
 sqlite3 db.sqlite3 .dump > backup.sql
 ```
+This creates a `backup.sql` which you can copy to your backup location.
+As this is a plain text file you can also commit it to any revision management system, like git.
+Using git, you can commit the changes, followed by a git push to a remote repository.
 
-As this is a plain text file you can commit it to any revision management system, like git.
-Using git you can commit the changes, followed by a git push to a remote repository.
+### Exporting bookmarks from the UI
+
+This is the least technical option to back up bookmarks, but has several limitations:
+- It does not export user profiles.
+- It only exports your own bookmarks, not those of other users.
+- It does not export archived bookmarks.
+- It does not export URLs of snapshots on the Internet Archive Wayback machine.
+- It does not export favicons.
+
+Only use this method if you are fine with the above limitations.
+
+To export bookmarks from the UI, open the general settings.
+In the Export section, click on the *Download* button to download an HTML file containing all your bookmarks.
+Then move that file to your backup location.
+
+To restore bookmarks, open the general settings on your new installation.
+In the Import section, click on the *Choose file* button to select the HTML file you downloaded before.
+Then click on the *Import* button to import the bookmarks.
+
+## Favicons
+
+Doing a backup of the icons is optional, as they can be downloaded again.
+
+If you choose not to back up the icons, you can just restore the database and then click the _Refresh Favicons_ button in the general settings.
+This will download all missing icons again.
+
+If you want to back up the icons, then you have to copy the `favicons` folder to your backup location.
+
+To restore the icons, copy the `favicons` folder back to the data folder of your new installation.
@@ -1,6 +1,6 @@
 {
   "name": "linkding",
-  "version": "1.22.2",
+  "version": "1.23.1",
   "description": "",
   "main": "index.js",
   "scripts": {
@@ -6,7 +6,7 @@ certifi==2023.7.22
 charset-normalizer==2.1.1
 click==8.1.3
 confusable-homoglyphs==3.2.0
-Django==4.1.10
+Django==4.1.13
 django-generate-secret-key==1.0.2
 django-registration==3.3
 django-sass-processor==1.2.1
@@ -7,7 +7,7 @@ charset-normalizer==2.1.1
 click==8.1.3
 confusable-homoglyphs==3.2.0
 coverage==5.5
-Django==4.1.10
+Django==4.1.13
 django-appconf==1.0.5
 django-compressor==4.1
 django-debug-toolbar==3.6.0
@@ -3,6 +3,13 @@
 version=$(<version.txt)
 
 docker buildx build --platform linux/amd64,linux/arm64,linux/arm/v7 \
+    -f docker/default.Dockerfile \
     -t sissbruecker/linkding:latest \
     -t sissbruecker/linkding:$version \
     --push .
+
+docker buildx build --platform linux/amd64,linux/arm64,linux/arm/v7 \
+    -f docker/alpine.Dockerfile \
+    -t sissbruecker/linkding:latest-alpine \
+    -t sissbruecker/linkding:$version-alpine \
+    --push .
@@ -1,6 +1,8 @@
 #!/usr/bin/env bash
 
-docker build -t sissbruecker/linkding:local .
+variant="${1:-default}"
+
+docker build -f "docker/$variant.Dockerfile" -t sissbruecker/linkding:local .
 
 docker rm -f linkding-local || true
 
@@ -1 +1 @@
-1.22.2
+1.23.1
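The reworked backup documentation above covers creating a plain-text dump but not restoring from it. Restoring with standard SQLite tooling would look roughly like this; it assumes SQLite is installed on the host and that the target db.sqlite3 does not exist yet:

```shell
# Sketch: recreate the database from a dump produced with `sqlite3 db.sqlite3 .dump`.
# Run inside the linkding data folder of the new installation.
sqlite3 db.sqlite3 < backup.sql
```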