Compare commits


478 Commits

Author SHA1 Message Date
Andrés Moya
069995b62f more wip 2022-12-14 16:56:01 +01:00
Andrés Moya
d145716e52 this is not fine 2022-12-14 16:56:01 +01:00
Andrés Moya
a4916b8add wip: one more 2022-12-14 16:56:01 +01:00
Andrés Moya
14ee85ccd7 temporary deactivation 2022-12-14 16:56:01 +01:00
Andrés Moya
e8bf9d5f41 wip: better 2022-12-14 16:56:01 +01:00
Andrés Moya
5f33f54cff 💄 Do some cleanup 2022-12-14 16:56:01 +01:00
Andrés Moya
7fbc47ccdb Avoid setting touched flags in copies 2022-12-14 16:56:01 +01:00
Andrés Moya
8ccd9bedfa wip: the good one 2022-12-14 16:56:01 +01:00
Andrés Moya
a4e36390e2 wip fixes 2022-12-14 16:56:01 +01:00
Andrés Moya
4eaa9394f6 🎉 Show transient changes in component copies 2022-12-14 16:56:01 +01:00
Andrés Moya
9bf0c6e681 🐛 Avoid loops in main instance sync 2022-12-14 16:56:01 +01:00
Alejandro
c9ad82edc3 Merge pull request #2642 from penpot/niwinz-backend-webhooks-4
🎉 Add webhooks processing & errors UI integration
2022-12-14 15:53:11 +01:00
Alejandro
430752383b Merge pull request #2664 from penpot/alotor-improve-thumbnails
 Improve thumbnail generation
2022-12-14 15:26:41 +01:00
alonso.torres
e9064611cf Improve thumbnail generation 2022-12-14 14:38:56 +01:00
alonso.torres
2ce36ce052 Merge remote-tracking branch 'origin/staging' into develop 2022-12-14 13:01:23 +01:00
Alejandro
56870ad68e Merge pull request #2663 from penpot/alotor-hotfix-problem-with-texts
🐛 Fix problem with auto-width/auto-height + lock-proportions
2022-12-14 12:50:18 +01:00
alonso.torres
7507a3b74f 🐛 Fix problem with auto-width/auto-height + lock-proportions 2022-12-14 12:47:12 +01:00
Andrey Antukh
507800ae4e Merge pull request #2659 from penpot/alotor-flex-layout-features
Flex layout features
2022-12-14 09:07:40 +01:00
Andrey Antukh
d56082307b 🎉 Add update-file (batched) to webhooks 2022-12-13 23:14:55 +01:00
Andrey Antukh
782f2ed57d 🎉 Enable comments events on webhooks 2022-12-13 23:13:48 +01:00
Andrey Antukh
d7459db292 🎉 Add task deduplication by label 2022-12-13 23:13:11 +01:00
Pablo Alba
fd7d189bb7 💄 Change style of cancel button on webhooks modal 2022-12-13 22:22:21 +01:00
alonso.torres
5aaaab4f80 Show ghost when moving elements from/into layout 2022-12-13 17:45:47 +01:00
alonso.torres
03228a9801 🐛 Fix problem with snap pixel 2022-12-13 17:45:47 +01:00
alonso.torres
2fbd1d8078 🐛 Fix problem with hug and item margins 2022-12-13 17:45:47 +01:00
alonso.torres
029efefb62 Import/export layout data 2022-12-13 16:43:28 +01:00
Andrey Antukh
ae79ee435e 🎉 Add many rpc calls to webhooks registry 2022-12-13 16:17:31 +01:00
Andrey Antukh
240e480b2e 🎉 Allow application/json on Accept header 2022-12-13 16:17:31 +01:00
Andrey Antukh
f2b60261f8 🎉 Add tests for webhooks rpc and logger 2022-12-13 16:17:31 +01:00
Andrey Antukh
21abd98b95 Integrate error handling for webhooks UI 2022-12-13 16:17:31 +01:00
Andrey Antukh
edaa62b05b 💄 Replace us/assert with us/assert! on dashboard data ns 2022-12-13 16:17:31 +01:00
Andrey Antukh
5b9f0ed0b1 🎉 Add webhook processing worker 2022-12-13 16:17:31 +01:00
Andrey Antukh
d768711caa Improve null handling on more db helpers 2022-12-13 16:17:31 +01:00
Andrey Antukh
d584ae5a0f Improve json encode/decode api 2022-12-13 16:17:31 +01:00
Andrey Antukh
9debfa3b27 📎 Minor change on exception formatting 2022-12-13 16:17:31 +01:00
Andrey Antukh
c0a4b7dc76 Improve worker queue management
and add a specific worker instance for webhooks
2022-12-13 16:17:31 +01:00
Andrey Antukh
7f589b09ca ♻️ Move audit http handler to RPC 2022-12-13 16:17:31 +01:00
Andrey Antukh
27c4cdb5f9 Merge pull request #2658 from penpot/palba-inspect-bugs
Fix inspect bugs
2022-12-13 13:16:27 +01:00
Andrey Antukh
fb0cf6fcbc 📎 Revert some hacky code from previous commit 2022-12-13 13:14:44 +01:00
Andrey Antukh
7ca74c0467 📎 Fix unexpected linter issue 2022-12-13 12:58:57 +01:00
Pablo Alba
cd6aa8f691 🐛 Fix can't select a board in inspect mode 2022-12-13 12:57:32 +01:00
Pablo Alba
90bc9943bc 🐛 Fix expand right sidebar on workspace inspect 2022-12-13 12:46:53 +01:00
Andrey Antukh
fe7b4331d1 Merge pull request #2653 from penpot/alotor-poc-improve-transform
♻️ Changed transform calculation
2022-12-13 12:37:18 +01:00
alonso.torres
e1de3ba5e7 ♻️ Changed transform calculation 2022-12-13 12:36:54 +01:00
Eva Marco
5cd108c21a Merge pull request #2657 from penpot/palba-inspect-bugfixing
🐛 Fix flex element info doesn't show on inspect tab
2022-12-13 11:59:54 +01:00
Andrey Antukh
c53420c1f5 Merge pull request #2655 from penpot/eva-fix-login-icon
🐛 Fix home icon in login page
2022-12-13 08:47:45 +01:00
Pablo Alba
05e437ee06 🐛 Fix flex element info doesn't show on inspect tab 2022-12-12 16:56:25 +01:00
Eva
d0d63169e2 🐛 Fix home icon in login page 2022-12-12 16:53:05 +01:00
Andrey Antukh
c148326d1c Merge pull request #2648 from penpot/palba-empty-inspect
🎉 Add placeholder for empty inspect tab
2022-12-12 15:32:18 +01:00
Pablo Alba
76a19a82c3 🎉 Add placeholder for empty inspect tab 2022-12-12 15:26:36 +01:00
Andrey Antukh
4d1a22bd11 Merge pull request #2606 from penpot/eva-a11y-login
 Improve login accessibility
2022-12-12 13:53:15 +01:00
Eva
95a18fce8d Improve login accessibility 2022-12-12 13:52:58 +01:00
Andrey Antukh
8bc265a598 Merge pull request #2647 from penpot/alotor-flex-ui-changes
 Changes to the flex UI
2022-12-12 13:50:30 +01:00
alonso.torres
de6cba8c0b Changes to the flex UI 2022-12-12 13:50:14 +01:00
Andrey Antukh
f2fe1dd6f8 Merge pull request #2645 from penpot/palba-webhooks-flag
🎉 Add flag for enabling webhooks
2022-12-12 13:47:33 +01:00
Andrey Antukh
2ec479afd4 Merge pull request #2651 from penpot/alotor-fix-empty-thumbnails
🐛 Fix problems with empty thumbnails
2022-12-12 13:45:03 +01:00
Andrey Antukh
67682fe211 🐛 Fix shape exportation 2022-12-12 07:34:11 +01:00
Andrey Antukh
79f27a849c Merge pull request #2652 from ryanbreen/fix-duplicate-require
💄 Remove duplicate require in backend/dev/user.clj
2022-12-10 14:21:08 +01:00
Ryan Breen
f607540f23 💄 Remove duplicate require in backend/dev/user.clj
Clean up a duplicate require of srepl

Signed-off-by: Ryan Breen <rbreen@zmags.com>
2022-12-10 07:14:00 -05:00
alonso.torres
8609308cb4 🐛 Fix problems with empty thumbnails 2022-12-10 00:08:01 +01:00
Andrey Antukh
28f1e671cb Merge pull request #2650 from ryanbreen/comment-move-fix
🐛 Fix moving comment threads
2022-12-09 23:31:04 +01:00
Ryan Breen
c411ce248e 🐛 Fix moving comment threads
Moving comment threads was failing with assert errors because the specced type is gpt/point, whereas update-comment-thread-position was passing a raw map of coordinates.

Signed-off-by: Ryan Breen <rbreen@zmags.com>
2022-12-09 13:35:02 -05:00
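A minimal sketch of the shape of fix described in the commit above, assuming the app.common.geom.point namespace (aliased gpt) and its gpt/point constructor from the Penpot codebase; the surrounding function and keys are hypothetical and only illustrate constructing a point value instead of passing a raw coordinate map.

```clojure
(ns example.comments
  (:require [app.common.geom.point :as gpt]))

;; Hypothetical helper: the specced position must be a gpt/point value,
;; not a plain {:x .. :y ..} map, so the point is built explicitly.
(defn thread-with-position
  [thread {:keys [x y]}]
  ;; (assoc thread :position {:x x :y y}) would trip the spec assert
  (assoc thread :position (gpt/point x y)))
```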
Pablo Alba
d283c6418e 🎉 Add flag for enabling webhooks 2022-12-09 13:00:23 +01:00
alonso.torres
415a3cad7b Calculate selrect from points 2022-12-09 11:01:24 +01:00
Andrey Antukh
36d2f72768 Merge pull request #2640 from penpot/alotor-performance-enhance
Flex layout fixes
2022-12-07 15:39:53 +01:00
alonso.torres
a64d92b005 Change default parameters on selection to layout 2022-12-07 15:05:38 +01:00
alonso.torres
172f4c142b Remove unused functions 2022-12-07 15:05:38 +01:00
alonso.torres
4b55c7a8e0 🐛 Fix problem with text not growing the hug container 2022-12-07 15:05:38 +01:00
alonso.torres
7dbe39b1b5 🐛 Fix problems with cache and thumbnails 2022-12-07 15:05:38 +01:00
alonso.torres
6c2d2e142b Improve reflow texts 2022-12-07 15:05:38 +01:00
Andrey Antukh
2183599c8d Merge pull request #2639 from penpot/palba-webhooks-crud
🎉 Add team webhooks section
2022-12-07 14:39:02 +01:00
Pablo Alba
cdbfec4f19 🎉 Add team webhooks section 2022-12-07 14:08:14 +01:00
Pablo Alba
cb7354a19c Merge pull request #2638 from penpot/superalex-fix-move-the-content-of-a-group-generates-orphan-elements
🐛 Fix move content of a group generates orphan elements
2022-12-07 08:57:19 +01:00
Alejandro Alonso
3157ad79a5 🐛 Fix move content of a group generates orphan elements 2022-12-07 08:38:04 +01:00
Pablo Alba
02d619ed48 Merge pull request #2635 from penpot/niwinz-backend-webhooks-3
🎉 Add webhook RPC API
2022-12-05 17:06:38 +01:00
Andrey Antukh
d97afa0e6d 📎 Add helper/devenv script for killing a nonresponsive repl 2022-12-05 15:20:29 +01:00
Andrey Antukh
baade567ca 📎 Improve bundle run template script 2022-12-05 15:20:29 +01:00
Andrey Antukh
39b9daa3a7 🎉 Add webhooks rpc API 2022-12-05 15:20:29 +01:00
Pablo Alba
d8bb62c498 🐛 Fix layer drag enabled on chrome when it is readonly 2022-12-05 10:37:56 +01:00
Alejandro
b45a0a979b Merge pull request #2613 from penpot/niwinz-backend-webhooks-2
♻️ Refactor audit and http client module
2022-12-05 10:34:48 +01:00
Andrey Antukh
861328af3e ⬆️ Update promesa library to 10.0.570 2022-12-05 09:06:24 +01:00
Andrey Antukh
8bad9d8340 ♻️ Refactor loggers/audit, auth/oidc, and http/client modules 2022-12-05 08:53:00 +01:00
Andrey Antukh
7f7efc5760 Improve exception formatting on backend 2022-12-05 08:51:54 +01:00
alonso.torres
e43fc0feb0 🐛 Fix masks in layout flex 2022-12-02 10:21:34 +01:00
alonso.torres
e53e715861 Disable feature toggle for layout flex 2022-12-02 10:21:34 +01:00
alonso.torres
32350bcf87 Improved performance for auto-width/auto-height texts 2022-12-02 10:21:34 +01:00
alonso.torres
29b1b4dbc9 🐛 Fix problem with hug layout and groups 2022-12-02 10:21:34 +01:00
Andrey Antukh
2c558a6a02 Merge pull request #2625 from penpot/hiru-detect-version-import
 Include features info in exported zipfiles
2022-12-02 09:19:28 +01:00
Andrey Antukh
95876c271c Merge pull request #2619 from penpot/hiru-fix-blank-page-import
🐛 Fix adding an extra blank page on import
2022-12-01 17:55:51 +01:00
Andrey Antukh
ccff27ac23 Merge pull request #2628 from penpot/hiru-fix-touched-import
🐛 Avoid setting touched flags in imported components
2022-12-01 17:55:30 +01:00
Andrés Moya
148f6cb3c2 🐛 Fix adding an extra blank page on import 2022-12-01 17:15:20 +01:00
alonso.torres
c9dbeec689 🐛 Fix right sidebar style 2022-12-01 17:12:08 +01:00
Andrés Moya
2b7c967920 🐛 Avoid setting touched flags in imported components 2022-12-01 16:01:51 +01:00
Andrés Moya
94cdd4a481 Include features info in exported zipfiles 2022-12-01 15:33:13 +01:00
Andrey Antukh
296b6c646e 🐛 Fix incorrect flag parsing on frontend features 2022-12-01 15:02:56 +01:00
Alejandro Alonso
ad491ccc8f Merge remote-tracking branch 'origin/staging' into develop 2022-12-01 14:50:26 +01:00
Andrey Antukh
ca7ebdcc8f Merge pull request #2626 from penpot/superalex-fix-paste-svg-leads-to-internal-server-error
🐛 Fix paste svg leads to internal server error
2022-12-01 14:29:33 +01:00
Alejandro
efb4b2cb7d Merge pull request #2623 from penpot/niwinz-parse-frontend-features-from-flags
 Parse frontend features from flags
2022-12-01 14:28:19 +01:00
Alejandro Alonso
92403f2afe 🐛 Fix paste svg leads to internal server error 2022-12-01 14:25:35 +01:00
Andrey Antukh
0e949679d9 Parse frontend features from flags
Simplifies setting features on deployments.
2022-12-01 13:43:57 +01:00
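As an illustration of what "parse frontend features from flags" could look like, here is a hypothetical sketch (the flag format, prefix, and all names are assumptions, not Penpot's actual code): it derives a set of enabled feature keywords from a space-separated flags string, which is the kind of setting that is easy to pass through deployment configuration.

```clojure
(ns example.flags
  (:require [clojure.string :as str]))

(defn parse-features
  "Return the set of feature keywords enabled by a flags string."
  [flags-str]
  (->> (str/split (or flags-str "") #"\s+")
       (filter #(str/starts-with? % "enable-"))
       (map #(subs % (count "enable-")))
       (map keyword)
       (set)))

;; (parse-features "enable-webhooks enable-components-v2 disable-demo")
;; => #{:webhooks :components-v2}
```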
Alejandro
1b8e4dfdfa Merge pull request #2621 from penpot/superalex-fix-develop
🐛 Fix develop branch
2022-12-01 11:48:34 +01:00
Alejandro Alonso
afe8883e37 🐛 Fix develop branch 2022-12-01 11:42:18 +01:00
Andrey Antukh
d5398e672f Merge pull request #2614 from penpot/superalex-support-library-colors-on-library-build
🎉 Add support assets management via library
2022-12-01 11:32:29 +01:00
Alejandro Alonso
3252088494 🎉 Add support typography management via library 2022-12-01 11:30:37 +01:00
Alejandro Alonso
fffacf3552 🎉 Add support media management via library 2022-12-01 11:30:33 +01:00
Alejandro Alonso
a19417417a 🎉 Add support components management via library 2022-12-01 11:30:29 +01:00
Alejandro Alonso
4c1f2cfded 🎉 Add support color management via library 2022-12-01 11:30:24 +01:00
Alejandro Alonso
a907041564 Merge remote-tracking branch 'origin/staging' into develop 2022-12-01 10:57:32 +01:00
Andrey Antukh
dff4552549 🐛 Set runtime *assert* value to false on release build 2022-12-01 09:31:38 +01:00
Andrey Antukh
a4acdd1886 📎 Explicitly disable features on code 2022-12-01 08:56:23 +01:00
Pablo Alba
c1a1120137 Merge pull request #2601 from penpot/superalex-fix-paste-elements-from-outside-penpot-respect-hierarchy
🐛 Fix paste elements from outside penpot respect hierarchy
2022-11-30 16:40:30 +01:00
Andrey Antukh
32cd32649e Merge pull request #2617 from penpot/palba-workspace-inspect-tab
🎉 Add inspect tab to workspace
2022-11-30 14:56:52 +01:00
Andrey Antukh
678b6a285f Merge pull request #2616 from penpot/hiru-pretty-debug
💄 Make cleaner debug logs
2022-11-30 14:56:20 +01:00
Andrés Moya
de1a3de433 💄 Make cleaner debug logs 2022-11-30 14:07:56 +01:00
Pablo Alba
412564b418 🎉 Add inspect tab to workspace 2022-11-30 13:55:09 +01:00
Andrey Antukh
c451c7bb9d 🐛 Fix regression on worker task handling 2022-11-30 13:13:11 +01:00
alonso.torres
be24989eab 🐛 Fix problem with hug layout 2022-11-30 12:21:00 +01:00
alonso.torres
a439fb65ce 🐛 Fix multiple selection in layout container and items 2022-11-30 11:49:08 +01:00
Andrey Antukh
c98635bca1 Merge pull request #2610 from penpot/alotor-performance-enhance
Alotor performance enhance
2022-11-30 10:42:09 +01:00
alonso.torres
0d2b228eb7 Keep group constraint behaviour inside flex layout 2022-11-30 10:00:30 +01:00
alonso.torres
c79d549f53 Change modifiers to records 2022-11-30 10:00:30 +01:00
alonso.torres
600f9ef071 Performance improvements 2022-11-30 10:00:30 +01:00
Andrey Antukh
04243be4a5 📎 Update frontend bench namespace 2022-11-30 10:00:30 +01:00
Andrey Antukh
fc4e755f2b Optimize point functions 2022-11-30 10:00:30 +01:00
Andrey Antukh
c28534555b 📎 Add minor micro-optimizations and tests to points->rect 2022-11-30 10:00:30 +01:00
Andrey Antukh
380cba3a72 📎 Add bench namespace to frontend/dev 2022-11-30 10:00:29 +01:00
Andrey Antukh
89a19dec5b 🎉 Add cljs optimized get-prop helper macro 2022-11-30 10:00:29 +01:00
Andrey Antukh
f6305db2a8 Reorganize the common.data ns a bit 2022-11-30 10:00:29 +01:00
Andrey Antukh
197eff93e8 📎 Fix nodejs compatibility issue on uuid_impl 2022-11-30 10:00:29 +01:00
Andrey Antukh
12cc5c6c97 Merge pull request #2600 from penpot/palba-workspace-read-only
🎉 Add workspace read-only setting
2022-11-29 17:45:37 +01:00
Pablo Alba
cd47c0356a 🎉 Add workspace read-only setting 2022-11-29 17:28:00 +01:00
Andrey Antukh
1c2a462124 Merge pull request #2596 from penpot/niwinz-backend-webhooks
 Improve scalability of the worker abstraction
2022-11-28 12:46:41 +01:00
Andrey Antukh
329b1eb6f3 📎 Fix on test initialization on CI 2022-11-28 12:37:48 +01:00
Andrey Antukh
bcfb4e0f81 🐛 Fix metrics code on rpc.climit 2022-11-28 12:25:30 +01:00
Andrey Antukh
69011007ac Change execution model of storage.tmp cleaner 2022-11-28 12:21:46 +01:00
Andrey Antukh
0600b2abe4 ♻️ Make the worker abstraction more scalable
Start using redis for dispatcher-to-worker communication
and add the ability to start multiple worker threads
to increase concurrency.
2022-11-28 12:21:44 +01:00
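Illustrative-only sketch of the worker model this commit message describes: a dispatcher feeds a queue that several worker threads consume, raising concurrency. The commit says the real dispatcher-to-worker channel is Redis; in this sketch a core.async channel stands in for that queue, and all names are hypothetical.

```clojure
(ns example.worker
  (:require [clojure.core.async :as a]))

(defn start-workers!
  "Start n threads that consume tasks from queue-ch until it closes."
  [queue-ch n handle-task]
  (dotimes [i n]
    (a/thread
      (loop []
        (when-let [task (a/<!! queue-ch)]
          (handle-task i task)
          (recur))))))

(comment
  (def queue (a/chan 128))
  (start-workers! queue 4 (fn [worker-id task]
                            (println "worker" worker-id "got" task)))
  (a/>!! queue {:name :process-webhook-event :id 1})
  (a/close! queue))
```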
Andrey Antukh
13a092b192 ♻️ Normalize internal naming on the worker module 2022-11-28 12:20:58 +01:00
Andrey Antukh
10bf6c5e56 ♻️ Normalize redis api and its usage in msgbus module 2022-11-28 12:20:58 +01:00
Alejandro Alonso
427e43585c 🐛 Fix strokes class 2022-11-28 09:56:52 +01:00
Andrey Antukh
667fabbdc5 Merge pull request #2602 from penpot/alotor-performance-enhance
 Viewport and hug performance enhancements
2022-11-28 09:33:12 +01:00
Eva Marco
8413a8eb3e Merge pull request #2599 from penpot/niwinz-restore-search-rpc
🎉 Restore search rpc method removed by mistake
2022-11-28 09:02:05 +01:00
alonso.torres
f579bb0c8d Viewport and hug performance enhancements 2022-11-25 16:08:41 +01:00
Alejandro Alonso
a2b70f227c 🐛 Fix paste elements from outside penpot respect hierarchy 2022-11-25 14:09:23 +01:00
Andrey Antukh
706714d557 🎉 Restore search rpc method removed by mistake 2022-11-25 10:42:17 +01:00
Andrey Antukh
399d57ace0 Merge branch 'iprithvitharun-2588-invitations-page-improvements' into develop 2022-11-24 16:53:16 +01:00
Prithvi Tharun
f2525f8159 Add several improvements to invitations page 2022-11-24 16:53:00 +01:00
Andrey Antukh
0fece05cc9 Merge pull request #2598 from penpot/superalex-update-translations-from-develop
 Update translations from develop
2022-11-24 16:45:02 +01:00
Alejandro Alonso
13c7d06353 Update translations from develop 2022-11-24 16:44:36 +01:00
Andrey Antukh
9593ded808 📎 Add missing copy on postgres upgrade script 2022-11-24 13:38:53 +01:00
Andrey Antukh
99adbbe91d 📎 Add postgres-upgrade.sh script 2022-11-24 12:47:41 +01:00
Andrey Antukh
6f1c2f474b 📎 Add missing change on devenv compose 2022-11-24 12:21:58 +01:00
Andrey Antukh
0061b37c13 Merge pull request #2597 from penpot/alotor-performance-enhance
 Improve transforms performance
2022-11-24 11:06:34 +01:00
alonso.torres
69bb4654c9 Improve transforms performance 2022-11-24 10:01:37 +01:00
Eva
694d90d485 Add id functionality to undo transactions 2022-11-24 10:00:32 +01:00
Eva
32746a5960 🐛 Fix some layout errors 2022-11-24 10:00:32 +01:00
Andrey Antukh
7c3f87d7b0 Merge pull request #2585 from penpot/superalex-fix-show-board-miniature-in-manual-overlay-setting
🐛 Fix show board miniature in manual overlay setting
2022-11-24 09:07:17 +01:00
Alejandro Alonso
b4e4a5cab4 🐛 Fix show board miniature in manual overlay setting 2022-11-24 09:06:42 +01:00
Andrey Antukh
c12c9a4419 Merge pull request #2566 from penpot/hiru-remove-graphics-popup
 Enhance modal of convert graphics into components
2022-11-24 09:06:04 +01:00
Andrés Moya
cc60cfc86d Enhance modal of convert graphics into components 2022-11-24 09:05:50 +01:00
Andrey Antukh
879c477ada Merge remote-tracking branch 'origin/staging' into develop 2022-11-24 08:52:35 +01:00
Alejandro
0a72859424 Merge pull request #2594 from penpot/niwinz-compatibility-fixes
🐛 Fix compatibility issue with 1.17.x file table changes
2022-11-23 14:49:16 +01:00
Andrey Antukh
6b7adec617 Merge pull request #2592 from penpot/palba-handoff-improvements
 Add several improvements to handoff
2022-11-23 13:49:36 +01:00
Andrey Antukh
e7865b8643 🐛 Fix compatibility issue with 1.17.x file table changes 2022-11-23 12:44:04 +01:00
Pablo Alba
461e5cb376 Add several improvements to handoff 2022-11-23 09:44:04 +01:00
Andrey Antukh
77a397de0c Merge pull request #2586 from penpot/hiru-fix-min-size
🐛 Fix min size when creating shapes
2022-11-23 09:39:18 +01:00
Andrey Antukh
c656dd146c Merge pull request #2589 from penpot/alotor-fixes
Flex layout fixes
2022-11-22 17:15:40 +01:00
alonso.torres
441e142349 🐛 Fix layout reflow when paths and texts change 2022-11-22 17:08:23 +01:00
alonso.torres
54fd836dd4 🐛 Fix problem when dragging layout children with frames 2022-11-22 16:41:44 +01:00
alonso.torres
7ffdf21657 🐛 Fix paste shapes into layout 2022-11-22 16:40:20 +01:00
Andrés Moya
8a6f1d82e5 🐛 Fix min size when creating shapes 2022-11-22 15:38:04 +01:00
Andrey Antukh
87ebb2e24c ⬆️ Update yetti dependency to v9.11 2022-11-22 14:06:14 +01:00
Andrey Antukh
9334138510 🐛 Fix persistence loop on dev code hot reload 2022-11-22 14:06:14 +01:00
Andrey Antukh
1b9dea01e2 🔥 Remove unused d/update-vals function
Already available as clojure.core/update-vals
2022-11-22 14:06:14 +01:00
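For reference, clojure.core/update-vals (available since Clojure 1.11) covers what the removed d/update-vals helper did: apply a function to every value of a map.

```clojure
(update-vals {:a 1 :b 2} inc)
;; => {:a 2, :b 3}
```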
Andrey Antukh
ccb7c466bf 🎉 Add lazy loading and storage/pointer-map support on viewer 2022-11-22 14:06:14 +01:00
Andrey Antukh
c72be4ae2a ⬆️ Update redis and postgresql on devenv docker 2022-11-22 14:06:14 +01:00
Andrey Antukh
fbd042d4ee 🐛 Remove not working chrono-units on duration constructor 2022-11-22 14:06:14 +01:00
Andrey Antukh
bbf95434d8 🎉 Add lazy loading and storage/pointer-map support on workspace
This also rewrites the workspace load process, making it a bit more
efficient independently of whether lazy loading is used.
2022-11-22 14:06:14 +01:00
alonso.torres
2a46989ec9 🐛 Fix problem with flex direction 2022-11-22 12:06:12 +01:00
alonso.torres
baf9124304 🐛 Fix problem with texts in viewer 2022-11-22 12:05:59 +01:00
Andrey Antukh
c69d4820cb Merge pull request #2584 from penpot/superalex-add-core-ui-to-libraries-and-templates-carousel 2022-11-22 11:50:36 +01:00
Alejandro Alonso
7d48714aa2 🎉 Add core-ui to libraries and templates carousel 2022-11-22 11:50:15 +01:00
Prithvi Tharun
6565655ac3 🐛 Fix the corners icon name
Closes #2581
2022-11-22 11:46:32 +01:00
Andrey Antukh
d886889334 Merge pull request #2578 from penpot/superalex-interaction-open-url-after-delay-duplicate-tab
🐛 Interaction for open url after delay duplicates tab
2022-11-22 11:45:38 +01:00
Alejandro Alonso
a95a7b9f90 🐛 Fix interaction for open url after delay duplicates tab 2022-11-22 11:45:18 +01:00
Pablo Alba
3d381b92d9 Merge pull request #2580 from iprithvitharun/2579-updating-login-error-msg
 Login error message updated
2022-11-21 16:45:53 +01:00
Prithvi Tharun
08399ebac1 Error message updated
Updated to be accurate and concise.
2022-11-21 19:31:33 +05:30
Andrey Antukh
6a296a3e52 Merge pull request #2575 from penpot/hiru-reset-changes-deleted-component 2022-11-21 13:59:05 +01:00
Andrés Moya
af03f720b0 Allow resetting changes from a deleted component 2022-11-21 13:58:31 +01:00
Andrey Antukh
5400fdb293 Merge pull request #2573 from penpot/hiru-delete-copy-elements
🎉 When deleting a shape inside a component copy, just hide it
2022-11-21 11:56:50 +01:00
Andrés Moya
831839080f 🎉 When deleting a shape inside a component copy, just hide it 2022-11-21 11:56:15 +01:00
Andrey Antukh
8b7310032b Merge pull request #2562 from penpot/hiru-show-main-component
 Improve show main component menu
2022-11-20 20:26:07 +01:00
Andrés Moya
848f5125d8 Improve show main component menu 2022-11-20 20:25:53 +01:00
Andrey Antukh
9fd778f9c1 Merge pull request #2561 from penpot/hiru-bulk-import-graphics 2022-11-20 20:24:55 +01:00
Andrés Moya
ce7852329a 🎉 Add the ability to create components in bulk from images 2022-11-20 20:24:33 +01:00
Pablo Alba
527e4643da Merge pull request #2565 from penpot/eva-autolayout-code
Eva autolayout code
2022-11-18 13:17:13 +01:00
Eva
e3616ea2b5 🔧 Update highlight.js library 2022-11-18 09:42:50 +01:00
Eva
2a2b5c7dba Add code block to layout elements 2022-11-18 09:42:50 +01:00
Alejandro Alonso
93bbe1b2f8 Merge remote-tracking branch 'origin/staging' into develop 2022-11-17 12:21:56 +01:00
Andrey Antukh
3de217a52e Merge pull request #2537 from penpot/alotor-autolayout-v2
Autolayout & refactor transformations
2022-11-17 11:35:41 +01:00
alonso.torres
afa6a97693 Fixes problem with bool shapes 2022-11-17 09:29:47 +01:00
alonso.torres
32756db1c1 Redo the calculation of auto sizing 2022-11-17 09:29:47 +01:00
alonso.torres
efc1b87ab0 Performance improvements 2022-11-17 09:29:47 +01:00
alonso.torres
7b2f0303e8 Fixed problems with masks 2022-11-17 09:29:47 +01:00
alonso.torres
4c5e8f42ce Review changes 2022-11-17 09:29:47 +01:00
alonso.torres
6e35b5c6b6 Fixed problem with new modifiers and text auto-height 2022-11-17 09:29:47 +01:00
alonso.torres
39041bb63b Fix problem with constraints 2022-11-17 09:29:47 +01:00
Eva
56efb571be Add flex layout option in context menu 2022-11-17 09:29:47 +01:00
alonso.torres
c1affe75e1 Fix problem with fixed constraints 2022-11-17 09:29:47 +01:00
alonso.torres
cdaba395c4 Small fixes for flex layout 2022-11-17 09:29:45 +01:00
alonso.torres
e61e76a074 Fix problems with flipped layouts 2022-11-17 09:29:31 +01:00
alonso.torres
a2e26210d1 Renamed modifiers functions 2022-11-17 09:29:31 +01:00
alonso.torres
b5df7bbfc5 Remove constraints when autolayout 2022-11-17 09:29:30 +01:00
alonso.torres
7375eed18f Refactor modifiers 2022-11-17 09:29:03 +01:00
alonso.torres
861eb283e8 Flex layout small fixes 2022-11-17 09:29:03 +01:00
alonso.torres
c86d88834e Fix problems moving frames 2022-11-17 09:29:03 +01:00
alonso.torres
7caf4b9136 Removed constraints when layout child 2022-11-17 09:29:03 +01:00
alonso.torres
4ecc166055 Remove fill/auto when resizing 2022-11-17 09:29:03 +01:00
alonso.torres
7f0054959f Hug content to frames 2022-11-17 09:29:03 +01:00
alonso.torres
0274567d83 Visual adjustments 2022-11-17 09:29:03 +01:00
alonso.torres
cebda20dd4 Adapt to handoff changes 2022-11-17 09:29:03 +01:00
alonso.torres
94602feab1 Updated translation keys 2022-11-17 09:29:03 +01:00
alonso.torres
503a1dabac Align self and stretch fixes 2022-11-17 09:29:03 +01:00
alonso.torres
81d2f9dd9d Adds align-content options 2022-11-17 09:29:03 +01:00
alonso.torres
4b61e3228f Add min/max width/height for elements 2022-11-17 09:29:03 +01:00
alonso.torres
b8c90fdcf3 Refactor flex layout namespace 2022-11-17 09:29:03 +01:00
alonso.torres
58fd20094a Adapted dynamic modifiers and options for new modifiers 2022-11-17 09:29:03 +01:00
alonso.torres
af098bb64d Adds integration with new UI 2022-11-17 09:29:03 +01:00
alonso.torres
11f347941e Refactor for new modifiers 2022-11-17 09:29:03 +01:00
alonso.torres
c3ed46d3ab Move auto-layout children 2022-11-17 09:29:03 +01:00
alonso.torres
025cac0228 Drop-zone autolayout calculation 2022-11-17 09:29:03 +01:00
alonso.torres
8bcb9e1976 Autofill vectors calculation 2022-11-17 09:29:03 +01:00
alonso.torres
bc890a0b33 Refactor frames 2022-11-17 09:29:03 +01:00
alonso.torres
8d9ed4f8af Fill elements in auto-layout 2022-11-17 09:29:03 +01:00
alonso.torres
c01c46041d Adds autolayout positions calculations 2022-11-17 09:29:02 +01:00
alonso.torres
5050c35257 Adds layout items options 2022-11-17 09:26:17 +01:00
Pablo Alba
3c424786a7 Show board miniature in manual overlay setting 2022-11-16 14:31:44 +01:00
Pablo Alba
1affb53a26 Improve overlay interactions on boards inside boards 2022-11-16 14:31:44 +01:00
Andrey Antukh
58dbe21544 Merge branch 'hiru-remove-graphics' into develop 2022-11-14 14:44:55 +01:00
Andrés Moya
6b1ecfd89c 📎 Add some code enhancements 2022-11-14 14:44:18 +01:00
Andrés Moya
20738545b8 🎉 Transform graphics into components 2022-11-11 09:56:40 +01:00
Andrey Antukh
8852ed815f 🐛 Fix unexpected exception on file-gc cron task 2022-11-10 14:54:50 +01:00
Andrey Antukh
fde03e21b0 🎉 Add conditional reading to RPC 2022-11-10 14:54:50 +01:00
Andrey Antukh
5192b36669 💄 Add some cosmetic adjustments 2022-11-10 14:54:50 +01:00
Andrey Antukh
b20d2badfe Load workspace thumbnails in a separate request 2022-11-10 14:54:50 +01:00
Andrey Antukh
dfb73192b8 📎 Change rpc middleware order (minor) 2022-11-10 14:54:50 +01:00
Andrey Antukh
59ba87d9cd Properly report malformed json error 2022-11-10 14:54:50 +01:00
Andrey Antukh
38ed3b076a 🔥 Remove unused config entry 2022-11-10 14:54:50 +01:00
Andrey Antukh
f3472fcd79 📚 Add documentation to objects and pointer map namespaces 2022-11-08 13:02:14 +01:00
Andrey Antukh
3ef99c287e ♻️ Refactor tests directory structure 2022-11-08 13:02:14 +01:00
Andrey Antukh
12e2d3ad96 📎 Rename app-tests to backend-tests for naming consistency 2022-11-08 13:02:14 +01:00
Andrey Antukh
0dc3dba428 📎 Set definitive fressian handler ids for objects and pointer map 2022-11-08 13:02:14 +01:00
Andrey Antukh
efb0ec46bf 🎉 Add tests for pointer map 2022-11-08 13:02:14 +01:00
Andrey Antukh
aa9e125e31 🎉 Add tests for objects map 2022-11-08 13:02:14 +01:00
Andrey Antukh
16afa90b9c Improve internal impl of objects-map 2022-11-08 13:02:14 +01:00
Andrey Antukh
fa93e5a1a7 ♻️ Refactor backend tests directory tree 2022-11-08 13:02:14 +01:00
Andrey Antukh
1298956d92 Improve srepl helpers for profile activation 2022-11-08 13:02:14 +01:00
Andrey Antukh
67b4d5a1c7 🐛 Fix issues when RPC returns nil values 2022-11-08 13:02:14 +01:00
Andrey Antukh
bfccae2373 🔥 Remove unused namespace 2022-11-08 13:02:14 +01:00
Andrey Antukh
5d9606f4d0 📎 Update .gitignore file 2022-11-08 13:02:14 +01:00
Andrey Antukh
76333cec26 🎉 Integrate storage/pointer-map file feature 2022-11-08 13:02:14 +01:00
Andrey Antukh
a42d7164ad 🎉 Add more helpers on srepl ns 2022-11-08 13:02:14 +01:00
Andrey Antukh
c027de2592 Make some decode helpers on db ns nil-safe 2022-11-08 13:02:14 +01:00
Andrey Antukh
ce99ca0aa8 🎉 Add generic PointerMap abstraction 2022-11-08 13:02:14 +01:00
Andrey Antukh
751b99bf47 ⬆️ Update devenv dockerfiles 2022-11-08 13:02:14 +01:00
Andrey Antukh
67fc499001 Merge branch 'niwinz-enhancements' into develop 2022-11-07 16:57:35 +01:00
Andrey Antukh
6713d8eb3f Make metrics more modular 2022-11-07 11:18:22 +01:00
Andrey Antukh
e36d611f19 🔥 Remove obsolete code from scripts/repl 2022-11-07 11:17:49 +01:00
Andrey Antukh
111cf54ff6 ♻️ Refactor common module tests and add generative testing for types 2022-11-07 10:22:16 +01:00
Andrey Antukh
1f73558f1b 📎 Fix linter issues 2022-11-07 10:05:56 +01:00
Andrey Antukh
37ad04d2a6 🎉 Add robust concurrency limiter for RPC 2022-11-07 10:05:56 +01:00
Andrey Antukh
6ad9a5aadb 🐛 Fix many bugs on rlimit module 2022-11-07 08:37:43 +01:00
Andrey Antukh
9c33dc529d Improve error report list template 2022-11-07 08:37:43 +01:00
Andrey Antukh
82d72fd388 🐛 Add missing profile-id on the internal error reports 2022-11-07 08:37:43 +01:00
Andrey Antukh
43ab19f690 ♻️ Refactor (minor) of http session code
The rationale behind the refactor:
- Make profile data available to other middlewares without
  needing to access the database (mainly for error reporting).
- Align codestyle with the rest of the internal modules.
- Simplify code.
2022-11-07 08:37:43 +01:00
Andrey Antukh
dbe516f725 ⬆️ Update deps (bugfixes on deps) 2022-11-07 08:37:43 +01:00
Andrés Moya
358d25680b 🐛 Yet some more typos 2022-11-03 15:37:05 +01:00
Andrey Antukh
57e7691e66 Merge remote-tracking branch 'origin/staging' into develop 2022-11-03 14:14:27 +01:00
Andrés Moya
ee4f063889 🐛 Fix one more typo 2022-11-03 14:13:31 +01:00
Andrés Moya
38d74b93b3 🐛 Fix some typos in library modals 2022-11-03 14:11:48 +01:00
Andrey Antukh
a85a65a554 Merge branch 'staging' into develop 2022-10-31 10:44:15 +01:00
Eva
d663d2bebf Add new handoff flex item section 2022-10-27 12:17:25 +02:00
Andrés Moya
c3fe8c8ebd 🐛 Upgrade gitpod docker file 2022-10-27 11:39:19 +02:00
Pablo Alba
afe4250ea9 Merge pull request #2471 from penpot/eva-unpublish-several-libs
🎉 Add unpublish option on context menu
2022-10-27 07:36:33 +02:00
Eva
2375f9ab83 🎉 Add unpublish option on context menu 2022-10-26 09:25:13 +02:00
Pablo Alba
8f325e4303 Merge pull request #2485 from penpot/niwinz-new-langs
Add new langs and some fixes
2022-10-25 17:12:20 +02:00
Pablo Alba
cc577a21db Merge pull request #2528 from penpot/translations
Translations
2022-10-25 17:08:28 +02:00
Pablo Alba
76b235e608 Update translations 2022-10-25 17:07:07 +02:00
Pablo Alba
b98cf29134 🌐 Add translations for: German.
Currently translated at 100.0% (1215 of 1215 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/de/
2022-10-25 14:22:23 +02:00
Tummas Jóhan Sigvardsen
cc06bb7755 🌐 Add translations for: Faroese.
Currently translated at 7.7% (94 of 1215 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/fo/
2022-10-25 14:20:35 +02:00
Vin
bd1003e383 🌐 Add translations for: Russian.
Currently translated at 67.1% (816 of 1215 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/ru/
2022-10-25 14:20:35 +02:00
Eva Marco
627f497e7f Merge pull request #2497 from penpot/palba-confirm-unpublish-library
 Confirm unpublish library on libraries popup
2022-10-25 08:59:18 +02:00
Andrés Moya
b191df0351 🐛 Fix bug about decoding :features PgArray 2022-10-24 12:27:31 +02:00
Bogi Napoleon Wennerstrøm
76675e1949 🌐 Add translations for: Faroese.
Currently translated at 6.5% (80 of 1215 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/fo/
2022-10-22 14:05:04 +02:00
Pablo Alba
66055a0b14 Confirm unpublish library on libraries popup 2022-10-21 16:25:25 +02:00
Alejandro Alonso
2ee15c3147 Merge remote-tracking branch 'origin/staging' into develop 2022-10-21 13:22:34 +02:00
Alejandro Alonso
52239a9670 Merge remote-tracking branch 'origin/staging' into develop 2022-10-20 12:44:58 +02:00
Alejandro Alonso
e6b2c40441 Merge remote-tracking branch 'origin/staging' into develop 2022-10-20 12:12:54 +02:00
Eva
e16da8bd2d 🎉 Add new flex layout menu 2022-10-20 11:53:32 +02:00
Andrey Antukh
f51e35aa9c 🐛 Prevent duplicate locale watcher on hot code reload 2022-10-20 00:11:20 +02:00
Andrey Antukh
6323c3ac92 🐛 Fix autodetect language issues 2022-10-20 00:06:50 +02:00
Andrey Antukh
59e6ef5609 🎉 Add new langs (gl, ja_jp, pt_pt, hr) 2022-10-20 00:02:24 +02:00
nautilusx
eafb723415 🌐 Add translations for: German.
Currently translated at 100.0% (1215 of 1215 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/de/
2022-10-20 00:02:23 +02:00
Andrey Antukh
5463671db1 Merge pull request #2477 from penpot/palba-cleanup-translations
Cleanup translations
2022-10-19 16:32:07 +02:00
Pablo Alba
c24596b7f9 📎 Clean old translations 2022-10-19 16:23:24 +02:00
Pablo Alba
47be9a21f4 🎉 New script to find unused translations 2022-10-19 16:23:01 +02:00
Andrés Moya
39c601a51f 🐛 Fix small import problem 2022-10-19 11:14:07 +02:00
Alejandro Alonso
6894d90137 Merge remote-tracking branch 'origin/staging' into develop 2022-10-19 09:06:11 +02:00
Alejandro Alonso
cdb4524c45 Merge remote-tracking branch 'origin/staging' into develop 2022-10-19 07:56:54 +02:00
Hosted Weblate
89a27e298d Update translation files
Updated by "Cleanup translation files" hook in Weblate.

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/
2022-10-18 20:20:18 +02:00
Hosted Weblate
9df8935d48 Update translation files
Updated by "Cleanup translation files" hook in Weblate.

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/
2022-10-18 20:20:15 +02:00
Tummas Jóhan Sigvardsen
fb3d6b04af 🌐 Add translations for: Faroese.
Currently translated at 5.5% (67 of 1216 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/fo/
2022-10-18 20:20:13 +02:00
Yaron Shahrabani
66c086d4d3 🌐 Add translations for: Hebrew.
Currently translated at 100.0% (1216 of 1216 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/he/
2022-10-18 20:20:13 +02:00
nautilusx
5e55dddd87 🌐 Add translations for: German.
Currently translated at 100.0% (1216 of 1216 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/de/
2022-10-18 20:20:12 +02:00
Oğuz Ersen
bc0f0064ed 🌐 Add translations for: Turkish.
Currently translated at 100.0% (1216 of 1216 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/tr/
2022-10-18 20:20:12 +02:00
Tummas Jóhan Sigvardsen
ca8919dff0 🌐 Add translations for: Faroese.
Currently translated at 4.7% (58 of 1214 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/fo/
2022-10-18 20:20:11 +02:00
Stas Haas
5aeac28f36 🌐 Add translations for: German.
Currently translated at 100.0% (1214 of 1214 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/de/
2022-10-18 20:20:11 +02:00
Pablo Alba
a6113df552 🐛 Remove unused translation 2022-10-18 20:14:41 +02:00
Andrey Antukh
f28b62cd3d 🐛 Fix inconsistencies on common/types specs 2022-10-18 15:49:18 +02:00
Andrey Antukh
8de1ae0478 📎 Add update-file process time log entry 2022-10-18 15:49:18 +02:00
Andrey Antukh
4fe767c169 Add missing type hints on binfile ns 2022-10-18 15:49:18 +02:00
Andrey Antukh
e50137d186 Minor improvement on internal RPC metadata api 2022-10-18 15:49:18 +02:00
Andrey Antukh
8e6b93e2a7 📎 Set correct license holder on common module sources 2022-10-18 15:49:18 +02:00
Andrey Antukh
2befad433f Remove unnecessary index building on :mov-objects 2022-10-18 15:49:18 +02:00
Andrey Antukh
96af4e26b0 Improve performance issues on backend shape validation 2022-10-18 15:49:18 +02:00
Andrey Antukh
3dc2c52f64 🐛 Fix compatibility issues with jdk19 on util/async ns 2022-10-18 15:49:18 +02:00
Andrey Antukh
b2cbb1e60f Update srepl helpers 2022-10-18 15:49:18 +02:00
Andrey Antukh
c0eab96253 Do not return the whole file on file rename 2022-10-18 15:49:18 +02:00
Andrey Antukh
951b3eb4fe Integrate objects-map and introduce file feature flags 2022-10-18 15:49:18 +02:00
Andrey Antukh
69f084e1df Add deleted at index to file table 2022-10-18 15:49:18 +02:00
Andrey Antukh
c4104c816b 🎉 Add serialization optimized ObjectsMap data type 2022-10-18 15:49:18 +02:00
Andrey Antukh
4ece0cdeda Make transit module extensible 2022-10-18 15:49:18 +02:00
Andrey Antukh
b1296ef765 Make fressian module extensible 2022-10-18 15:49:18 +02:00
Andrey Antukh
5fe3842d1e 🎉 Add v5 blob format (lz4 framed, less gc) 2022-10-18 15:49:18 +02:00
Andrey Antukh
d71c5e4105 📎 Add another print preference method 2022-10-18 15:49:18 +02:00
Andrey Antukh
8ad4dfe454 📎 Minor changes on user namespace 2022-10-18 15:49:18 +02:00
Andrés Moya
c23167a455 Merge pull request #2474 from penpot/translations
Update Translations
2022-10-18 15:33:11 +02:00
Pablo Alba
5a6b7800d7 Update translation files 2022-10-18 13:05:14 +02:00
Pablo Alba
3e118177d0 Merge remote-tracking branch 'weblate/develop' into translations 2022-10-18 12:58:17 +02:00
Andrey Antukh
aaf645bad4 Merge pull request #2459 from penpot/hiru-refactor-types
♻️ Move internal.xxx namespaces to separated files
2022-10-18 11:58:46 +02:00
Bogi Napoleon Wennerstrøm
00e724ce09 🌐 Add translations for: Faroese.
Currently translated at 4.4% (54 of 1214 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/fo/
2022-10-18 10:01:34 +02:00
Tummas Jóhan Sigvardsen
8451444861 🌐 Add translations for: Faroese.
Currently translated at 4.4% (54 of 1214 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/fo/
2022-10-18 10:01:34 +02:00
Stas Haas
ef5bc687ab 🌐 Add translations for: German.
Currently translated at 100.0% (1214 of 1214 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/de/
2022-10-18 10:01:34 +02:00
Andrés Moya
8463d501cd ♻️ Remove some undeclared namespaces 2022-10-18 09:54:47 +02:00
Andrey Antukh
a59ca5b781 Merge pull request #2443 from penpot/palba-figma-plugin
🎉 Basic changes to use penpot as a library
2022-10-18 09:07:33 +02:00
Pablo Alba
369dc8ffb5 🎉 Basic changes to use penpot as a library 2022-10-18 08:51:28 +02:00
Andrés Moya
04f8bbb1f2 📚 Fix copyright 2022-10-17 16:32:00 +02:00
Alejandro Alonso
10e0cf121b Merge remote-tracking branch 'origin/staging' into develop 2022-10-17 15:21:11 +02:00
Bogi Napoleon Wennerstrøm
948bda7cc8 🌐 Add translations for: Faroese.
Currently translated at 1.0% (13 of 1214 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/fo/
2022-10-16 15:49:43 +02:00
Tummas Jóhan Sigvardsen
8baaae1770 🌐 Add translations for: Faroese.
Currently translated at 1.0% (13 of 1214 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/fo/
2022-10-16 15:49:43 +02:00
ascarida
ea10ec22c2 🌐 Add translations for: Galician.
Currently translated at 30.6% (372 of 1214 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/gl/
2022-10-16 15:49:42 +02:00
Yaron Shahrabani
160e0d218b 🌐 Add translations for: Hebrew.
Currently translated at 100.0% (1214 of 1214 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/he/
2022-10-16 15:49:42 +02:00
Andrés Moya
7e70f0ce30 ♻️ Move internal.xxx namespaces to separated files 2022-10-14 18:25:04 +02:00
Pablo Alba
0618aa32a0 Merge pull request #2407 from penpot/hiru-restore-remote-comp
🐛 Make restore component work in external libraries
2022-10-14 18:08:23 +02:00
Alejandro
8b1e8408f2 Merge pull request #2435 from penpot/eva-fix-loading-placeholder
🐛 Fix loading placeholder
2022-10-11 15:59:17 +02:00
Eva
796211c655 🐛 Fix loading placeholder 2022-10-11 14:16:17 +02:00
Alejandro Alonso
0afef0fa44 Merge remote-tracking branch 'origin/staging' into develop 2022-10-11 14:02:51 +02:00
Stas Haas
fca26f4022 🌐 Add translations for: German.
Currently translated at 99.0% (1203 of 1214 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/de/
2022-10-10 19:35:00 +02:00
Oğuz Ersen
5caaa2d593 🌐 Add translations for: Turkish.
Currently translated at 100.0% (1214 of 1214 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/tr/
2022-10-10 19:34:59 +02:00
Alejandro Alonso
c4c419b971 Merge remote-tracking branch 'origin/staging' into develop 2022-10-10 18:10:52 +02:00
Eva Marco
bcd9aa7ba7 Merge pull request #2411 from penpot/hiru-bug-sync-undo
🐛 Fix component sync in undo
2022-10-10 17:23:46 +02:00
ascarida
5423999913 🌐 Add translations for: Galician.
Currently translated at 30.2% (366 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/gl/
2022-10-10 10:05:33 +02:00
Valentina Chapellu
670365acb7 🌐 Add translations for: Italian.
Currently translated at 43.5% (526 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/it/
2022-10-10 10:05:33 +02:00
Shuaib Zahda
9915990e10 🌐 Add translations for: Arabic.
Currently translated at 66.6% (806 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/ar/
2022-10-10 10:05:32 +02:00
Andrey Antukh
748ab5f75e Merge branch 'translations' into develop 2022-10-10 10:05:11 +02:00
Andrey Antukh
b995830693 📎 Sort translation files 2022-10-10 10:04:11 +02:00
Andrey Antukh
d47d4c2d58 Merge remote-tracking branch 'weblate/develop' into translations 2022-10-10 09:58:38 +02:00
Andrey Antukh
4b2b7278a7 Merge branch 'luzpaz-source-typos' into develop 2022-10-10 09:52:06 +02:00
Andrey Antukh
85bd44e37b 📎 Update translation files 2022-10-10 09:51:37 +02:00
luz paz
374909e05e 🔧 Fix typos in source code (follow-up)
This is a follow-up to e30bea0b6f but fixes source typos.
2022-10-10 09:48:38 +02:00
Alejandro Alonso
e3f0c2eaeb 🌐 Added translation for: Faroese. 2022-10-10 08:47:38 +02:00
K.B.Dharun Krishna
919fb96b34 🌐 Add translations for: Tamil.
Currently translated at 2.0% (25 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/ta/
2022-10-07 04:17:08 +02:00
Alejandro Alonso
c5ff785ff5 Merge remote-tracking branch 'origin/staging' into develop 2022-10-05 11:17:13 +02:00
Andrés Moya
66cd60e02c 🐛 Fix component sync in undo 2022-10-05 10:39:32 +02:00
Alejandro Alonso
da33d539bf Merge remote-tracking branch 'origin/staging' into develop 2022-10-05 09:34:18 +02:00
Denys M
c925528212 🌐 Add translations for: Ukrainian (ukr_UA).
Currently translated at 19.6% (238 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/ukr_UA/
2022-10-04 20:22:42 +02:00
Zvonimir Juranko
fc44610893 🌐 Add translations for: Croatian.
Currently translated at 93.0% (1125 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/hr/
2022-10-04 20:22:41 +02:00
Kevin Nowald
ccb17e68e2 🌐 Add translations for: Polish.
Currently translated at 94.8% (1147 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/pl/
2022-10-04 20:22:41 +02:00
Youkho
5bdc2cc25d 🌐 Add translations for: Arabic.
Currently translated at 65.0% (786 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/ar/
2022-10-04 20:22:40 +02:00
Antonio
f466d7a484 🌐 Add translations for: Catalan.
Currently translated at 96.2% (1164 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/ca/
2022-10-04 20:22:40 +02:00
Alejandro Alonso
cbe51fcabd Merge remote-tracking branch 'origin/staging' into develop 2022-10-04 16:00:46 +02:00
Andrey Antukh
fadb1dfba6 Merge remote-tracking branch 'origin/staging' into develop 2022-10-04 15:33:00 +02:00
Andrés Moya
eb7f93d2e6 🐛 Make restore component work in external libraries 2022-10-04 15:26:14 +02:00
Andrey Antukh
fc01acffc7 Merge pull request #2379 from penpot/hiru-gc-deleted-comp
 Add garbage collection of deleted components
2022-10-04 13:59:53 +02:00
Andrés Moya
687e4dce2a 🔧 Adjust code style 2022-10-04 13:50:23 +02:00
Andrés Moya
c5b875c925 Add garbage collection of deleted components 2022-10-04 13:50:23 +02:00
Andrey Antukh
a08b9adeee 📎 Fix merge issues from staging to develop 2022-10-04 13:31:06 +02:00
Andrey Antukh
c2158b0f3c Merge remote-tracking branch 'origin/staging' into develop 2022-10-04 13:29:03 +02:00
Andrey Antukh
97c36ce86c Merge pull request #2405 from penpot/eva-fix-colorpicker-library
🐛 Fix color bullets in colorpicker modal on libraries
2022-10-04 13:17:39 +02:00
Eva
b41ca75512 🐛 Fix color bullets in colorpicker modal on libraries 2022-10-04 13:15:36 +02:00
Pablo Alba
5bbfe376cf Merge pull request #2390 from penpot/hiru-delete-comp-assets
🐛 Fix delete component from assets panel in v2
2022-10-04 11:47:28 +02:00
Andrey Antukh
c9ba5ff31e Merge branch 'schalkneethling-2395-switch-to-issue-forms' into develop 2022-10-04 10:44:50 +02:00
Schalk Neethling
0c1d04919f 📎 Switch to issue forms (on github)
GitHub launched issue template forms some time ago. These have helped other open source projects I have been involved in. As you can make certain fields required, it also helps cut down on issue spam.

fix #2395
2022-10-04 10:44:18 +02:00
Andrey Antukh
746f492632 Merge branch 'luzpaz-typos' into develop 2022-10-04 10:41:32 +02:00
luz paz
e30bea0b6f 🔧 Fix typos in source code
Found via `codespell -q 3 -S *.po,./frontend/yarn.lock -L childs,clen,fpr,inflight,ody,ot,ro,te,trys,ue`
2022-10-04 10:40:34 +02:00
Andrey Antukh
ac4218a3c2 Merge branch 'Vesther-fix/feedback-twitter-link' into develop 2022-10-04 10:37:06 +02:00
Kevin Nowald
0680d25fd7 🐛 Fix twitter feedback link
Refers to a nonexistent PenpotSupport Twitter account

Signed-off-by: Kevin Nowald <kevin@nowald.pl>
2022-10-04 10:35:48 +02:00
Alejandro Alonso
4a3a181403 Merge branch 'lol768-patch-1' into develop 2022-10-04 10:33:04 +02:00
Adam Williams
9ae40b392f 🐛 Fix word-break on comments, changed to break-word 2022-10-04 10:32:05 +02:00
Denys M
8c20890c7b 🌐 Add translations for: Ukrainian (ukr_UA).
Currently translated at 7.5% (91 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/ukr_UA/
2022-10-01 16:17:53 +02:00
Zvonimir Juranko
dc863e8b97 🌐 Add translations for: Croatian.
Currently translated at 79.9% (967 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/hr/
2022-10-01 16:17:52 +02:00
K.B.Dharun Krishna
44241ada56 🌐 Add translations for: Tamil.
Currently translated at 0.5% (7 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/ta/
2022-10-01 16:17:48 +02:00
Jacopo Lodovico Trabia
f9b7235f8b 🌐 Add translations for: Italian.
Currently translated at 43.0% (520 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/it/
2022-10-01 16:17:48 +02:00
Ahmad HosseinBor
cc68eaa9f7 🌐 Add translations for: Persian.
Currently translated at 55.1% (667 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/fa/
2022-10-01 16:17:47 +02:00
Shuaib Zahda
af640234b5 🌐 Add translations for: Arabic.
Currently translated at 58.3% (706 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/ar/
2022-10-01 16:17:47 +02:00
Youkho
3c1ab1d58a 🌐 Add translations for: Arabic.
Currently translated at 58.3% (706 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/ar/
2022-10-01 16:17:47 +02:00
Andrés Moya
243e29fdb4 🐛 Fix delete component from assets panel in v2 2022-09-30 15:26:29 +02:00
Eva
a4bbb43555 🐛 Fix shortcut texts alignment 2022-09-30 13:31:27 +02:00
Alejandro Alonso
98f490703f 🌐 Added translation for: Ukrainian (ukr_UA). 2022-09-30 07:16:21 +02:00
Alejandro Alonso
8f786407af Merge remote-tracking branch 'origin/staging' into develop 2022-09-29 14:22:05 +02:00
Andrey Antukh
fd6d72128b Merge remote-tracking branch 'origin/staging' into develop 2022-09-29 10:37:13 +02:00
elhombretecla
f470efc9c7 🎉 Add new contributors to thankyou.md 2022-09-29 09:28:47 +02:00
Andrey Antukh
a59a4d9891 Merge remote-tracking branch 'origin/staging' into develop 2022-09-29 08:40:07 +02:00
Andrey Antukh
2a55d2ebdb Merge remote-tracking branch 'origin/staging' into develop 2022-09-28 15:36:12 +02:00
Alejandro Alonso
43ceb6bb44 Merge remote-tracking branch 'origin/main' into develop 2022-09-28 14:30:28 +02:00
Alejandro
a5b36fd3f8 Merge pull request #2365 from penpot/eva-fix-texts
🐛 Fix some texts and a typo
2022-09-28 14:25:53 +02:00
Eva Marco
41d5a490d4 Merge pull request #2363 from penpot/palba-accesibility-elements-without-name
 Fix elements-without-names and html-lang-missing accessibility issues
2022-09-28 13:41:57 +02:00
Pablo Alba
6a329fac27 Fix html-lang-missing accessibility issues 2022-09-28 13:16:36 +02:00
Pablo Alba
4ef876bf58 Fix elements-without-names accessibility issues 2022-09-28 12:47:33 +02:00
Andrey Antukh
7303d311d5 Merge remote-tracking branch 'origin/staging' into develop 2022-09-28 12:16:06 +02:00
Eva
1a7583e6ad 🐛 Fix some texts and a typo 2022-09-28 09:42:08 +02:00
andy
f2de69e1f3 🌐 Added translation for: Croatian. 2022-09-27 17:35:28 +02:00
andy
2030e845bb 🌐 Added translation for: Tamil. 2022-09-27 17:32:55 +02:00
Dário
b1edc53a1c 🌐 Add translations for: Portuguese (Portugal).
Currently translated at 100.0% (1209 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/pt_PT/
2022-09-27 17:18:10 +02:00
Youkho
4d56b5f1b9 🌐 Add translations for: Arabic.
Currently translated at 53.1% (642 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/ar/
2022-09-27 17:18:10 +02:00
Oğuz Ersen
e1960b4472 🌐 Add translations for: Turkish.
Currently translated at 100.0% (1209 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/tr/
2022-09-27 17:18:09 +02:00
Alejandro
86993c0e21 Merge pull request #2360 from penpot/eva-a11y-add-title
 Add title to color bullets
2022-09-27 13:54:31 +02:00
Alejandro Alonso
c373b3741f 📎 Increase version 2022-09-27 13:02:43 +02:00
Eva
494b08b975 Add title to color bullets 2022-09-27 12:57:20 +02:00
Hugo Figueira
161a139194 🌐 Add translations for: Portuguese (Portugal).
Currently translated at 100.0% (1209 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/pt_PT/
2022-09-25 01:03:26 +02:00
Shuaib Zahda
407423b480 🌐 Add translations for: Arabic.
Currently translated at 52.6% (637 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/ar/
2022-09-25 01:03:25 +02:00
Hugo Figueira
aadc3c25db 🌐 Add translations for: Portuguese (Brazil).
Currently translated at 100.0% (1209 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/pt_BR/
2022-09-25 01:03:25 +02:00
Hugo Figueira
26b32634f7 🌐 Add translations for: Portuguese (Brazil).
Currently translated at 100.0% (1209 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/pt_BR/
2022-09-24 10:30:23 +02:00
Jaziel Cavalcante
836511f5c7 🌐 Add translations for: Portuguese (Brazil).
Currently translated at 100.0% (1209 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/pt_BR/
2022-09-24 10:30:23 +02:00
Hugo Figueira
043683775f 🌐 Add translations for: Portuguese (Portugal).
Currently translated at 100.0% (1209 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/pt_PT/
2022-09-23 22:54:31 +02:00
Dário
f6792ce67f 🌐 Add translations for: Portuguese (Portugal).
Currently translated at 100.0% (1209 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/pt_PT/
2022-09-23 22:54:31 +02:00
Dário
84760f940c 🌐 Add translations for: Indonesian.
Currently translated at 9.3% (113 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/id/
2022-09-23 22:54:30 +02:00
Stas Haas
4faa3db6f8 🌐 Add translations for: German.
Currently translated at 100.0% (1209 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/de/
2022-09-23 22:54:30 +02:00
Filipe Pessanha
71f2e4cabe 🌐 Add translations for: Portuguese (Brazil).
Currently translated at 99.1% (1199 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/pt_BR/
2022-09-23 22:54:30 +02:00
Jaziel Cavalcante
96ef9a3c52 🌐 Add translations for: Portuguese (Brazil).
Currently translated at 99.1% (1199 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/pt_BR/
2022-09-23 22:54:29 +02:00
Hugo Figueira
83f734977f 🌐 Add translations for: Portuguese (Brazil).
Currently translated at 99.1% (1199 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/pt_BR/
2022-09-23 22:54:29 +02:00
Stas Haas
9d02bbcc1c 🌐 Add translations for: Russian.
Currently translated at 66.5% (805 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/ru/
2022-09-23 22:54:27 +02:00
Dário
f4264e47f0 🌐 Add translations for: Spanish.
Currently translated at 100.0% (1209 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/es/
2022-09-23 22:54:27 +02:00
Dário
0fa8f54ce4 🌐 Add translations for: Portuguese (Portugal).
Currently translated at 99.0% (1198 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/pt_PT/
2022-09-23 16:27:47 +02:00
Hugo Figueira
920cb86849 🌐 Add translations for: Portuguese (Brazil).
Currently translated at 62.7% (759 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/pt_BR/
2022-09-23 16:27:45 +02:00
Dário
706bf86c95 🌐 Add translations for: Portuguese (Portugal).
Currently translated at 64.9% (785 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/pt_PT/
2022-09-23 13:07:40 +02:00
Vik
fbaa19d405 🌐 Add translations for: Russian.
Currently translated at 66.5% (804 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/ru/
2022-09-23 13:06:07 +02:00
Valentina Chapellu
69ab9e9696 🌐 Add translations for: Italian.
Currently translated at 37.7% (456 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/it/
2022-09-22 23:21:44 +02:00
Jacopo Lodovico Trabia
d5fea6100d 🌐 Add translations for: Italian.
Currently translated at 37.7% (456 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/it/
2022-09-22 23:21:44 +02:00
nautilusx
47ba8383e8 🌐 Add translations for: German.
Currently translated at 99.6% (1205 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/de/
2022-09-22 23:21:43 +02:00
Ruan Aragão
48e6cc5a6b 🌐 Add translations for: Portuguese (Brazil).
Currently translated at 60.6% (733 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/pt_BR/
2022-09-22 23:21:42 +02:00
Mateus Muller
da5fabbc66 🌐 Add translations for: Portuguese (Brazil).
Currently translated at 60.6% (733 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/pt_BR/
2022-09-22 23:21:42 +02:00
Vik
691a9fa877 🌐 Add translations for: Russian.
Currently translated at 65.8% (796 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/ru/
2022-09-22 23:21:42 +02:00
andy
2667e515f7 🌐 Added translation for: Portuguese (Portugal). 2022-09-22 16:43:43 +02:00
andy
f1552e4091 🌐 Added translation for: Finnish. 2022-09-22 16:40:29 +02:00
Valentina Chapellu
e2b39c0680 🌐 Add translations for: Italian.
Currently translated at 32.5% (394 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/it/
2022-09-20 15:46:24 +02:00
Jacopo Lodovico Trabia
58d604a20a 🌐 Add translations for: Italian.
Currently translated at 32.5% (394 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/it/
2022-09-20 15:46:24 +02:00
Shuaib Zahda
dc7e252972 🌐 Add translations for: Arabic.
Currently translated at 48.6% (588 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/ar/
2022-09-20 15:46:23 +02:00
Filipe Pessanha
4433c1136c 🌐 Add translations for: Portuguese (Brazil).
Currently translated at 59.3% (717 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/pt_BR/
2022-09-20 15:46:23 +02:00
Semon Xue
0dbefcc401 🌐 Add translations for: Chinese (Simplified).
Currently translated at 99.9% (1208 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/zh_Hans/
2022-09-20 15:46:22 +02:00
Vik
051a65c346 🌐 Add translations for: Russian.
Currently translated at 62.2% (752 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/ru/
2022-09-20 15:46:22 +02:00
Swapnil C
449a6c9127 🌐 Add translations for: French.
Currently translated at 88.8% (1074 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/fr/
2022-09-20 15:46:20 +02:00
andy
f72b94ac9b 🌐 Added translation for: Czech. 2022-09-20 12:34:09 +02:00
Semon Xue
9a4ad38957 🌐 Add translations for: Chinese (Simplified).
Currently translated at 92.5% (1119 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/zh_Hans/
2022-09-19 11:17:40 +02:00
Ahmad HosseinBor
1019a037d8 🌐 Add translations for: Persian.
Currently translated at 54.9% (664 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/fa/
2022-09-19 06:15:22 +02:00
liimee
9af04c8fbb 🌐 Add translations for: Indonesian.
Currently translated at 9.3% (113 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/id/
2022-09-19 06:15:22 +02:00
Mikel Larreategi
9d63bc99bf 🌐 Add translations for: Basque.
Currently translated at 100.0% (1209 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/eu/
2022-09-16 23:18:21 +02:00
Eranot
516735cd0b 🌐 Add translations for: Portuguese (Brazil).
Currently translated at 58.7% (710 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/pt_BR/
2022-09-16 23:18:21 +02:00
Ally Tiago
4497d8842a 🌐 Add translations for: Portuguese (Brazil).
Currently translated at 58.7% (710 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/pt_BR/
2022-09-16 23:18:21 +02:00
Beeby Xia
be5a232994 🌐 Add translations for: Chinese (Simplified).
Currently translated at 92.4% (1118 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/zh_Hans/
2022-09-14 16:15:35 +02:00
Beeby Xia
81a4c6b3f1 🌐 Add translations for: Chinese (Simplified).
Currently translated at 90.4% (1094 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/zh_Hans/
2022-09-13 15:19:14 +02:00
Stas Haas
4027241bc0 🌐 Add translations for: German.
Currently translated at 99.0% (1197 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/de/
2022-09-12 15:20:09 +02:00
Mikel Larreategi
8ac1dfce29 🌐 Add translations for: Basque.
Currently translated at 97.4% (1178 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/eu/
2022-09-10 19:20:27 +02:00
Ahmad HosseinBor
8e0e77fd3c 🌐 Add translations for: Persian.
Currently translated at 54.6% (661 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/fa/
2022-09-10 19:20:26 +02:00
Yaron Shahrabani
5b92dca270 🌐 Add translations for: Hebrew.
Currently translated at 100.0% (1209 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/he/
2022-09-10 19:20:26 +02:00
liimee
5454cabf98 🌐 Add translations for: Indonesian.
Currently translated at 7.8% (95 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/id/
2022-09-10 19:20:25 +02:00
Stas Haas
e4e0deeb1c 🌐 Add translations for: German.
Currently translated at 98.9% (1196 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/de/
2022-09-10 19:20:25 +02:00
Beeby Xia
d47d687b43 🌐 Add translations for: Chinese (Simplified).
Currently translated at 89.9% (1088 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/zh_Hans/
2022-09-10 19:20:24 +02:00
Philippe Accorsi
0595d6b88d 🌐 Add translations for: French.
Currently translated at 86.3% (1044 of 1209 strings)

Translation: Penpot/frontend
Translate-URL: https://hosted.weblate.org/projects/penpot/frontend/fr/
2022-09-10 19:20:23 +02:00
550 changed files with 44652 additions and 13512 deletions

View File

@@ -3,19 +3,19 @@ jobs:
build:
docker:
- image: penpotapp/devenv:latest
- image: cimg/postgres:13.5
- image: cimg/postgres:14.5
environment:
POSTGRES_USER: penpot_test
POSTGRES_PASSWORD: penpot_test
POSTGRES_DB: penpot_test
- image: cimg/redis:6.2.6
- image: cimg/redis:7.0.5
working_directory: ~/repo
resource_class: large
environment:
# Customize the JVM maximum heap limit
JVM_OPTS: -Xmx1g
JVM_OPTS: -Xmx4g
steps:
- checkout
@@ -29,6 +29,13 @@ jobs:
- run: cd .clj-kondo && cat config.edn
- run:
name: frontend styles prettier
working_directory: "./frontend"
command: |
yarn install
yarn run lint-scss
- run:
name: common lint
working_directory: "./common"
@@ -43,13 +50,6 @@ jobs:
clj-kondo --version
clj-kondo --parallel --lint src/
- run:
name: frontend styles prettier
working_directory: "./frontend"
command: |
yarn install
yarn run lint-scss
- run:
name: backend lint
working_directory: "./backend"
@@ -57,47 +57,42 @@ jobs:
clj-kondo --version
clj-kondo --parallel --lint src/
# run backend test
- run:
working_directory: "./common"
name: common tests
command: |
yarn install
yarn test
clojure -X:dev:test :patterns '["common-tests.*-test"]'
environment:
PATH: /usr/local/nodejs/bin/:/usr/local/bin:/bin:/usr/bin
JVM_OPTS: -Xmx4g
NODE_OPTIONS: --max-old-space-size=4096
- run:
name: backend test
working_directory: "./backend"
command: "clojure -X:dev:test"
command: |
clojure -X:dev:test :patterns '["backend-tests.*-test"]'
environment:
PENPOT_TEST_DATABASE_URI: "postgresql://localhost/penpot_test"
PENPOT_TEST_DATABASE_USERNAME: penpot_test
PENPOT_TEST_DATABASE_PASSWORD: penpot_test
PENPOT_TEST_REDIS_URI: "redis://localhost/1"
JVM_OPTS: -Xmx4g
- run:
name: frontend tests
working_directory: "./frontend"
command: |
yarn install
clojure -M:dev:shadow-cljs compile test
node target/tests.js
environment:
PATH: /usr/local/nodejs/bin/:/usr/local/bin:/bin:/usr/bin
- run:
working_directory: "./common"
name: common tests (cljs)
command: |
yarn install
yarn run compile-test
node target/test.js
environment:
PATH: /usr/local/nodejs/bin/:/usr/local/bin:/bin:/usr/bin
- run:
working_directory: "./common"
name: common tests (clj)
command: |
clojure -X:dev:test
yarn test
environment:
PATH: /usr/local/nodejs/bin/:/usr/local/bin:/bin:/usr/bin
NODE_OPTIONS: --max-old-space-size=4096
- save_cache:
paths:

View File

@@ -7,6 +7,7 @@
app.common.data/export clojure.core/def
app.db/with-atomic clojure.core/with-open
app.common.data.macros/get-in clojure.core/get-in
app.common.data.macros/with-open clojure.core/with-open
app.common.data.macros/select-keys clojure.core/select-keys
app.common.logging/with-context clojure.core/do}

89
.github/ISSUE_TEMPLATE/bug-report.yml vendored Normal file
View File

@@ -0,0 +1,89 @@
description: Create a report to help us improve
labels: ["bug"]
name: Bug report
title: "bug: "
body:
- type: markdown
attributes:
value: |
## Before you start
Please search our [existing issues](https://github.com/penpot/penpot/issues) and open [pull requests](https://github.com/penpot/penpot/pulls) to lessen the chance of filing duplicate issues or feature requests. Thank you.
---
- type: textarea
attributes:
label: Steps To Reproduce
description: Steps to reproduce the behavior.
placeholder: |
Steps to reproduce the behavior:
1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
validations:
required: true
- type: textarea
id: expected
attributes:
description: A clear and concise description of what you expected to happen.
label: Expected behavior
validations:
required: true
- type: textarea
id: actual
attributes:
description: A clear and concise description of what happens instead; what the bug is.
label: Actual behavior
validations:
required: true
- type: textarea
id: screenshots
attributes:
description: If applicable, add screenshots to help explain your problem.
label: Screenshots or video
- type: textarea
id: desktop
attributes:
label: Desktop (please complete the following information)
placeholder: |
- OS (e.g. iOS):
- Browser & version (e.g. Chrome 89.0):
- type: textarea
id: mobile
attributes:
label: Smartphone (please complete the following information)
placeholder: |
- Device & model (e.g. iPhone 6):
- OS & version (e.g. iOS 8.1):
- Browser & version (e.g. stock browser 22):
- type: textarea
id: environment
attributes:
label: Environment (please complete the following information)
placeholder: |
- Host (e.g. https://design.penpot.app, local instance):
*If self-hosted:*
- OS Version (e.g. Ubuntu 16.04):
- Docker / Docker-compose version (e.g. Docker version 18.03.0-ce, build 0520e24):
- Image version (e.g. Alpine):
Docker commands or docker-compose file (if possible and applicable):
```
```
- type: textarea
id: frontend-trace
attributes:
label: Frontend Stack Trace
render: console
- type: textarea
id: backend-trace
attributes:
label: Backend Stack Trace
render: console
- type: textarea
id: additional-context
attributes:
label: Additional context
description: Any other context about the problem.

View File

@@ -1,72 +0,0 @@
---
name: Bug report
about: Create a report to help us improve
title: ''
labels: bug
assignees: ''
---
**To Reproduce**
Steps to reproduce the behavior:
1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
**Expected behavior**
A clear and concise description of what you expected to happen.
**Actual behavior**
A clear and concise description of what happens instead; what the bug is.
**Screenshots**
If applicable, add screenshots to help explain your problem.
**Desktop (please complete the following information):**
- OS (e.g. iOS):
- Browser & version (e.g. Chrome 89.0):
**Smartphone (please complete the following information):**
- Device & model (e.g. iPhone 6):
- OS & version (e.g. iOS 8.1):
- Browser & version (e.g. stock browser 22):
**Environment (please complete the following information):**
- Host (e.g. https://design.penpot.app, local instance):
*If self-hosted:*
- OS Version (e.g. Ubuntu 16.04):
- Docker / Docker-compose version (e.g. Docker version 18.03.0-ce, build 0520e24):
- Image version (e.g. Alpine):
Docker commands or docker-compose file (if possible and if proceed.x):
```
```
Frontend Stack Trace:
<details>
```
```
</details>
Backend Stack Trace:
<details>
```
```
</details>
**Additional context:**
Any other context about the problem.

View File

@@ -0,0 +1,37 @@
description: Suggest an idea for this project.
labels: ["needs triage", "enhancement"]
name: "Feature request"
title: "feature: "
body:
- type: markdown
attributes:
value: |
## Before you start
Please search our [existing issues](https://github.com/penpot/penpot/issues) and open [pull requests](https://github.com/penpot/penpot/pulls) to lessen the chance of filing duplicate issues or feature requests. Thank you.
---
- type: textarea
id: problem
attributes:
description: A clear and concise description of what the problem is. Ex. I'm always frustrated when (...)
label: Is your feature request related to a problem? Please describe.
validations:
required: true
- type: textarea
id: solution
attributes:
description: A clear and concise description of what you want to happen.
label: Describe the solution you'd like.
validations:
required: true
- type: textarea
id: alternatives
attributes:
label: Describe alternatives you've considered.
description: A clear and concise description of any alternative solutions or features you've considered.
- type: textarea
id: additional-context
attributes:
label: Additional context
description: Add any other context or screenshots about the feature request here.

View File

@@ -1,21 +0,0 @@
---
name: Feature request
about: Suggest an idea for this project
title: ''
labels: enhancement
assignees: ''
---
**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when (...)
**Describe the solution you'd like**
A clear and concise description of what you want to happen.
**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
Add any other context or screenshots about the feature request here.

34
.gitignore vendored
View File

@@ -1,55 +1,59 @@
*-init.clj
*.jar
*.penpot
*.orig
*.penpot
.calva
.clj-kondo
.cpcache
.lein-deps-sum
.lein-failures
.lein-plugins/
.lein-repl-history
.lsp
.nrepl-port
.nyc_output
.rebel_readline_history
.repl
.shadow-cljs
/*.jpg
/*.md
/*.png
/*.sql
/*.txt
/*.yml
/*.zip
/.clj-kondo/.cache
/_dump
/backend/-
/backend/*.md
/backend/*.sql
/backend/*.txt
/backend/assets/
/backend/builtin-templates
/backend/dist/
/backend/logs/
/backend/resources/public/assets
/backend/resources/public/media
/backend/target/
/backend/builtin-templates
/bundle*
/cd.md
/clj-profiler/
/common/.shadow-cljs
/common/coverage
/common/target
/deploy
/docker/images/bundle*
/exporter/.shadow-cljs
/exporter/target
/frontend/.shadow-cljs
/frontend/package-lock.json
/frontend/cypress/videos/*/
/frontend/cypress/fixtures/validuser.json
/frontend/cypress/videos/*/
/frontend/cypress/videos/*/
/frontend/dist/
/frontend/npm-debug.log
/frontend/out/
/frontend/package-lock.json
/frontend/resources/fonts/experiments
/frontend/resources/public/*
/frontend/target/
/frontend/cypress/videos/*/
/media
/other/
/scripts/
/telemetry/
/tmp/
/vendor/**/target
/vendor/svgclean/bundle*.js
/web
clj-profiler/
figwheel_server.log
node_modules

View File

@@ -1,5 +1,27 @@
# CHANGELOG
## :rocket: Next (1.17)
### :boom: Breaking changes & Deprecations
### :sparkles: New features
- Adds layout flex functionality for boards
- Better overlays interactions on boards inside boards [Taiga #4386](https://tree.taiga.io/project/penpot/us/4386)
- Show board miniature in manual overlay setting [Taiga #4475](https://tree.taiga.io/project/penpot/issue/4475)
- Handoff visual improvements [Taiga #3124](https://tree.taiga.io/project/penpot/us/3124)
### :bug: Bugs fixed
- Add title to color bullets [Taiga #4218](https://tree.taiga.io/project/penpot/task/4218)
- Fix color bullets in library color modal [Taiga #4186](https://tree.taiga.io/project/penpot/issue/4186)
- Fix shortcut texts alignment [Taiga #4275](https://tree.taiga.io/project/penpot/issue/4275)
- Fix some texts and a typo [Taiga #4215](https://tree.taiga.io/project/penpot/issue/4215)
- Fix twitter support account link [Taiga #4279](https://tree.taiga.io/project/penpot/issue/4279)
- Fix lang autodetect issue [Taiga #4277](https://tree.taiga.io/project/penpot/issue/4277)
- Fix adding an extra page on import [Taiga #4543](https://tree.taiga.io/project/penpot/task/4543)
### :arrow_up: Deps updates
### :heart: Community contributions by (Thank you!)
## 1.16.2-beta
## 1.16.1-beta
@@ -21,6 +43,7 @@
- Fix bad behaviour on hovering and click nested artboards [Taiga #4018](https://tree.taiga.io/project/penpot/issue/4018) and [Taiga #4269](https://tree.taiga.io/project/penpot/us/4269)
- Fix lang autodetect issue [Taiga #4277](https://tree.taiga.io/project/penpot/issue/4277)
- Fix colorpicker does not close upon switching to Dashboard [Taiga #4408](https://tree.taiga.io/project/penpot/issue/4408)
- Fix problem with auto-width/auto-height + lock-proportions
## 1.16.0-beta
@@ -28,7 +51,7 @@
- Removed the support for v2 internal file data blob format. This
version has never been documented nor set as default value so
technicaly this is not a breaking change because we are removing
technically this is not a breaking change because we are removing
a "private API".
### :sparkles: New features
@@ -73,6 +96,7 @@
- To @andrewzhurov for many code contributions on this release.
- UI improvements in Project section (by @Waishnav) [#2285](https://github.com/penpot/penpot/pull/2285)
- Fix fronted comments (by @lol768) [#2368](https://github.com/penpot/penpot/pull/2368)
## 1.15.5-beta
@@ -164,7 +188,7 @@
- The `PENPOT_LDAP_ATTRS_PHOTO` finally removed, it was unused for many
versions.
- If you are using social login (google, github, gitlab or generic OIDC) you
will need to ensure to add the following flags respectivelly to let them
will need to ensure to add the following flags respectively to let them
enabled: `enable-login-with-google`, `enable-login-with-github`,
`enable-login-with-gitlab` and `enable-login-with-oidc`. If not, they will
remain disabled after application start independently if you set the client-id
@@ -269,7 +293,7 @@
- Fix undo when drawing curves [Taiga #3523](https://tree.taiga.io/project/penpot/issue/3523)
- Fix issue with text edition and certain fonts (WorkSans, Raleway, ...) and foreign objects [Taiga #3521](https://tree.taiga.io/project/penpot/issue/3521)
- Fix thumbnail generation when concurrent edition [Taiga #3522](https://tree.taiga.io/project/penpot/issue/3522)
- Fix environment imporot for exporter in Docker
- Fix environment import for exporter in Docker
- Fix auto scroll layers in Firefox [Taiga #3531](https://tree.taiga.io/project/penpot/issue/3531)
- Fix base background not visible for imported SVG
@@ -353,7 +377,7 @@
- Fix mouse leave in handoff close overlay animation breaks [Taiga #3173](https://tree.taiga.io/project/penpot/issue/3173)
- Fix different behaviour during image drag [Taiga #2279](https://tree.taiga.io/project/penpot/issue/2279)
- Fix hidden file name on import [Taiga #3172](https://tree.taiga.io/project/penpot/issue/3172)
- Fix unneccessary scrollbars at the color list [Taiga #3211](https://tree.taiga.io/project/penpot/issue/3211)
- Fix unnecessary scrollbars at the color list [Taiga #3211](https://tree.taiga.io/project/penpot/issue/3211)
- "Show in exports" is showing in multiselections [Taiga #3194](https://tree.taiga.io/project/penpot/issue/3194)
- Edit file name navigates to the file workspace [Taiga #3183](https://tree.taiga.io/project/penpot/issue/3183)
- Fix scroll into view behind fixed element [Taiga #3170](https://tree.taiga.io/project/penpot/issue/3170)
@@ -362,7 +386,7 @@
- Fix duplicate multi selected elements [Taiga #3155](https://tree.taiga.io/project/penpot/issue/3155)
- Fix add fills to artboard modify children [Taiga #3151](https://tree.taiga.io/project/penpot/issue/3151)
- Avoid numeric inputs to allow big numbers [Taiga #2858](https://tree.taiga.io/project/penpot/issue/2858)
- Fix component contex menu size [Taiga #2480](https://tree.taiga.io/project/penpot/issue/2480)
- Fix component context menu size [Taiga #2480](https://tree.taiga.io/project/penpot/issue/2480)
- Add shadow to artboard make it lose the fill [Taiga #3139](https://tree.taiga.io/project/penpot/issue/3139)
- Avoid numeric inputs to change its value without focusing them [Taiga #3140](https://tree.taiga.io/project/penpot/issue/3140)
- Fix comments modal when changing pages [Taiga #2597](https://tree.taiga.io/project/penpot/issue/2508)
@@ -491,7 +515,7 @@
- Fix issue on handling empty content on boolean shapes
- Fix race condition issue on component renaming
- Handle EOF errors on writting streamed response
- Handle EOF errors on writing streamed response
- Handle EOF errors on websocket send/ping methods
- Disable parallel upload of file media on import (causes too much
contention on the rlimit subsystem that does not work as expected
@@ -603,7 +627,7 @@
## 1.10.4-beta
### :sparkles: Enhacements
### :sparkles: Enhancements
- Allow parametrizing the file snapshotting interval
@@ -615,7 +639,7 @@
## 1.10.3-beta
### :sparkles: Enhacements
### :sparkles: Enhancements
- Make all logging asynchronous, this avoid some overhead on jetty threads at cost of logging latency.
- Increase default session time to 15 days.
@@ -951,7 +975,7 @@
- Add better auth module logging.
- Add missing `email` scope to OIDC backend.
- Add missing cause prop on error loging.
- Add missing cause prop on error logging.
- Fix empty font-family handling on custom fonts page.
- Fix incorrect unicode code points handling on draft-to-penpot conversion.
- Fix some problems with paths.

View File

@@ -99,7 +99,7 @@ Each commit should have:
- An entry on the CHANGES.md file if applicable, referencing the
github or taiga issue/user-story using these same rules.
Examples of good commit messags:
Examples of good commit messages:
- :bug: Fix unexpected error on launching modal
- :bug: Set proper error message on generic error

View File

@@ -85,4 +85,13 @@ We want to thank to the amazing people that help us! Thank you! You're the best!
* [Yaron](https://hosted.weblate.org/user/Yaron)
* [yrd](https://hosted.weblate.org/user/yrd)
* [YukiYuigishi](https://hosted.weblate.org/user/YukiYuigishi)
* [zcraber](https://hosted.weblate.org/user/zcraber)
* [zcraber](https://hosted.weblate.org/user/zcraber)
## Libraries & templates
* systxema
* plumilla
* victor crespo
* xtech
* candidexmedia
* merih güz
* klarr agency

View File

@@ -16,19 +16,18 @@
:exclusions [org.eclipse.jetty/jetty-server
org.eclipse.jetty/jetty-servlet]}
io.prometheus/simpleclient_httpserver {:mvn/version "0.16.0"}
io.lettuce/lettuce-core {:mvn/version "6.2.0.RELEASE"}
io.lettuce/lettuce-core {:mvn/version "6.2.1.RELEASE"}
java-http-clj/java-http-clj {:mvn/version "0.4.3"}
funcool/yetti
{:git/tag "v9.9"
:git/sha "f0a455d"
{:git/tag "v9.11"
:git/sha "6f9197a"
:git/url "https://github.com/funcool/yetti.git"
:exclusions [org.slf4j/slf4j-api]}
com.github.seancorfield/next.jdbc {:mvn/version "1.3.828"}
com.github.seancorfield/next.jdbc {:mvn/version "1.3.834"}
metosin/reitit-core {:mvn/version "0.5.18"}
org.postgresql/postgresql {:mvn/version "42.5.0"}
com.zaxxer/HikariCP {:mvn/version "5.0.1"}
@@ -38,6 +37,8 @@
buddy/buddy-hashers {:mvn/version "1.8.158"}
buddy/buddy-sign {:mvn/version "3.4.333"}
com.github.ben-manes.caffeine/caffeine {:mvn/version "3.1.1"}
org.jsoup/jsoup {:mvn/version "1.15.1"}
org.im4java/im4java
{:git/tag "1.4.0-penpot-2"
@@ -62,7 +63,6 @@
{:extra-deps
{com.bhauman/rebel-readline {:mvn/version "RELEASE"}
org.clojure/tools.namespace {:mvn/version "RELEASE"}
org.clojure/test.check {:mvn/version "RELEASE"}
clojure-humanize/clojure-humanize {:mvn/version "0.2.2"}
org.clojure/data.csv {:mvn/version "RELEASE"}
com.clojure-goes-fast/clj-async-profiler {:mvn/version "RELEASE"}

View File

@@ -12,9 +12,12 @@
[app.common.logging :as l]
[app.common.perf :as perf]
[app.common.pprint :as pp]
[app.common.spec :as us]
[app.common.transit :as t]
[app.common.uuid :as uuid]
[app.config :as cfg]
[app.main :as main]
[app.srepl.helpers]
[app.srepl.main :as srepl]
[app.util.blob :as blob]
[app.util.fressian :as fres]
@@ -26,10 +29,13 @@
[clojure.pprint :refer [pprint print-table]]
[clojure.repl :refer :all]
[clojure.spec.alpha :as s]
[clojure.spec.gen.alpha :as sgen]
[clojure.stacktrace :as trace]
[clojure.test :as test]
[clojure.test.check.generators :as gen]
[clojure.tools.namespace.repl :as repl]
[clojure.walk :refer [macroexpand-all]]
[criterium.core :as crit]
[cuerdas.core :as str]
[datoteka.core]
[integrant.core :as ig]))
@@ -42,24 +48,24 @@
(defmacro run-quick-bench
[& exprs]
`(with-progress-reporting (quick-bench (do ~@exprs) :verbose)))
`(crit/with-progress-reporting (crit/quick-bench (do ~@exprs) :verbose)))
(defmacro run-quick-bench'
[& exprs]
`(quick-bench (do ~@exprs)))
`(crit/quick-bench (do ~@exprs)))
(defmacro run-bench
[& exprs]
`(with-progress-reporting (bench (do ~@exprs) :verbose)))
`(crit/with-progress-reporting (crit/bench (do ~@exprs) :verbose)))
(defmacro run-bench'
[& exprs]
`(bench (do ~@exprs)))
`(crit/bench (do ~@exprs)))
;; --- Development Stuff
(defn- run-tests
([] (run-tests #"^app.*-test$"))
([] (run-tests #"^backend-tests.*-test$"))
([o]
(repl/refresh)
(cond
@@ -74,19 +80,22 @@
(defn- start
[]
(alter-var-root #'system (fn [sys]
(when sys (ig/halt! sys))
(-> (merge main/system-config main/worker-config)
(ig/prep)
(ig/init))))
:started)
(try
(alter-var-root #'system (fn [sys]
(when sys (ig/halt! sys))
(-> (merge main/system-config main/worker-config)
(ig/prep)
(ig/init))))
:started
(catch Throwable cause
(ex/print-throwable cause))))
(defn- stop
[]
(alter-var-root #'system (fn [sys]
(when sys (ig/halt! sys))
nil))
:stoped)
:stopped)
(defn restart
[]
@@ -100,12 +109,20 @@
(defn compression-bench
[data]
(let [humanize (fn [v] (hum/filesize v :binary true :format " %.4f "))]
(let [humanize (fn [v] (hum/filesize v :binary true :format " %.4f "))
v1 (time (humanize (alength (blob/encode data {:version 1}))))
v3 (time (humanize (alength (blob/encode data {:version 3}))))
v4 (time (humanize (alength (blob/encode data {:version 4}))))
v5 (time (humanize (alength (blob/encode data {:version 5}))))
v6 (time (humanize (alength (blob/encode data {:version 6}))))
]
(print-table
[{:v1 (humanize (alength (blob/encode data {:version 1})))
:v2 (humanize (alength (blob/encode data {:version 2})))
:v3 (humanize (alength (blob/encode data {:version 3})))
:v4 (humanize (alength (blob/encode data {:version 4})))
[{
:v1 v1
:v3 v3
:v4 v4
:v5 v5
:v6 v6
}])))
(defonce debug-tap
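The hunk above namespaces the criterium wrappers under the crit alias, so the REPL macros keep their old names but no longer depend on referred vars. A minimal sketch of exercising them (not part of the diff; it only assumes the dev user namespace above is loaded):

```
;; Any expression can be benchmarked from the backend REPL once the user
;; ns is loaded; run-quick-bench expands to criterium's quick-bench via
;; the crit alias introduced in this change.
(run-quick-bench (reduce + (range 100000)))
;; prints criterium's progress report and a timing summary
```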

View File

@@ -22,6 +22,10 @@
:name "Circum Icons pack"
:thumbnail-uri "https://penpot.app/images/libraries/cover-circum.jpg"
:file-uri "https://github.com/penpot/penpot-files/raw/binary-files/CircumIcons.penpot"}
{:id "coreui"
:name "CoreUI"
:thumbnail-uri "https://penpot.app/images/libraries/cover-coreui.jpg"
:file-uri "https://github.com/penpot/penpot-files/raw/main/CoreUI%20DesignSystem%20(DEMO).penpot"}
{:id "whiteboarding-kit"
:name "Whiteboarding Kit"
:thumbnail-uri "https://penpot.app/images/libraries/cover-whiteboards.jpg"

View File

@@ -77,7 +77,7 @@ Debug Main Page
<legend>Import binfile:</legend>
<desc>Import penpot file in binary
format. If <strong>overwrite</strong> is checked, all files will
be overwriten using the same ids found in the file instead of
be overwritten using the same ids found in the file instead of
generating new ones.</desc>
<form method="post" enctype="multipart/form-data" action="/dbg/file/import">
@@ -90,7 +90,7 @@ Debug Main Page
<input type="checkbox" name="overwrite" />
<br />
<small>
Instead of creating a new file with all relations remaped,
Instead of creating a new file with all relations remapped,
reuses all ids and updates/overwrites the objects that
already exist on the database.
<strong>Warning, this operation should be used with caution.</strong>
@@ -111,7 +111,7 @@ Debug Main Page
<input type="checkbox" name="ignore-index-errors" checked/>
<br />
<small>
Do not break on index lookup erros (remap operation).
Do not break on index lookup errors (remap operation).
Useful when importing a broken file that has broken
relations or missing pieces.
</small>

View File

@@ -11,7 +11,8 @@ penpot - error list
<main class="horizontal-list">
<ul>
{% for item in items %}
<li><a href="/dbg/error/{{item.id}}">{{item.created-at}}</a></li>
<li><a class="date" href="/dbg/error/{{item.id}}">{{item.created-at}}</a>
<span class="title">{{item.hint|abbreviate:150}}</span></li>
{% endfor %}
</ul>
</main>

View File

@@ -137,8 +137,6 @@ nav > div:not(:last-child) {
margin: 0px;
padding: 0px;
flex-direction: column;
flex-wrap: wrap;
height: calc(100vh - 75px);
justify-content: flex-start;
}
@@ -151,19 +149,31 @@ nav > div:not(:last-child) {
margin: 0px 20px;
cursor: pointer;
display: flex;
justify-content: center;
border-radius: 3px;
}
.horizontal-list li:hover {
background-color: #e9e9e9;
}
.horizontal-list li > *:not(:last-child) {
margin-right: 10px;
}
.horizontal-list li > a {
text-decoration: none;
color: inherit;
}
.horizontal-list li > .date {
font-weight: 200;
color: #686868;
min-width: 210px;
}
form .row {
padding: 5px 0;
}

View File

@@ -0,0 +1,9 @@
;; Example climit.edn file
;; Required: concurrency
;; Optional: queue-size, omitted means Integer/MAX_VALUE
{:update-file {:concurrency 1 :queue-size 3}
:auth {:concurrency 128}
:process-font {:concurrency 4 :queue-size 32}
:process-image {:concurrency 8 :queue-size 32}
:push-audit-events
{:concurrency 1 :queue-size 3}}
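Per the comments in this example file, only :concurrency is required and a missing :queue-size falls back to Integer/MAX_VALUE, so a stripped-down variant could look like the sketch below (the concrete limits are illustrative, not taken from the diff):

```
;; Minimal climit.edn sketch: only the required key is present, so the
;; queue size defaults to Integer/MAX_VALUE as documented above.
{:update-file {:concurrency 2}}
```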

View File

@@ -2,11 +2,13 @@
<Configuration status="info" monitorInterval="30">
<Appenders>
<Console name="console" target="SYSTEM_OUT">
<PatternLayout pattern="[%d{YYYY-MM-dd HH:mm:ss.SSS}] %level{length=1} %logger{36} - %msg%n"/>
<PatternLayout pattern="[%d{YYYY-MM-dd HH:mm:ss.SSS}] %level{length=1} %logger{36} - %msg%n"
alwaysWriteExceptions="false" />
</Console>
<RollingFile name="main" fileName="logs/main.log" filePattern="logs/main-%i.log">
<PatternLayout pattern="[%d{YYYY-MM-dd HH:mm:ss.SSS}] %level{length=1} %logger{36} - %msg%n"/>
<PatternLayout pattern="[%d{YYYY-MM-dd HH:mm:ss.SSS}] %level{length=1} %logger{36} - %msg%n"
alwaysWriteExceptions="false" />
<Policies>
<SizeBasedTriggeringPolicy size="50M"/>
</Policies>
@@ -32,6 +34,8 @@
<Logger name="app.util.websocket" level="info" />
<Logger name="app.redis" level="info" />
<Logger name="app.rpc.rlimit" level="info" />
<Logger name="app.rpc.climit" level="info" />
<Logger name="app.rpc.mutations.files" level="info" />
<Logger name="app.cli" level="debug" additivity="false">
<AppenderRef ref="console"/>

View File

@@ -2,7 +2,8 @@
<Configuration status="info" monitorInterval="60">
<Appenders>
<Console name="console" target="SYSTEM_OUT">
<PatternLayout pattern="[%d{YYYY-MM-dd HH:mm:ss.SSS}] %level{length=1} %logger{36} - %msg%n"/>
<PatternLayout pattern="[%d{YYYY-MM-dd HH:mm:ss.SSS}] %level{length=1} %logger{36} - %msg%n"
alwaysWriteExceptions="false" />
</Console>
</Appenders>

View File

@@ -1,6 +1,10 @@
;; Example rlimit.edn file
^{:refresh "30s"}
{:default
[[:default :window "200000/h"]]
#{:query/teams}
[[:burst :bucket "5/1/5s"]]
#{:query/profile}
[[:burst :bucket "100/60/1m"]]}

4
backend/scripts/kill-repl.sh Executable file
View File

@@ -0,0 +1,4 @@
#!/usr/bin/env bash
set -x
jcmd |grep "rebel" |sed -nE 's/^([0-9]+).*$/\1/p' | xargs kill -9

View File

@@ -2,7 +2,7 @@
export PENPOT_HOST=devenv
export PENPOT_TENANT=dev
export PENPOT_FLAGS="$PENPOT_FLAGS enable-backend-asserts enable-audit-log enable-transit-readable-response enable-demo-users disable-secure-session-cookies enable-rpc-rate-limit enable-warn-rpc-rate-limits enable-smtp"
export PENPOT_FLAGS="$PENPOT_FLAGS enable-backend-asserts enable-audit-log enable-transit-readable-response enable-demo-users disable-secure-session-cookies enable-smtp"
# export PENPOT_DATABASE_URI="postgresql://172.17.0.1:5432/penpot"
# export PENPOT_DATABASE_USERNAME="penpot"
@@ -16,8 +16,6 @@ export PENPOT_FLAGS="$PENPOT_FLAGS enable-backend-asserts enable-audit-log enabl
# export PENPOT_LOGGERS_LOKI_URI="http://172.17.0.1:3100/loki/api/v1/push"
# export PENPOT_AUDIT_LOG_ARCHIVE_URI="http://localhost:6070/api/audit"
export PENPOT_DEFAULT_RATE_LIMIT="default,window,10000/h"
# Initialize MINIO config
mc alias set penpot-s3/ http://minio:9000 minioadmin minioadmin
mc admin user add penpot-s3 penpot-devenv penpot-devenv
@@ -31,7 +29,7 @@ export PENPOT_STORAGE_ASSETS_S3_ENDPOINT=http://minio:9000
export PENPOT_STORAGE_ASSETS_S3_BUCKET=penpot
export OPTIONS="
-A:dev:jmx-remote \
-A:jmx-remote -A:dev \
-J-Djava.util.logging.manager=org.apache.logging.log4j.jul.LogManager \
-J-Dlog4j2.configurationFile=log4j2-devenv.xml \
-J-XX:+UseG1GC \

View File

@@ -1,19 +1,21 @@
#!/usr/bin/env bash
set +e
JAVA_CMD=$(type -p java)
set -e
if [[ ! -n "$JAVA_CMD" ]]; then
if [[ -n "$JAVA_HOME" ]] && [[ -x "$JAVA_HOME/bin/java" ]]; then
JAVA_CMD="$JAVA_HOME/bin/java"
else
>&2 echo "Couldn't find 'java'. Please set JAVA_HOME."
exit 1
fi
if [[ -n "$JAVA_HOME" ]] && [[ -x "$JAVA_HOME/bin/java" ]]; then
JAVA_CMD="$JAVA_HOME/bin/java"
else
set +e
JAVA_CMD=$(type -p java)
set -e
if [[ ! -n "$JAVA_CMD" ]]; then
>&2 echo "Couldn't find 'java'. Please set JAVA_HOME."
exit 1
fi
fi
fi
if [ -f ./environ ]; then
source ./environ
source ./environ
fi
set -x

View File

@@ -7,6 +7,7 @@
(ns app.auth.oidc
"OIDC client implementation."
(:require
[app.auth.oidc.providers :as-alias providers]
[app.common.data :as d]
[app.common.data.macros :as dm]
[app.common.exceptions :as ex]
@@ -17,7 +18,9 @@
[app.db :as db]
[app.http.client :as http]
[app.http.middleware :as hmw]
[app.http.session :as session]
[app.loggers.audit :as audit]
[app.main :as-alias main]
[app.rpc.queries.profile :as profile]
[app.tokens :as tokens]
[app.util.json :as json]
@@ -47,9 +50,11 @@
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(defn- discover-oidc-config
[{:keys [http-client]} {:keys [base-uri] :as opts}]
[cfg {:keys [::base-uri] :as opts}]
(let [discovery-uri (u/join base-uri ".well-known/openid-configuration")
response (ex/try (http/req! http-client {:method :get :uri (str discovery-uri)} {:sync? true}))]
response (ex/try! (http/req! cfg
{:method :get :uri (str discovery-uri)}
{:sync? true}))]
(cond
(ex/exception? response)
(do
@@ -59,7 +64,7 @@
nil)
(= 200 (:status response))
(let [data (json/read (:body response))]
(let [data (json/decode (:body response))]
{:token-uri (get data :token_endpoint)
:auth-uri (get data :authorization_endpoint)
:user-uri (get data :userinfo_endpoint)})
@@ -73,15 +78,15 @@
(defn- prepare-oidc-opts
[cfg]
(let [opts {:base-uri (:base-uri cfg)
:client-id (:client-id cfg)
:client-secret (:client-secret cfg)
:token-uri (:token-uri cfg)
:auth-uri (:auth-uri cfg)
:user-uri (:user-uri cfg)
:scopes (:scopes cfg #{"openid" "profile" "email"})
:roles-attr (:roles-attr cfg)
:roles (:roles cfg)
(let [opts {:base-uri (cf/get :oidc-base-uri)
:client-id (cf/get :oidc-client-id)
:client-secret (cf/get :oidc-client-secret)
:token-uri (cf/get :oidc-token-uri)
:auth-uri (cf/get :oidc-auth-uri)
:user-uri (cf/get :oidc-user-uri)
:scopes (cf/get :oidc-scopes #{"openid" "profile" "email"})
:roles-attr (cf/get :oidc-roles-attr)
:roles (cf/get :oidc-roles)
:name "oidc"}
opts (d/without-nils opts)]
@@ -96,13 +101,12 @@
(some-> (discover-oidc-config cfg opts)
(merge opts {:discover? true}))))))
(defmethod ig/prep-key ::generic-provider
[_ cfg]
(d/without-nils cfg))
(defmethod ig/pre-init-spec ::providers/generic [_]
(s/keys :req [::http/client]))
(defmethod ig/init-key ::generic-provider
(defmethod ig/init-key ::providers/generic
[_ cfg]
(when (:enabled? cfg)
(when (contains? cf/flags :login-with-oidc)
(if-let [opts (prepare-oidc-opts cfg)]
(do
(l/info :hint "provider initialized"
@@ -110,10 +114,10 @@
:method (if (:discover? opts) "discover" "manual")
:client-id (:client-id opts)
:client-secret (obfuscate-string (:client-secret opts))
:scopes (str/join "," (:scopes opts))
:auth-uri (:auth-uri opts)
:user-uri (:user-uri opts)
:token-uri (:token-uri opts)
:scopes (str/join "," (:scopes opts))
:auth-uri (:auth-uri opts)
:user-uri (:user-uri opts)
:token-uri (:token-uri opts)
:roles-attr (:roles-attr opts)
:roles (:roles opts))
opts)
@@ -125,21 +129,17 @@
;; GOOGLE AUTH PROVIDER
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(defmethod ig/prep-key ::google-provider
[_ cfg]
(d/without-nils cfg))
(defmethod ig/init-key ::google-provider
[_ cfg]
(let [opts {:client-id (:client-id cfg)
:client-secret (:client-secret cfg)
(defmethod ig/init-key ::providers/google
[_ _]
(let [opts {:client-id (cf/get :google-client-id)
:client-secret (cf/get :google-client-secret)
:scopes #{"openid" "email" "profile"}
:auth-uri "https://accounts.google.com/o/oauth2/v2/auth"
:token-uri "https://oauth2.googleapis.com/token"
:user-uri "https://openidconnect.googleapis.com/v1/userinfo"
:name "google"}]
(when (:enabled? cfg)
(when (contains? cf/flags :login-with-google)
(if (and (string? (:client-id opts))
(string? (:client-secret opts)))
(do
@@ -158,29 +158,29 @@
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(defn- retrieve-github-email
[{:keys [http-client]} tdata info]
[cfg tdata info]
(or (some-> info :email p/resolved)
(-> (http/req! http-client {:uri "https://api.github.com/user/emails"
:headers {"Authorization" (dm/str (:type tdata) " " (:token tdata))}
:timeout 6000
:method :get})
(p/then (fn [{:keys [status body] :as response}]
(->> (http/req! cfg
{:uri "https://api.github.com/user/emails"
:headers {"Authorization" (dm/str (:type tdata) " " (:token tdata))}
:timeout 6000
:method :get})
(p/map (fn [{:keys [status body] :as response}]
(when-not (s/int-in-range? 200 300 status)
(ex/raise :type :internal
:code :unable-to-retrieve-github-emails
:hint "unable to retrieve github emails"
:http-status status
:http-body body))
(->> response :body json/read (filter :primary) first :email))))))
(->> response :body json/decode (filter :primary) first :email))))))
(defmethod ig/prep-key ::github-provider
[_ cfg]
(d/without-nils cfg))
(defmethod ig/pre-init-spec ::providers/github [_]
(s/keys :req [::http/client]))
(defmethod ig/init-key ::github-provider
(defmethod ig/init-key ::providers/github
[_ cfg]
(let [opts {:client-id (:client-id cfg)
:client-secret (:client-secret cfg)
(let [opts {:client-id (cf/get :github-client-id)
:client-secret (cf/get :github-client-secret)
:scopes #{"read:user" "user:email"}
:auth-uri "https://github.com/login/oauth/authorize"
:token-uri "https://github.com/login/oauth/access_token"
@@ -191,7 +191,7 @@
;; retrieve emails.
:get-email-fn (partial retrieve-github-email cfg)}]
(when (:enabled? cfg)
(when (contains? cf/flags :login-with-github)
(if (and (string? (:client-id opts))
(string? (:client-secret opts)))
(do
@@ -209,22 +209,18 @@
;; GITLAB AUTH PROVIDER
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(defmethod ig/prep-key ::gitlab-provider
[_ cfg]
(d/without-nils cfg))
(defmethod ig/init-key ::gitlab-provider
[_ cfg]
(let [base (:base-uri cfg "https://gitlab.com")
(defmethod ig/init-key ::providers/gitlab
[_ _]
(let [base (cf/get :gitlab-base-uri "https://gitlab.com")
opts {:base-uri base
:client-id (:client-id cfg)
:client-secret (:client-secret cfg)
:client-id (cf/get :gitlab-client-id)
:client-secret (cf/get :gitlab-client-secret)
:scopes #{"openid" "profile" "email"}
:auth-uri (str base "/oauth/authorize")
:token-uri (str base "/oauth/token")
:user-uri (str base "/oauth/userinfo")
:name "gitlab"}]
(when (:enabled? cfg)
(when (contains? cf/flags :login-with-gitlab)
(if (and (string? (:client-id opts))
(string? (:client-secret opts)))
(do
@@ -245,7 +241,7 @@
(defn- build-redirect-uri
[{:keys [provider] :as cfg}]
(let [public (u/uri (:public-uri cfg))]
(let [public (u/uri (cf/get :public-uri))]
(str (assoc public :path (str "/api/auth/oauth/" (:name provider) "/callback")))))
(defn- build-auth-uri
@@ -268,7 +264,7 @@
props))
(defn retrieve-access-token
[{:keys [provider http-client] :as cfg} code]
[{:keys [provider] :as cfg} code]
(let [params {:client_id (:client-id provider)
:client_secret (:client-secret provider)
:code code
@@ -279,25 +275,25 @@
"accept" "application/json"}
:uri (:token-uri provider)
:body (u/map->query-string params)}]
(p/then
(http/req! http-client req)
(fn [{:keys [status body] :as res}]
(if (= status 200)
(let [data (json/read body)]
{:token (get data :access_token)
:type (get data :token_type)})
(ex/raise :type :internal
:code :unable-to-retrieve-token
:http-status status
:http-body body))))))
(->> (http/req! cfg req)
(p/map (fn [{:keys [status body] :as res}]
(if (= status 200)
(let [data (json/decode body)]
{:token (get data :access_token)
:type (get data :token_type)})
(ex/raise :type :internal
:code :unable-to-retrieve-token
:http-status status
:http-body body)))))))
(defn- retrieve-user-info
[{:keys [provider http-client] :as cfg} tdata]
[{:keys [provider] :as cfg} tdata]
(letfn [(retrieve []
(http/req! http-client {:uri (:user-uri provider)
:headers {"Authorization" (str (:type tdata) " " (:token tdata))}
:timeout 6000
:method :get}))
(http/req! cfg
{:uri (:user-uri provider)
:headers {"Authorization" (str (:type tdata) " " (:token tdata))}
:timeout 6000
:method :get}))
(validate-response [response]
(when-not (s/int-in-range? 200 300 (:status response))
(ex/raise :type :internal
@@ -320,7 +316,7 @@
(get info attr-kw)))
(process-response [response]
(p/let [info (-> response :body json/read)
(p/let [info (-> response :body json/decode)
email (get-email info)]
{:backend (:name provider)
:email email
@@ -354,7 +350,7 @@
::props]))
(defn retrieve-info
[{:keys [sprops provider] :as cfg} {:keys [params] :as request}]
[{:keys [provider] :as cfg} {:keys [params] :as request}]
(letfn [(validate-oidc [info]
;; If the provider is OIDC, we can proceed to check
;; roles if they are defined.
@@ -393,7 +389,7 @@
(let [state (get params :state)
code (get params :code)
state (tokens/verify sprops {:token state :iss :oauth})]
state (tokens/verify (::main/props cfg) {:token state :iss :oauth})]
(-> (p/resolved code)
(p/then #(retrieve-access-token cfg %))
(p/then #(retrieve-user-info cfg %))
@@ -401,7 +397,7 @@
(p/then' (partial post-process state))))))
(defn- retrieve-profile
[{:keys [pool executor] :as cfg} info]
[{:keys [::db/pool ::wrk/executor] :as cfg} info]
(px/with-dispatch executor
(with-open [conn (db/open pool)]
(some->> (:email info)
@@ -414,23 +410,23 @@
(yrs/response :status 302 :headers {"location" (str uri)}))
(defn- generate-error-redirect
[cfg error]
(let [uri (-> (u/uri (:public-uri cfg))
[_ error]
(let [uri (-> (u/uri (cf/get :public-uri))
(assoc :path "/#/auth/login")
(assoc :query (u/map->query-string {:error "unable-to-auth" :hint (ex-message error)})))]
(redirect-response uri)))
(defn- generate-redirect
[{:keys [sprops session audit] :as cfg} request info profile]
[{:keys [::session/session] :as cfg} request info profile]
(if profile
(let [sxf ((:create session) (:id profile))
(let [sxf (session/create-fn session (:id profile))
token (or (:invitation-token info)
(tokens/generate sprops {:iss :auth
:exp (dt/in-future "15m")
:profile-id (:id profile)}))
(tokens/generate (::main/props cfg)
{:iss :auth
:exp (dt/in-future "15m")
:profile-id (:id profile)}))
params {:token token}
uri (-> (u/uri (:public-uri cfg))
uri (-> (u/uri (cf/get :public-uri))
(assoc :path "/#/auth/verify-token")
(assoc :query (u/map->query-string params)))]
@@ -438,13 +434,12 @@
(ex/raise :type :restriction
:code :profile-blocked))
(when (fn? audit)
(audit :cmd :submit
:type "command"
:name "login"
:profile-id (:id profile)
:ip-addr (audit/parse-client-ip request)
:props (audit/profile->props profile)))
(when-let [collector (::audit/collector cfg)]
(audit/submit! collector {:type "command"
:name "login"
:profile-id (:id profile)
:ip-addr (audit/parse-client-ip request)
:props (audit/profile->props profile)}))
(->> (redirect-response uri)
(sxf request)))
@@ -453,19 +448,19 @@
:iss :prepared-register
:is-active true
:exp (dt/in-future {:hours 48}))
token (tokens/generate sprops info)
token (tokens/generate (::main/props cfg) info)
params (d/without-nils
{:token token
:fullname (:fullname info)})
uri (-> (u/uri (:public-uri cfg))
uri (-> (u/uri (cf/get :public-uri))
(assoc :path "/#/auth/register/validate")
(assoc :query (u/map->query-string params)))]
(redirect-response uri))))
(defn- auth-handler
[{:keys [sprops] :as cfg} {:keys [params] :as request}]
[cfg {:keys [params] :as request}]
(let [props (audit/extract-utm-params params)
state (tokens/generate sprops
state (tokens/generate (::main/props cfg)
{:iss :oauth
:invitation-token (:invitation-token params)
:props props
@@ -491,7 +486,7 @@
{:compile
(fn [& _]
(fn [handler]
(fn [{:keys [providers] :as cfg} request]
(fn [{:keys [::providers] :as cfg} request]
(let [provider (some-> request :path-params :provider keyword)]
(if-let [provider (get providers provider)]
(handler (assoc cfg :provider provider) request)
@@ -500,44 +495,57 @@
:provider provider
:hint "provider not configured"))))))})
(s/def ::public-uri ::us/not-empty-string)
(s/def ::http-client ::http/client)
(s/def ::session map?)
(s/def ::sprops map?)
(s/def ::providers map?)
(s/def ::client-id ::cf/oidc-client-id)
(s/def ::client-secret ::cf/oidc-client-secret)
(s/def ::base-uri ::cf/oidc-base-uri)
(s/def ::token-uri ::cf/oidc-token-uri)
(s/def ::auth-uri ::cf/oidc-auth-uri)
(s/def ::user-uri ::cf/oidc-user-uri)
(s/def ::scopes ::cf/oidc-scopes)
(s/def ::roles ::cf/oidc-roles)
(s/def ::roles-attr ::cf/oidc-roles-attr)
(s/def ::email-attr ::cf/oidc-email-attr)
(s/def ::name-attr ::cf/oidc-name-attr)
;; FIXME: migrate to qualified-keywords
(s/def ::provider
(s/keys :req-un [::client-id
::client-secret]
:opt-un [::base-uri
::token-uri
::auth-uri
::user-uri
::scopes
::roles
::roles-attr
::email-attr
::name-attr]))
(s/def ::providers (s/map-of ::us/keyword (s/nilable ::provider)))
(defmethod ig/pre-init-spec ::routes
[_]
(s/keys :req-un [::public-uri
::session
::sprops
::http-client
::providers
::db/pool
::wrk/executor]))
(s/keys :req [::http/client
::wrk/executor
::main/props
::db/pool
::providers
::session/session]))
(defmethod ig/init-key ::routes
[_ {:keys [executor session] :as cfg}]
[_ {:keys [::wrk/executor ::session/session] :as cfg}]
(let [cfg (update cfg :provider d/without-nils)]
["" {:middleware [[(:middleware session)]
[hmw/with-dispatch executor]
[hmw/with-config cfg]
[provider-lookup]
]}
;; We maintain both URI prefixes for backward compatibility.
["/auth/oauth"
["/:provider"
{:handler auth-handler
:allowed-methods #{:post}}]
["/:provider/callback"
{:handler callback-handler
:allowed-methods #{:get}}]]
["/auth/oidc"
["/:provider"
{:handler auth-handler
:allowed-methods #{:post}}]
["/:provider/callback"
{:handler callback-handler
:allowed-methods #{:get}}]]]))
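The provider initializers in this file now pull their settings from app.config and are gated on runtime flags instead of an :enabled? option. A hedged sketch of that pattern, using only the flag and option keys visible in the hunks above:

```
;; Sketch of the new enablement pattern (keys taken from this diff):
;; settings come from app.config via cf/get and activation is decided by
;; the presence of the corresponding flag in cf/flags.
(when (contains? cf/flags :login-with-google)
  {:client-id     (cf/get :google-client-id)
   :client-secret (cf/get :google-client-secret)
   :name          "google"})
```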

View File

@@ -111,7 +111,7 @@
:id :verbosity
:default 1
:update-fn inc]
["-q" nil "Dont' print to console"
["-q" nil "Don't print to console"
:id :verbosity
:update-fn (constantly 0)]
["-h" "--help"]])

View File

@@ -27,6 +27,10 @@
clojure.lang.IRecord
clojure.lang.IDeref)
(prefer-method print-method
clojure.lang.IPersistentMap
clojure.lang.IDeref)
(prefer-method pprint/simple-dispatch
clojure.lang.IPersistentMap
clojure.lang.IDeref)
@@ -46,9 +50,11 @@
:database-username "penpot"
:database-password "penpot"
:default-blob-version 4
:default-blob-version 5
:loggers-zmq-uri "tcp://localhost:45556"
:rpc-rlimit-config (fs/path "resources/rlimit.edn")
:rpc-climit-config (fs/path "resources/climit.edn")
:file-change-snapshot-every 5
:file-change-snapshot-timeout "3h"
@@ -86,6 +92,7 @@
(s/def ::default-rpc-rlimit ::us/vector-of-strings)
(s/def ::rpc-rlimit-config ::fs/path)
(s/def ::rpc-climit-config ::fs/path)
(s/def ::media-max-file-size ::us/integer)
@@ -93,13 +100,17 @@
(s/def ::telemetry-enabled ::us/boolean)
(s/def ::audit-log-archive-uri ::us/string)
(s/def ::audit-log-http-handler-concurrency ::us/integer)
(s/def ::admins ::us/set-of-strings)
(s/def ::file-change-snapshot-every ::us/integer)
(s/def ::file-change-snapshot-timeout ::dt/duration)
(s/def ::default-executor-parallelism ::us/integer)
(s/def ::worker-executor-parallelism ::us/integer)
(s/def ::scheduled-executor-parallelism ::us/integer)
(s/def ::worker-default-parallelism ::us/integer)
(s/def ::worker-webhook-parallelism ::us/integer)
(s/def ::authenticated-cookie-domain ::us/string)
(s/def ::authenticated-cookie-name ::us/string)
@@ -144,7 +155,6 @@
(s/def ::http-server-max-multipart-body-size ::us/integer)
(s/def ::http-server-io-threads ::us/integer)
(s/def ::http-server-worker-threads ::us/integer)
(s/def ::initial-project-skey ::us/string)
(s/def ::ldap-attrs-email ::us/string)
(s/def ::ldap-attrs-fullname ::us/string)
(s/def ::ldap-attrs-username ::us/string)
@@ -168,11 +178,6 @@
(s/def ::redis-uri ::us/string)
(s/def ::registration-domain-whitelist ::us/set-of-strings)
(s/def ::semaphore-process-font ::us/integer)
(s/def ::semaphore-process-image ::us/integer)
(s/def ::semaphore-update-file ::us/integer)
(s/def ::semaphore-auth ::us/integer)
(s/def ::smtp-default-from ::us/string)
(s/def ::smtp-default-reply-to ::us/string)
(s/def ::smtp-host ::us/string)
@@ -203,6 +208,7 @@
::admins
::allow-demo-users
::audit-log-archive-uri
::audit-log-http-handler-concurrency
::auth-token-cookie-name
::auth-token-cookie-max-age
::authenticated-cookie-name
@@ -217,7 +223,9 @@
::default-rpc-rlimit
::error-report-webhook
::default-executor-parallelism
::worker-executor-parallelism
::scheduled-executor-parallelism
::worker-default-parallelism
::worker-webhook-parallelism
::file-change-snapshot-every
::file-change-snapshot-timeout
::user-feedback-destination
@@ -246,7 +254,6 @@
::http-server-max-multipart-body-size
::http-server-io-threads
::http-server-worker-threads
::initial-project-skey
::ldap-attrs-email
::ldap-attrs-fullname
::ldap-attrs-username
@@ -338,7 +345,8 @@
(when (ex/ex-info? e)
(println ";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;")
(println "Error on validating configuration:")
(println (us/pretty-explain (ex-data e)))
(println (some-> e ex-data ex/explain))
(println (ex/explain (ex-data e)))
(println ";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;"))
(throw e))))

View File

@@ -296,6 +296,7 @@
(let [row (get* ds table params opts)]
(when (and (not row) check-deleted?)
(ex/raise :type :not-found
:code :object-not-found
:table table
:hint "database object not found"))
row)))
@@ -308,6 +309,7 @@
(let [row (get* ds table params (assoc opts :check-deleted? check-not-found))]
(when (and (not row) check-not-found)
(ex/raise :type :not-found
:code :object-not-found
:table table
:hint "database object not found"))
row)))
@@ -352,10 +354,13 @@
[v]
(and (pgarray? v) (= "uuid" (.getBaseTypeName ^PgArray v))))
;; TODO rename to decode-pgarray-into
(defn decode-pgarray
([v] (some->> ^PgArray v .getArray vec))
([v in] (some->> ^PgArray v .getArray (into in)))
([v in xf] (some->> ^PgArray v .getArray (into in xf))))
([v] (decode-pgarray v []))
([v in]
(into in (some-> ^PgArray v .getArray)))
([v in xf]
(into in xf (some-> ^PgArray v .getArray))))
(defn pgarray->set
[v]
@@ -417,47 +422,53 @@
(defn decode-json-pgobject
[^PGobject o]
(let [typ (.getType o)
val (.getValue o)]
(if (or (= typ "json")
(= typ "jsonb"))
(json/read val)
val)))
(when o
(let [typ (.getType o)
val (.getValue o)]
(if (or (= typ "json")
(= typ "jsonb"))
(json/decode val)
val))))
(defn decode-transit-pgobject
[^PGobject o]
(let [typ (.getType o)
val (.getValue o)]
(if (or (= typ "json")
(= typ "jsonb"))
(t/decode-str val)
val)))
(when o
(let [typ (.getType o)
val (.getValue o)]
(if (or (= typ "json")
(= typ "jsonb"))
(t/decode-str val)
val))))
(defn inet
[ip-addr]
(doto (org.postgresql.util.PGobject.)
(.setType "inet")
(.setValue (str ip-addr))))
(when ip-addr
(doto (org.postgresql.util.PGobject.)
(.setType "inet")
(.setValue (str ip-addr)))))
(defn decode-inet
[^PGobject o]
(if (= "inet" (.getType o))
(.getValue o)
nil))
(when o
(if (= "inet" (.getType o))
(.getValue o)
nil)))
(defn tjson
"Encode as transit json."
[data]
(doto (org.postgresql.util.PGobject.)
(.setType "jsonb")
(.setValue (t/encode-str data {:type :json-verbose}))))
(when data
(doto (org.postgresql.util.PGobject.)
(.setType "jsonb")
(.setValue (t/encode-str data {:type :json-verbose})))))
(defn json
"Encode as plain json."
[data]
(doto (org.postgresql.util.PGobject.)
(.setType "jsonb")
(.setValue (json/write-str data))))
(when data
(doto (org.postgresql.util.PGobject.)
(.setType "jsonb")
(.setValue (json/encode-str data)))))
;; --- Locks
@@ -488,3 +499,18 @@
(let [n (xact-check-param n)
row (exec-one! conn ["select pg_try_advisory_xact_lock(?::bigint) as lock" n])]
(:lock row)))
(defn sql-exception?
[cause]
(instance? java.sql.SQLException cause))
(defn connection-error?
[cause]
(and (sql-exception? cause)
(contains? #{"08003" "08006" "08001" "08004"}
(.getSQLState ^java.sql.SQLException cause))))
(defn serialization-error?
[cause]
(and (sql-exception? cause)
(= "40001" (.getSQLState ^java.sql.SQLException cause))))

View File

@@ -11,6 +11,7 @@
[app.common.transit :as t]
[app.http.errors :as errors]
[app.http.middleware :as mw]
[app.http.session :as session]
[app.metrics :as mtx]
[app.worker :as wrk]
[clojure.spec.alpha :as s]
@@ -76,7 +77,7 @@
(defmethod ig/halt-key! ::server
[_ {:keys [server name port] :as cfg}]
(l/info :msg "stoping http server" :name name :port port)
(l/info :msg "stopping http server" :name name :port port)
(yt/stop! server))
(defn- not-found-handler
@@ -115,7 +116,6 @@
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(s/def ::assets map?)
(s/def ::audit-handler fn?)
(s/def ::awsns-handler fn?)
(s/def ::debug-routes (s/nilable vector?))
(s/def ::doc-routes (s/nilable vector?))
@@ -123,7 +123,7 @@
(s/def ::oauth map?)
(s/def ::oidc-routes (s/nilable vector?))
(s/def ::rpc-routes (s/nilable vector?))
(s/def ::session map?)
(s/def ::session ::session/session)
(s/def ::storage map?)
(s/def ::ws fn?)
@@ -137,7 +137,6 @@
::awsns-handler
::debug-routes
::oidc-routes
::audit-handler
::rpc-routes
::doc-routes]))
@@ -148,13 +147,14 @@
[mw/format-response]
[mw/params]
[mw/parse-request]
[session/middleware-1 session]
[mw/errors errors/handle]
[mw/restrict-methods]]}
["/metrics" {:handler (::mtx/handler metrics)
:allowed-methods #{:get}}]
["/assets" {:middleware [(:middleware session)]}
["/assets" {:middleware [[session/middleware-2 session]]}
["/by-id/:id" {:handler (:objects-handler assets)}]
["/by-file-media-id/:id" {:handler (:file-objects-handler assets)}]
["/by-file-media-id/:id/thumbnail" {:handler (:file-thumbnails-handler assets)}]]
@@ -165,14 +165,12 @@
["/sns" {:handler (:awsns-handler cfg)
:allowed-methods #{:post}}]]
["/ws/notifications" {:middleware [(:middleware session)]
["/ws/notifications" {:middleware [[session/middleware-2 session]]
:handler ws
:allowed-methods #{:get}}]
["/api" {:middleware [[mw/cors]
[(:middleware session)]]}
["/audit/events" {:handler (:audit-handler cfg)
:allowed-methods #{:post}}]
[session/middleware-2 session]]}
["/feedback" {:handler feedback
:allowed-methods #{:post}}]
(:doc-routes cfg)

View File

@@ -12,7 +12,9 @@
[app.db :as db]
[app.db.sql :as sql]
[app.http.client :as http]
[app.main :as-alias main]
[app.tokens :as tokens]
[app.worker :as-alias wrk]
[clojure.spec.alpha :as s]
[cuerdas.core :as str]
[integrant.core :as ig]
@@ -26,21 +28,21 @@
(declare parse-notification)
(declare process-report)
(s/def ::http-client ::http/client)
(s/def ::sprops map?)
(defmethod ig/pre-init-spec ::handler [_]
(s/keys :req-un [::db/pool ::http-client ::sprops]))
(s/keys :req [::http/client
::main/props
::db/pool
::wrk/executor]))
(defmethod ig/init-key ::handler
[_ {:keys [executor] :as cfg}]
[_ {:keys [::wrk/executor] :as cfg}]
(fn [request respond _]
(let [data (-> request yrq/body slurp)]
(px/run! executor #(handle-request cfg data)))
(respond (yrs/response 200))))
(defn handle-request
[{:keys [http-client] :as cfg} data]
[cfg data]
(try
(let [body (parse-json data)
mtype (get body "Type")]
@@ -49,7 +51,7 @@
(let [surl (get body "SubscribeURL")
stopic (get body "TopicArn")]
(l/info :action "subscription received" :topic stopic :url surl)
(http/req! http-client {:uri surl :method :post :timeout 10000} {:sync? true}))
(http/req! cfg {:uri surl :method :post :timeout 10000} {:sync? true}))
(= mtype "Notification")
(when-let [message (parse-json (get body "Message"))]
@@ -100,10 +102,11 @@
(get mail "headers")))
(defn- extract-identity
[{:keys [sprops]} headers]
[cfg headers]
(let [tdata (get headers "x-penpot-data")]
(when-not (str/empty? tdata)
(let [result (tokens/verify sprops {:token tdata :iss :profile-identity})]
(let [sprops (::main/props cfg)
result (tokens/verify sprops {:token tdata :iss :profile-identity})]
(:profile-id result)))))
(defn- parse-notification
@@ -136,7 +139,7 @@
(j/read-value v)))
(defn- register-bounce-for-profile
[{:keys [pool]} {:keys [type kind profile-id] :as report}]
[{:keys [::db/pool]} {:keys [type kind profile-id] :as report}]
(when (= kind "permanent")
(db/with-atomic [conn pool]
(db/insert! conn :profile-complaint-report
@@ -165,7 +168,7 @@
{:id profile-id}))))))
(defn- register-complaint-for-profile
[{:keys [pool]} {:keys [type profile-id] :as report}]
[{:keys [::db/pool]} {:keys [type profile-id] :as report}]
(db/with-atomic [conn pool]
(db/insert! conn :profile-complaint-report
{:profile-id profile-id

View File

@@ -7,34 +7,41 @@
(ns app.http.client
"Http client abstraction layer."
(:require
[app.common.spec :as us]
[app.worker :as wrk]
[clojure.spec.alpha :as s]
[integrant.core :as ig]
[java-http-clj.core :as http]))
[java-http-clj.core :as http])
(:import
java.net.http.HttpClient))
(s/def ::client fn?)
(s/def ::client #(instance? HttpClient %))
(s/def ::client-holder
(s/keys :req [::client]))
(defmethod ig/pre-init-spec :app.http/client [_]
(s/keys :req-un [::wrk/executor]))
(defmethod ig/pre-init-spec ::client [_]
(s/keys :req [::wrk/executor]))
(defmethod ig/init-key :app.http/client
[_ {:keys [executor] :as cfg}]
(let [client (http/build-client {:executor executor
:connect-timeout 30000 ;; 30s
:follow-redirects :always})]
(with-meta
(fn send
([req] (send req {}))
([req {:keys [response-type sync?] :or {response-type :string sync? false}}]
(if sync?
(http/send req {:client client :as response-type})
(http/send-async req {:client client :as response-type}))))
{::client client})))
(defmethod ig/init-key ::client
[_ {:keys [::wrk/executor] :as cfg}]
(http/build-client {:executor executor
:connect-timeout 30000 ;; 30s
:follow-redirects :always}))
(defn send!
([client req] (send! client req {}))
([client req {:keys [response-type sync?] :or {response-type :string sync? false}}]
(us/assert! ::client client)
(if sync?
(http/send req {:client client :as response-type})
(http/send-async req {:client client :as response-type}))))
(defn req!
"A convencience toplevel function for gradual migration to a new API
convention."
([client request]
(client request))
([client request options]
(client request options)))
([{:keys [::client] :as holder} request]
(us/assert! ::client-holder holder)
(send! client request {}))
([{:keys [::client] :as holder} request options]
(us/assert! ::client-holder holder)
(send! client request options)))
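
A brief usage sketch of the API above (illustrative only; `client` stands for the HttpClient produced by `ig/init-key ::client`):

;; Illustrative usage; symbols are placeholders, wrapped in `comment` on purpose.
(comment
  ;; send! takes the raw HttpClient directly
  (send! client {:uri "https://example.org" :method :get} {:sync? true})
  ;; req! takes any component map carrying the client under ::client,
  ;; which is what the ::client-holder assert above enforces
  (req! {::client client} {:uri "https://example.org" :method :get})
  (req! {::client client} {:uri "https://example.org" :method :post :body "{}"} {:sync? true}))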

View File

@@ -14,8 +14,9 @@
[app.config :as cf]
[app.db :as db]
[app.http.middleware :as mw]
[app.http.session :as session]
[app.rpc.commands.binfile :as binf]
[app.rpc.mutations.files :refer [create-file]]
[app.rpc.commands.files.create :refer [create-file]]
[app.rpc.queries.profile :as profile]
[app.util.blob :as blob]
[app.util.template :as tmpl]
@@ -243,15 +244,19 @@
(yrs/response 404 "not found")))))
(def sql:error-reports
"select id, created_at from server_error_report order by created_at desc limit 100")
"SELECT id, created_at,
content->>'~:hint' AS hint
FROM server_error_report
ORDER BY created_at DESC
LIMIT 100")
(defn error-list-handler
[{:keys [pool]} request]
(when-not (authorized? pool request)
(ex/raise :type :authentication
:code :only-admins-allowed))
(let [items (db/exec! pool [sql:error-reports])
items (map #(update % :created-at dt/format-instant :rfc1123) items)]
(let [items (->> (db/exec! pool [sql:error-reports])
(map #(update % :created-at dt/format-instant :rfc1123)))]
(yrs/response :status 200
:body (-> (io/resource "app/templates/error-list.tmpl")
(tmpl/render {:items items}))
@@ -377,17 +382,15 @@
:code :only-admins-allowed))))))})
(s/def ::session map?)
(defmethod ig/pre-init-spec ::routes [_]
(s/keys :req-un [::db/pool ::wrk/executor ::session]))
(s/keys :req-un [::db/pool ::wrk/executor ::session/session]))
(defmethod ig/init-key ::routes
[_ {:keys [session pool executor] :as cfg}]
[["/readyz" {:middleware [[mw/with-dispatch executor]
[mw/with-config cfg]]
:handler health-handler}]
["/dbg" {:middleware [[(:middleware session)]
["/dbg" {:middleware [[session/middleware-2 session]
[with-authorization pool]
[mw/with-dispatch executor]
[mw/with-config cfg]]}

View File

@@ -7,9 +7,9 @@
(ns app.http.errors
"A errors handling for the http server."
(:require
[app.common.data :as d]
[app.common.exceptions :as ex]
[app.common.logging :as l]
[app.common.spec :as us]
[app.http :as-alias http]
[clojure.spec.alpha :as s]
[cuerdas.core :as str]
@@ -26,16 +26,18 @@
(defn get-context
[request]
(merge
*context*
{:path (:path request)
:method (:method request)
:params (:params request)
:ip-addr (parse-client-ip request)
:profile-id (:profile-id request)}
(let [headers (:headers request)]
{:user-agent (get headers "user-agent")
:frontend-version (get headers "x-frontend-version" "unknown")})))
(let [claims (:session-token-claims request)]
(merge
*context*
{:path (:path request)
:method (:method request)
:params (:params request)
:ip-addr (parse-client-ip request)}
(d/without-nils
{:user-agent (yrq/get-header request "user-agent")
:frontend-version (or (yrq/get-header request "x-frontend-version")
"unknown")
:profile-id (:uid claims)}))))
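
For reference, an illustrative shape of the context map produced by the rewritten get-context (all values are made up):

;; {:path "/api/rpc/command/update-file"      ;; hypothetical request path
;;  :method :post
;;  :params {...}
;;  :ip-addr "203.0.113.7"
;;  :user-agent "Mozilla/5.0 ..."
;;  :frontend-version "unknown"               ;; header missing -> default value
;;  :profile-id #uuid "..."}                  ;; taken from (:uid claims)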
(defmulti handle-exception
(fn [err & _rest]
@@ -61,7 +63,7 @@
(let [{:keys [code] :as data} (ex-data err)]
(cond
(= code :spec-validation)
(let [explain (us/pretty-explain data)]
(let [explain (ex/explain data)]
(yrs/response :status 400
:body (-> data
(dissoc ::s/problems ::s/value)
@@ -75,11 +77,11 @@
(defmethod handle-exception :assertion
[error request]
(let [edata (ex-data error)
explain (us/pretty-explain edata)]
(l/error ::l/raw (str (ex-message error) "\n" explain)
::l/context (get-context request)
:cause error)
(let [edata (ex-data error)
explain (ex/explain edata)]
(l/error :hint (ex-message error)
:cause error
::l/context (get-context request))
(yrs/response :status 500
:body {:type :server-error
:code :assertion
@@ -91,12 +93,29 @@
[err _]
(yrs/response 404 (ex-data err)))
(defmethod handle-exception :internal
[error request]
(let [{:keys [code] :as edata} (ex-data error)]
(cond
(= :concurrency-limit-reached code)
(yrs/response 429)
:else
(do
(l/error :hint (ex-message error)
:cause error
::l/context (get-context request))
(yrs/response 500 {:type :server-error
:code :unhandled
:hint (ex-message error)
:data edata})))))
(defmethod handle-exception org.postgresql.util.PSQLException
[error request]
(let [state (.getSQLState ^java.sql.SQLException error)]
(l/error ::l/raw (ex-message error)
::l/context (get-context request)
:cause error)
(l/error :hint (ex-message error)
:cause error
::l/context (get-context request))
(cond
(= state "57014")
(yrs/response 504 {:type :server-error
@@ -121,9 +140,9 @@
;; This means that exception is not a controlled exception.
(nil? edata)
(do
(l/error ::l/raw (ex-message error)
::l/context (get-context request)
:cause error)
(l/error :hint (ex-message error)
:cause error
::l/context (get-context request))
(yrs/response 500 {:type :server-error
:code :unexpected
:hint (ex-message error)}))
@@ -139,9 +158,9 @@
:else
(do
(l/error ::l/raw (ex-message error)
::l/context (get-context request)
:cause error)
(l/error :hint (ex-message error)
:cause error
::l/context (get-context request))
(yrs/response 500 {:type :server-error
:code :unhandled
:hint (ex-message error)

View File

@@ -19,6 +19,7 @@
[yetti.request :as yrq]
[yetti.response :as yrs])
(:import
com.fasterxml.jackson.core.JsonParseException
com.fasterxml.jackson.core.io.JsonEOFException
io.undertow.server.RequestTooBigException
java.io.OutputStream))
@@ -31,6 +32,12 @@
{:name ::params
:compile (constantly ymw/wrap-params)})
(def ^:private json-mapper
(json/mapper
{:encode-key-fn str/camel
:decode-key-fn (comp keyword str/kebab)
:pretty true}))
(defn wrap-parse-request
[handler]
(letfn [(process-request [request]
@@ -45,7 +52,7 @@
(str/starts-with? header "application/json")
(with-open [is (yrq/body request)]
(let [params (json/read is)]
(let [params (json/decode is json-mapper)]
(-> request
(assoc :body-params params)
(update :params merge params))))
@@ -60,10 +67,13 @@
:code :request-body-too-large
:hint (ex-message cause)))
(instance? JsonEOFException cause)
(or (instance? JsonEOFException cause)
(instance? JsonParseException cause))
(raise (ex/error :type :validation
:code :malformed-json
:hint (ex-message cause)))
:hint (ex-message cause)
:cause cause))
:else
(raise cause)))]
@@ -113,7 +123,32 @@
(finally
(.close ^OutputStream output-stream))))))
(format-response [response request]
(json-streamable-body [data]
(reify yrs/StreamableResponseBody
(-write-body-to-stream [_ _ output-stream]
(try
(with-open [bos (buffered-output-stream output-stream buffer-size)]
(json/write! bos data json-mapper))
(catch java.io.IOException _cause
;; Do nothing, EOF means the client closed the connection abruptly
nil)
(catch Throwable cause
(l/warn :hint "unexpected error on encoding response"
:cause cause))
(finally
(.close ^OutputStream output-stream))))))
(format-response-with-json [response _]
(let [body (yrs/body response)]
(if (or (boolean? body) (coll? body))
(-> response
(update :headers assoc "content-type" "application/json")
(assoc :body (json-streamable-body body)))
response)))
(format-response-with-transit [response request]
(let [body (yrs/body response)]
(if (or (boolean? body) (coll? body))
(let [qs (yrq/query request)
@@ -126,6 +161,20 @@
(assoc :body (transit-streamable-body body opts))))
response)))
(format-response [response request]
(let [accept (yrq/get-header request "accept")]
(cond
(or (= accept "application/transit+json")
(str/includes? accept "application/transit+json"))
(format-response-with-transit response request)
(or (= accept "application/json")
(str/includes? accept "application/json"))
(format-response-with-json response request)
:else
(format-response-with-transit response request))))
(process-response [response request]
(cond-> response
(map? response) (format-response request)))]
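
In short, the format-response dispatch above negotiates the encoder from the Accept header; a sketch of the resulting behaviour, assuming the json-mapper defined earlier in this file:

;; Accept contains "application/transit+json" -> transit body (also the :else fallback)
;; Accept contains "application/json"         -> JSON body, keys camelCased by json-mapper
;; anything else                               -> transit body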

View File

@@ -5,6 +5,7 @@
;; Copyright (c) KALEIDOS INC
(ns app.http.session
(:refer-clojure :exclude [read])
(:require
[app.common.data :as d]
[app.common.logging :as l]
@@ -20,6 +21,10 @@
[promesa.exec :as px]
[yetti.request :as yrq]))
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; DEFAULTS
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; A default cookie name for storing the session.
(def default-auth-token-cookie-name "auth-token")
@@ -33,35 +38,55 @@
;; Default age for automatic session renewal
(def default-renewal-max-age (dt/duration {:hours 6}))
(defprotocol ISessionStore
(read-session [store key])
(write-session [store key data])
(update-session [store data])
(delete-session [store key]))
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; PROTOCOLS
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(defn- make-database-store
(defprotocol ISessionManager
(read [_ key])
(decode [_ key])
(write! [_ key data])
(update! [_ data])
(delete! [_ key]))
(s/def ::session #(satisfies? ISessionManager %))
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; STORAGE IMPL
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(defn- prepare-session-params
[sprops data]
(let [profile-id (:profile-id data)
user-agent (:user-agent data)
created-at (or (:created-at data) (dt/now))
token (tokens/generate sprops {:iss "authentication"
:iat created-at
:uid profile-id})]
{:user-agent user-agent
:profile-id profile-id
:created-at created-at
:updated-at created-at
:id token}))
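
An illustrative call to the helper above (placeholder values); note that the signed token doubles as the session id:

;; Illustrative only; `sprops` are the setup props used for token signing.
(comment
  (prepare-session-params sprops {:profile-id profile-id
                                  :user-agent "Mozilla/5.0 ..."})
  ;; => {:user-agent "Mozilla/5.0 ..."
  ;;     :profile-id #uuid "..."
  ;;     :created-at <instant>
  ;;     :updated-at <instant>
  ;;     :id "<token signed with {:iss \"authentication\" :uid profile-id}>"}
  )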
(defn- database-manager
[{:keys [pool sprops executor]}]
(reify ISessionStore
(read-session [_ token]
(reify ISessionManager
(read [_ token]
(px/with-dispatch executor
(db/exec-one! pool (sql/select :http-session {:id token}))))
(write-session [_ _ data]
(decode [_ token]
(px/with-dispatch executor
(let [profile-id (:profile-id data)
user-agent (:user-agent data)
created-at (or (:created-at data) (dt/now))
token (tokens/generate sprops {:iss "authentication"
:iat created-at
:uid profile-id})
params {:user-agent user-agent
:profile-id profile-id
:created-at created-at
:updated-at created-at
:id token}]
(db/insert! pool :http-session params))))
(tokens/verify sprops {:token token :iss "authentication"})))
(update-session [_ data]
(write! [_ _ data]
(px/with-dispatch executor
(let [params (prepare-session-params sprops data)]
(db/insert! pool :http-session params)
params)))
(update! [_ data]
(let [updated-at (dt/now)]
(px/with-dispatch executor
(db/update! pool :http-session
@@ -69,83 +94,154 @@
{:id (:id data)})
(assoc data :updated-at updated-at))))
(delete-session [_ token]
(delete! [_ token]
(px/with-dispatch executor
(db/delete! pool :http-session {:id token})
nil))))
(defn make-inmemory-store
[{:keys [sprops]}]
(defn inmemory-manager
[{:keys [sprops executor]}]
(let [cache (atom {})]
(reify ISessionStore
(read-session [_ token]
(reify ISessionManager
(read [_ token]
(p/do (get @cache token)))
(write-session [_ _ data]
(p/do
(let [profile-id (:profile-id data)
user-agent (:user-agent data)
created-at (or (:created-at data) (dt/now))
token (tokens/generate sprops {:iss "authentication"
:iat created-at
:uid profile-id})
params {:user-agent user-agent
:created-at created-at
:updated-at created-at
:profile-id profile-id
:id token}]
(decode [_ token]
(px/with-dispatch executor
(tokens/verify sprops {:token token :iss "authentication"})))
(write! [_ _ data]
(p/do
(let [{:keys [token] :as params} (prepare-session-params sprops data)]
(swap! cache assoc token params)
params)))
(update-session [_ data]
(let [updated-at (dt/now)]
(swap! cache update (:id data) assoc :updated-at updated-at)
(assoc data :updated-at updated-at)))
(update! [_ data]
(p/do
(let [updated-at (dt/now)]
(swap! cache update (:id data) assoc :updated-at updated-at)
(assoc data :updated-at updated-at))))
(delete-session [_ token]
(delete! [_ token]
(p/do
(swap! cache dissoc token)
nil)))))
(s/def ::sprops map?)
(defmethod ig/pre-init-spec ::store [_]
(defmethod ig/pre-init-spec ::manager [_]
(s/keys :req-un [::db/pool ::wrk/executor ::sprops]))
(defmethod ig/init-key ::store
(defmethod ig/init-key ::manager
[_ {:keys [pool] :as cfg}]
(if (db/read-only? pool)
(make-inmemory-store cfg)
(make-database-store cfg)))
(inmemory-manager cfg)
(database-manager cfg)))
(defmethod ig/halt-key! ::store
(defmethod ig/halt-key! ::manager
[_ _])
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; MANAGER IMPL
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(declare assign-auth-token-cookie)
(declare assign-authenticated-cookie)
(declare clear-auth-token-cookie)
(declare clear-authenticated-cookie)
(defn create-fn
[manager profile-id]
(fn [request response]
(let [uagent (yrq/get-header request "user-agent")
params {:profile-id profile-id
:user-agent uagent}]
(-> (write! manager nil params)
(p/then (fn [session]
(l/trace :hint "create" :profile-id profile-id)
(-> response
(assign-auth-token-cookie session)
(assign-authenticated-cookie session))))))))
(defn delete-fn
[manager]
(letfn [(delete [{:keys [profile-id] :as request}]
(let [cname (cf/get :auth-token-cookie-name default-auth-token-cookie-name)
cookie (yrq/get-cookie request cname)]
(l/trace :hint "delete" :profile-id profile-id)
(some->> (:value cookie) (delete! manager))))]
(fn [request response]
(p/do
(delete request)
(-> response
(assoc :status 204)
(assoc :body nil)
(clear-auth-token-cookie)
(clear-authenticated-cookie))))))
(def middleware-1
(letfn [(wrap-handler [manager handler request respond raise]
(try
(let [claims (some->> (cf/get :auth-token-cookie-name default-auth-token-cookie-name)
(yrq/get-cookie request)
(decode manager))
request (cond-> request
(some? claims)
(assoc :session-token-claims claims))]
(handler request respond raise))
(catch Throwable _
(handler request respond raise))))]
{:name :session-1
:compile (fn [& _]
(fn [handler manager]
(partial wrap-handler manager handler)))}))
(def middleware-2
(letfn [(wrap-handler [manager handler request respond raise]
(-> (retrieve-session manager request)
(p/finally (fn [session cause]
(cond
(some? cause)
(raise cause)
(nil? session)
(handler request respond raise)
:else
(let [request (-> request
(assoc :profile-id (:profile-id session))
(assoc :session-id (:id session)))
respond (cond-> respond
(renew-session? session)
(wrap-respond manager session))]
(handler request respond raise)))))))
(retrieve-session [manager request]
(let [cname (cf/get :auth-token-cookie-name default-auth-token-cookie-name)
cookie (yrq/get-cookie request cname)]
(some->> (:value cookie) (read manager))))
(renew-session? [{:keys [updated-at] :as session}]
(and (dt/instant? updated-at)
(let [elapsed (dt/diff updated-at (dt/now))]
(neg? (compare default-renewal-max-age elapsed)))))
;; Wrap respond with session renewal code
(wrap-respond [respond manager session]
(fn [response]
(p/let [session (update! manager session)]
(-> response
(assign-auth-token-cookie session)
(assign-authenticated-cookie session)
(respond)))))]
{:name :session-2
:compile (fn [& _]
(fn [handler manager]
(partial wrap-handler manager handler)))}))
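
A short summary of how the two middlewares above are meant to be combined; the wiring mirrors the app.http router hunk earlier in this diff:

;; middleware-1: attached globally; only verifies the auth-token cookie and
;;               assoc'es :session-token-claims, without touching storage.
;; middleware-2: attached per route group; resolves the session row, assoc'es
;;               :profile-id / :session-id and renews the cookie when stale.
(comment
  ["/api" {:middleware [[mw/cors]
                        [session/middleware-2 session]]}
   ...])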
;; --- IMPL
(defn- create-session!
[store profile-id user-agent]
(let [params {:user-agent user-agent
:profile-id profile-id}]
(write-session store nil params)))
(defn- update-session!
[store session]
(update-session store session))
(defn- delete-session!
[store {:keys [cookies] :as request}]
(let [name (cf/get :auth-token-cookie-name default-auth-token-cookie-name)]
(when-let [token (get-in cookies [name :value])]
(delete-session store token))))
(defn- retrieve-session
[store request]
(let [cookie-name (cf/get :auth-token-cookie-name default-auth-token-cookie-name)]
(when-let [cookie (yrq/get-cookie request cookie-name)]
(read-session store (:value cookie)))))
(defn assign-auth-token-cookie
(defn- assign-auth-token-cookie
[response {token :id updated-at :updated-at}]
(let [max-age (cf/get :auth-token-cookie-max-age default-cookie-max-age)
created-at (or updated-at (dt/now))
@@ -164,7 +260,7 @@
:secure secure?}]
(update response :cookies assoc name cookie)))
(defn assign-authenticated-cookie
(defn- assign-authenticated-cookie
[response {updated-at :updated-at}]
(let [max-age (cf/get :auth-token-cookie-max-age default-cookie-max-age)
created-at (or updated-at (dt/now))
@@ -185,96 +281,23 @@
(string? domain)
(update :cookies assoc name cookie))))
(defn clear-auth-token-cookie
(defn- clear-auth-token-cookie
[response]
(let [name (cf/get :auth-token-cookie-name default-auth-token-cookie-name)]
(update response :cookies assoc name {:path "/" :value "" :max-age -1})))
(let [cname (cf/get :auth-token-cookie-name default-auth-token-cookie-name)]
(update response :cookies assoc cname {:path "/" :value "" :max-age -1})))
(defn- clear-authenticated-cookie
[response]
(let [name (cf/get :authenticated-cookie-name default-authenticated-cookie-name)
(let [cname (cf/get :authenticated-cookie-name default-authenticated-cookie-name)
domain (cf/get :authenticated-cookie-domain)]
(cond-> response
(string? domain)
(update :cookies assoc name {:domain domain :path "/" :value "" :max-age -1}))))
(defn- make-middleware
[{:keys [store] :as cfg}]
(letfn [;; Check if time reached for automatic session renewal
(renew-session? [{:keys [updated-at] :as session}]
(and (dt/instant? updated-at)
(let [elapsed (dt/diff updated-at (dt/now))]
(neg? (compare default-renewal-max-age elapsed)))))
;; Wrap respond with session renewal code
(wrap-respond [respond session]
(fn [response]
(p/let [session (update-session! store session)]
(-> response
(assign-auth-token-cookie session)
(assign-authenticated-cookie session)
(respond)))))]
{:name :session
:compile (fn [& _]
(fn [handler]
(fn [request respond raise]
(try
(-> (retrieve-session store request)
(p/finally (fn [session cause]
(cond
(some? cause)
(raise cause)
(nil? session)
(handler request respond raise)
:else
(let [request (-> request
(assoc :profile-id (:profile-id session))
(assoc :session-id (:id session)))
respond (cond-> respond
(renew-session? session)
(wrap-respond session))]
(handler request respond raise))))))
(catch Throwable cause
(raise cause))))))}))
(update :cookies assoc cname {:domain domain :path "/" :value "" :max-age -1}))))
;; --- STATE INIT: SESSION
(s/def ::store #(satisfies? ISessionStore %))
(defmethod ig/pre-init-spec :app.http/session [_]
(s/keys :req-un [::store]))
(defmethod ig/prep-key :app.http/session
[_ cfg]
(d/merge {:buffer-size 128}
(d/without-nils cfg)))
(defmethod ig/init-key :app.http/session
[_ {:keys [store] :as cfg}]
(-> cfg
(assoc :middleware (make-middleware cfg))
(assoc :create (fn [profile-id]
(fn [request response]
(p/let [uagent (yrq/get-header request "user-agent")
session (create-session! store profile-id uagent)]
(-> response
(assign-auth-token-cookie session)
(assign-authenticated-cookie session))))))
(assoc :delete (fn [request response]
(p/do
(delete-session! store request)
(-> response
(assoc :status 204)
(assoc :body nil)
(clear-auth-token-cookie)
(clear-authenticated-cookie)))))))
;; --- STATE INIT: SESSION GC
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; TASK: SESSION GC
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(declare sql:delete-expired)

View File

@@ -15,19 +15,25 @@
[app.common.uuid :as uuid]
[app.config :as cf]
[app.db :as db]
[app.http.client :as http]
[app.loggers.audit.tasks :as-alias tasks]
[app.loggers.webhooks :as-alias webhooks]
[app.main :as-alias main]
[app.metrics :as mtx]
[app.tokens :as tokens]
[app.util.async :as aa]
[app.util.time :as dt]
[app.worker :as wrk]
[clojure.core.async :as a]
[clojure.spec.alpha :as s]
[cuerdas.core :as str]
[integrant.core :as ig]
[lambdaisland.uri :as u]
[promesa.core :as p]
[promesa.exec :as px]
[yetti.request :as yrq]
[yetti.response :as yrs]))
[yetti.request :as yrq]))
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; HELPERS
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(defn parse-client-ip
[request]
@@ -49,10 +55,22 @@
(assoc (->> sk str/kebab (keyword "penpot")) v))))]
(reduce-kv process-param {} params)))
(def ^:private
profile-props
[:id
:is-active
:is-muted
:auth-backend
:email
:default-team-id
:default-project-id
:fullname
:lang])
(defn profile->props
[profile]
(-> profile
(select-keys [:id :is-active :is-muted :auth-backend :email :default-team-id :default-project-id :fullname :lang])
(select-keys profile-props)
(merge (:props profile))
(d/without-nils)))
@@ -79,180 +97,119 @@
(update event :props #(into {} xform %))))
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; HTTP Handler
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(declare persist-http-events)
;; --- SPECS
(s/def ::profile-id ::us/uuid)
(s/def ::name ::us/string)
(s/def ::type ::us/string)
(s/def ::props (s/map-of ::us/keyword any?))
(s/def ::timestamp dt/instant?)
(s/def ::context (s/map-of ::us/keyword any?))
(s/def ::ip-addr ::us/string)
(s/def ::frontend-event
(s/keys :req-un [::type ::name ::props ::timestamp ::profile-id]
:opt-un [::context]))
(s/def ::webhooks/event? ::us/boolean)
(s/def ::webhooks/batch-timeout ::dt/duration)
(s/def ::webhooks/batch-key
(s/or :fn fn? :str string? :kw keyword?))
(s/def ::frontend-events (s/every ::frontend-event))
(defmethod ig/init-key ::http-handler
[_ {:keys [executor pool] :as cfg}]
(if (or (db/read-only? pool) (not (contains? cf/flags :audit-log)))
(do
(l/warn :hint "audit log http handler disabled or db is read-only")
(fn [_ respond _]
(respond (yrs/response 204))))
(letfn [(handler [{:keys [profile-id] :as request}]
(let [events (->> (:events (:params request))
(remove #(not= profile-id (:profile-id %)))
(us/conform ::frontend-events))
ip-addr (parse-client-ip request)
cfg (-> cfg
(assoc :source "frontend")
(assoc :events events)
(assoc :ip-addr ip-addr))]
(persist-http-events cfg)))
(handle-error [cause]
(let [xdata (ex-data cause)]
(if (= :spec-validation (:code xdata))
(l/error ::l/raw (str "spec validation on persist-events:\n" (us/pretty-explain xdata)))
(l/error :hint "error on persist-events" :cause cause))))]
(fn [request respond _]
;; Fire and forget, log error in case of error
(-> (px/submit! executor #(handler request))
(p/catch handle-error))
(respond (yrs/response 204))))))
(defn- persist-http-events
[{:keys [pool events ip-addr source] :as cfg}]
(let [columns [:id :name :source :type :tracked-at :profile-id :ip-addr :props :context]
prepare-xf (map (fn [event]
[(uuid/next)
(:name event)
source
(:type event)
(:timestamp event)
(:profile-id event)
(db/inet ip-addr)
(db/tjson (:props event))
(db/tjson (d/without-nils (:context event)))]))]
(when (seq events)
(->> (into [] prepare-xf events)
(db/insert-multi! pool :audit-log columns)))))
(s/def ::event
(s/keys :req-un [::type ::name ::profile-id]
:opt-un [::ip-addr ::props]
:opt [::webhooks/event?
::webhooks/batch-timeout
::webhooks/batch-key]))
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; Collector
;; COLLECTOR
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; Defines a service that collects the audit/activity log into the
;; internal database. Later this audit log can be transferred to
;; external storage and the data cleared.
(declare persist-events)
(s/def ::collector
(s/keys :req [::wrk/executor ::db/pool]))
(defmethod ig/pre-init-spec ::collector [_]
(s/keys :req-un [::db/pool ::wrk/executor]))
(s/def ::ip-addr string?)
(s/def ::backend-event
(s/keys :req-un [::type ::name ::profile-id]
:opt-un [::ip-addr ::props]))
(def ^:private backend-event-xform
(comp
(filter #(us/valid? ::backend-event %))
(map clean-props)))
(s/keys :req [::db/pool ::wrk/executor ::mtx/metrics]))
(defmethod ig/init-key ::collector
[_ {:keys [pool] :as cfg}]
[_ {:keys [::db/pool] :as cfg}]
(cond
(not (contains? cf/flags :audit-log))
(do
(l/info :hint "audit log collection disabled")
(constantly nil))
(db/read-only? pool)
(do
(l/warn :hint "audit log collection disabled, db is read-only")
(constantly nil))
(l/warn :hint "audit: disabled (db is read-only)")
:else
(let [input (a/chan 512 backend-event-xform)
buffer (aa/batch input {:max-batch-size 100
:max-batch-age (* 10 1000) ; 10s
:init []})]
(l/info :hint "audit log collector initialized")
(a/go-loop []
(when-let [[_type events] (a/<! buffer)]
(let [res (a/<! (persist-events cfg events))]
(when (ex/exception? res)
(l/error :hint "error on persisting events" :cause res))
(recur))))
cfg))
(fn [& {:keys [cmd] :as params}]
(case cmd
:stop
(a/close! input)
(defn- persist-event!
[pool event]
(us/verify! ::event event)
(let [params {:id (uuid/next)
:name (:name event)
:type (:type event)
:profile-id (:profile-id event)
:tracked-at (dt/now)
:ip-addr (:ip-addr event)
:props (:props event)}]
:submit
(let [params (-> params
(dissoc :cmd)
(assoc :tracked-at (dt/now)))]
(when-not (a/offer! input params)
(l/warn :hint "activity channel is full"))))))))
(when (contains? cf/flags :audit-log)
(db/insert! pool :audit-log
(-> params
(update :props db/tjson)
(update :ip-addr db/inet)
(assoc :source "backend"))))
(defn- persist-events
[{:keys [pool executor] :as cfg} events]
(letfn [(event->row [event]
[(uuid/next)
(:name event)
(:type event)
(:profile-id event)
(:tracked-at event)
(some-> (:ip-addr event) db/inet)
(db/tjson (:props event))
"backend"])]
(aa/with-thread executor
(when (seq events)
(db/with-atomic [conn pool]
(db/insert-multi! conn :audit-log
[:id :name :type :profile-id :tracked-at :ip-addr :props :source]
(sequence (keep event->row) events)))))))
(when (and (contains? cf/flags :webhooks)
(::webhooks/event? event))
(let [batch-key (::webhooks/batch-key event)
batch-timeout (::webhooks/batch-timeout event)]
(wrk/submit! ::wrk/conn pool
::wrk/task :process-webhook-event
::wrk/queue :webhooks
::wrk/max-retries 0
::wrk/delay (or batch-timeout 0)
::wrk/label (cond
(fn? batch-key) (batch-key (:props event))
(keyword? batch-key) (name batch-key)
(string? batch-key) batch-key
:else "default")
::wrk/dedupe true
::webhooks/event (-> params
(dissoc :ip-addr)
(dissoc :type)))))))
(defn submit!
"Submit audit event to the collector."
[{:keys [::wrk/executor ::db/pool] :as collector} params]
(us/assert! ::collector collector)
(->> (px/submit! executor (partial persist-event! pool (d/without-nils params)))
(p/merr (fn [cause]
(l/error :hint "audit: unexpected error processing event" :cause cause)
(p/resolved nil)))))
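
An illustrative submit! call showing the event shape validated above, including the optional webhook batching keys (the event name and props are hypothetical):

;; Illustrative only; `collector` is the ::collector component.
(comment
  (submit! collector
           {:type "command"
            :name "update-file"
            :profile-id profile-id
            :props {:file-id file-id :team-id team-id}
            ::webhooks/event? true
            ::webhooks/batch-key :file-id
            ::webhooks/batch-timeout (dt/duration {:minutes 2})}))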
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; Archive Task
;; TASK: ARCHIVE
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; This is a task responsible to send the accumulated events to an
;; This is a task responsible to send the accumulated events to
;; external service for archival.
(declare archive-events)
(s/def ::http-client fn?)
(s/def ::uri ::us/string)
(s/def ::sprops map?)
(s/def ::tasks/uri ::us/string)
(defmethod ig/pre-init-spec ::archive-task [_]
(s/keys :req-un [::db/pool ::sprops ::http-client]
:opt-un [::uri]))
(defmethod ig/pre-init-spec ::tasks/archive-task [_]
(s/keys :req [::db/pool ::main/props ::http/client]))
(defmethod ig/init-key ::archive-task
[_ {:keys [uri] :as cfg}]
(fn [props]
(defmethod ig/init-key ::tasks/archive
[_ cfg]
(fn [params]
;; NOTE: this let allows overwriting default configured values from
;; the REPL when manually invoking the task.
(let [enabled (or (contains? cf/flags :audit-log-archive)
(:enabled props false))
uri (or uri (:uri props))
cfg (assoc cfg :uri uri)]
(:enabled params false))
uri (cf/get :audit-log-archive-uri)
uri (or uri (:uri params))
cfg (assoc cfg ::uri uri)]
(when (and enabled (not uri))
(ex/raise :type :internal
@@ -264,20 +221,21 @@
(let [n (archive-events cfg)]
(if n
(do
(aa/thread-sleep 100)
(px/sleep 100)
(recur (+ total n)))
(when (pos? total)
(l/trace :hint "events chunk archived" :num total)))))))))
(l/debug :hint "events archived" :total total)))))))))
(def sql:retrieve-batch-of-audit-log
"select * from audit_log
(def ^:private sql:retrieve-batch-of-audit-log
"select *
from audit_log
where archived_at is null
order by created_at asc
limit 256
for update skip locked;")
(defn archive-events
[{:keys [pool uri sprops http-client] :as cfg}]
[{:keys [::db/pool ::uri] :as cfg}]
(letfn [(decode-row [{:keys [props ip-addr context] :as row}]
(cond-> row
(db/pgobject? props)
@@ -301,9 +259,10 @@
:context]))
(send [events]
(let [token (tokens/generate sprops {:iss "authentication"
:iat (dt/now)
:uid uuid/zero})
(let [token (tokens/generate (::main/props cfg)
{:iss "authentication"
:iat (dt/now)
:uid uuid/zero})
body (t/encode {:events events})
headers {"content-type" "application/transit+json"
"origin" (cf/get :public-uri)
@@ -313,7 +272,7 @@
:method :post
:headers headers
:body body}
resp (http-client params {:sync? true})]
resp (http/req! cfg params {:sync? true})]
(if (= (:status resp) 204)
true
(do
@@ -334,7 +293,7 @@
(map row->event))
events (into [] xform rows)]
(when-not (empty? events)
(l/debug :action "archive-events" :uri uri :events (count events))
(l/trace :hint "archive events chunk" :uri uri :events (count events))
(when (send events)
(mark-as-archived conn rows)
(count events)))))))
@@ -343,7 +302,7 @@
;; GC Task
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(def sql:clean-archived
(def ^:private sql:clean-archived
"delete from audit_log
where archived_at is not null")
@@ -354,10 +313,10 @@
(l/debug :hint "delete archived audit log entries" :deleted result)
result))
(defmethod ig/pre-init-spec ::gc-task [_]
(s/keys :req-un [::db/pool]))
(defmethod ig/pre-init-spec ::tasks/gc [_]
(s/keys :req [::db/pool]))
(defmethod ig/init-key ::gc-task
(defmethod ig/init-key ::tasks/gc
[_ cfg]
(fn [_]
(clean-archived cfg)))

View File

@@ -82,7 +82,7 @@
(a/go-loop []
(let [msg (a/<! output)]
(if (nil? msg)
(l/info :msg "stoping error reporting loop")
(l/info :msg "stopping error reporting loop")
(do
(a/<! (handle-event cfg msg))
(recur)))))

View File

@@ -8,58 +8,55 @@
"A Loki integration."
(:require
[app.common.logging :as l]
[app.common.spec :as us]
[app.config :as cfg]
[app.config :as cf]
[app.http.client :as http]
[app.loggers.zmq :as lzmq]
[app.util.json :as json]
[clojure.core.async :as a]
[clojure.spec.alpha :as s]
[integrant.core :as ig]))
[integrant.core :as ig]
[promesa.exec :as px]))
(declare ^:private handle-event)
(declare ^:private start-rcv-loop)
(s/def ::uri ::us/string)
(s/def ::receiver fn?)
(s/def ::http-client fn?)
(defmethod ig/pre-init-spec ::reporter [_]
(s/keys :req-un [ ::receiver ::http-client]
:opt-un [::uri]))
(s/keys :req [::http/client
::lzmq/receiver]))
(defmethod ig/init-key ::reporter
[_ {:keys [receiver uri] :as cfg}]
(when uri
(l/info :msg "initializing loki reporter" :uri uri)
(let [input (a/chan (a/dropping-buffer 2048))]
(receiver :sub input)
[_ cfg]
(when-let [uri (cf/get :loggers-loki-uri)]
(px/thread
{:name "penpot/loki-reporter"}
(l/info :hint "reporter started" :uri uri)
(let [input (a/chan (a/dropping-buffer 2048))
cfg (assoc cfg ::uri uri)]
(doto (Thread. #(start-rcv-loop cfg input))
(.setDaemon true)
(.setName "penpot/loki-sender")
(.start))
(try
(lzmq/sub! (::lzmq/receiver cfg) input)
(loop []
(when-let [msg (a/<!! input)]
(handle-event cfg msg)
(recur)))
input)))
(catch InterruptedException _
(l/debug :hint "reporter interrupted"))
(catch Throwable cause
(l/error :hint "unexpected exception"
:cause cause))
(finally
(a/close! input)
(l/info :hint "reporter terminated")))))))
(defmethod ig/halt-key! ::reporter
[_ output]
(when output
(a/close! output)))
(defn- start-rcv-loop
[cfg input]
(loop []
(let [msg (a/<!! input)]
(when-not (nil? msg)
(handle-event cfg msg)
(recur))))
(l/info :msg "stoping error reporting loop"))
[_ thread]
(some-> thread px/interrupt!))
(defn- prepare-payload
[event]
(let [labels {:host (cfg/get :host)
:tenant (cfg/get :tenant)
:version (:full cfg/version)
(let [labels {:host (cf/get :host)
:tenant (cf/get :tenant)
:version (:full cf/version)
:logger (:logger/name event)
:level (:logger/level event)}]
{:streams
@@ -69,15 +66,15 @@
(when-let [error (:trace event)]
(str "\n" error)))]]}]}))
(defn- make-request
[{:keys [http-client uri] :as cfg} payload]
(http-client {:uri uri
:timeout 3000
:method :post
:headers {"content-type" "application/json"}
:body (json/write payload)}
{:sync? true}))
[{:keys [::uri] :as cfg} payload]
(http/req! cfg
{:uri uri
:timeout 3000
:method :post
:headers {"content-type" "application/json"}
:body (json/encode payload)}
{:sync? true}))
(defn- handle-event
[cfg event]

View File

@@ -9,67 +9,69 @@
(:require
[app.common.logging :as l]
[app.config :as cf]
[app.http.client :as http]
[app.loggers.database :as ldb]
[app.loggers.zmq :as lzmq]
[app.util.json :as json]
[clojure.core.async :as a]
[clojure.spec.alpha :as s]
[integrant.core :as ig]
[promesa.core :as p]))
[promesa.exec :as px]))
(defonce enabled (atom true))
(defn- send-mattermost-notification!
[{:keys [http-client] :as cfg} {:keys [host id public-uri] :as event}]
(let [uri (:uri cfg)
text (str "Exception on (host: " host ", url: " public-uri "/dbg/error/" id ")\n"
(when-let [pid (:profile-id event)]
(str "- profile-id: #uuid-" pid "\n")))]
(p/then
(http-client {:uri uri
:method :post
:headers {"content-type" "application/json"}
:body (json/write-str {:text text})})
(fn [{:keys [status] :as rsp}]
(when (not= status 200)
(l/warn :hint "error on sending data to mattermost"
:response (pr-str rsp)))))))
[cfg {:keys [host id public-uri] :as event}]
(let [text (str "Exception on (host: " host ", url: " public-uri "/dbg/error/" id ")\n"
(when-let [pid (:profile-id event)]
(str "- profile-id: #uuid-" pid "\n")))
resp (http/req! cfg
{:uri (cf/get :error-report-webhook)
:method :post
:headers {"content-type" "application/json"}
:body (json/encode-str {:text text})}
{:sync? true})]
(when (not= 200 (:status resp))
(l/warn :hint "error on sending data"
:response (pr-str resp)))))
(defn handle-event
[cfg event]
(let [ch (a/chan)]
(-> (p/let [event (ldb/parse-event event)]
(send-mattermost-notification! cfg event))
(p/finally (fn [_ cause]
(when cause
(l/warn :hint "unexpected exception on error reporter" :cause cause))
(a/close! ch))))
ch))
(s/def ::http-client fn?)
(s/def ::uri ::cf/error-report-webhook)
(try
(let [event (ldb/parse-event event)]
(when @enabled
(send-mattermost-notification! cfg event)))
(catch Throwable cause
(l/warn :hint "unhandled error"
:cause cause))))
(defmethod ig/pre-init-spec ::reporter [_]
(s/keys :req-un [::http-client ::receiver]
:opt-un [::uri]))
(s/keys :req [::http/client
::lzmq/receiver]))
(defmethod ig/init-key ::reporter
[_ {:keys [receiver uri] :as cfg}]
(when uri
(l/info :msg "initializing mattermost error reporter" :uri uri)
(let [output (a/chan (a/sliding-buffer 128)
(filter (fn [event]
(= (:logger/level event) "error"))))]
(receiver :sub output)
(a/go-loop []
(let [msg (a/<! output)]
(if (nil? msg)
(l/info :msg "stoping error reporting loop")
(do
(a/<! (handle-event cfg msg))
(recur)))))
output)))
[_ cfg]
(when-let [uri (cf/get :error-report-webhook)]
(px/thread
{:name "penpot/mattermost-reporter"}
(l/info :msg "initializing error reporter" :uri uri)
(let [input (a/chan (a/sliding-buffer 128)
(filter #(= (:logger/level %) "error")))]
(try
(lzmq/sub! (::lzmq/receiver cfg) input)
(loop []
(when-let [msg (a/<!! input)]
(handle-event cfg msg)
(recur)))
(catch InterruptedException _
(l/debug :hint "reporter interrupted"))
(catch Throwable cause
(l/error :hint "unexpected error" :cause cause))
(finally
(a/close! input)
(l/info :hint "reporter terminated")))))))
(defmethod ig/halt-key! ::reporter
[_ output]
(when output
(a/close! output)))
[_ thread]
(some-> thread px/interrupt!))

View File

@@ -0,0 +1,174 @@
;; This Source Code Form is subject to the terms of the Mozilla Public
;; License, v. 2.0. If a copy of the MPL was not distributed with this
;; file, You can obtain one at http://mozilla.org/MPL/2.0/.
;;
;; Copyright (c) KALEIDOS INC
(ns app.loggers.webhooks
"A mattermost integration for error reporting."
(:require
[app.common.data :as d]
[app.common.logging :as l]
[app.common.transit :as t]
[app.common.uri :as uri]
[app.config :as cf]
[app.db :as db]
[app.http.client :as http]
[app.util.json :as json]
[app.util.time :as dt]
[app.worker :as wrk]
[clojure.spec.alpha :as s]
[cuerdas.core :as str]
[integrant.core :as ig]))
;; --- PROC
(defn- lookup-webhooks-by-team
[pool team-id]
(db/exec! pool ["select * from webhook where team_id=? and is_active=true" team-id]))
(defn- lookup-webhooks-by-project
[pool project-id]
(let [sql [(str "select * from webhook as w"
" join project as p on (p.team_id = w.team_id)"
" where p.id = ? and w.is_active = true")
project-id]]
(db/exec! pool sql)))
(defn- lookup-webhooks-by-file
[pool file-id]
(let [sql [(str "select * from webhook as w"
" join project as p on (p.team_id = w.team_id)"
" join file as f on (f.project_id = p.id)"
" where f.id = ? and w.is_active = true")
file-id]]
(db/exec! pool sql)))
(defn- lookup-webhooks
[{:keys [::db/pool]} {:keys [props] :as event}]
(or (some->> (:team-id props) (lookup-webhooks-by-team pool))
(some->> (:project-id props) (lookup-webhooks-by-project pool))
(some->> (:file-id props) (lookup-webhooks-by-file pool))))
(defmethod ig/pre-init-spec ::process-event-handler [_]
(s/keys :req [::db/pool]))
(defmethod ig/init-key ::process-event-handler
[_ {:keys [::db/pool] :as cfg}]
(fn [{:keys [props] :as task}]
(let [event (::event props)]
(l/debug :hint "process webhook event"
:name (:name event))
(when-let [items (lookup-webhooks cfg event)]
;; (app.common.pprint/pprint items)
(l/trace :hint "webhooks found for event" :total (count items))
(db/with-atomic [conn pool]
(doseq [item items]
(wrk/submit! ::wrk/conn conn
::wrk/task :run-webhook
::wrk/queue :webhooks
::wrk/max-retries 3
::event event
::config item)))))))
;; --- RUN
(declare interpret-exception)
(declare interpret-response)
(def ^:private json-mapper
(json/mapper
{:encode-key-fn str/camel
:decode-key-fn (comp keyword str/kebab)
:pretty true}))
(defmethod ig/pre-init-spec ::run-webhook-handler [_]
(s/keys :req [::http/client ::db/pool]))
(defmethod ig/prep-key ::run-webhook-handler
[_ cfg]
(merge {::max-errors 3} (d/without-nils cfg)))
(defmethod ig/init-key ::run-webhook-handler
[_ {:keys [::db/pool ::max-errors] :as cfg}]
(letfn [(update-webhook! [whook err]
(if err
(let [sql [(str "update webhook "
" set error_code=?, "
" error_count=error_count+1 "
" where id=?")
err
(:id whook)]
res (db/exec-one! pool sql {:return-keys true})]
(when (>= (:error-count res) max-errors)
(db/update! pool :webhook {:is-active false} {:id (:id whook)})))
(db/update! pool :webhook
{:updated-at (dt/now)
:error-code nil
:error-count 0}
{:id (:id whook)})))
(report-delivery! [whook req rsp err]
(db/insert! pool :webhook-delivery
{:webhook-id (:id whook)
:created-at (dt/now)
:error-code err
:req-data (db/tjson req)
:rsp-data (db/tjson rsp)}))]
(fn [{:keys [props] :as task}]
(let [event (::event props)
whook (::config props)
body (case (:mtype whook)
"application/json" (json/encode-str event json-mapper)
"application/transit+json" (t/encode-str event)
"application/x-www-form-urlencoded" (uri/map->query-string event))]
(l/debug :hint "run webhook"
:event-name (:name event)
:webhook-id (:id whook)
:webhook-uri (:uri whook)
:webhook-mtype (:mtype whook))
(let [req {:uri (:uri whook)
:headers {"content-type" (:mtype whook)
"user-agent" (str/ffmt "penpot/%" (:main cf/version))}
:timeout (dt/duration "4s")
:method :post
:body body}]
(try
(let [rsp (http/req! cfg req {:response-type :input-stream :sync? true})
err (interpret-response rsp)]
(report-delivery! whook req rsp err)
(update-webhook! whook err))
(catch Throwable cause
(let [err (interpret-exception cause)]
(report-delivery! whook req nil err)
(update-webhook! whook err)
(when (= err "unknown")
(l/error :hint "unknown error on webhook request"
:cause cause))))))))))
(defn interpret-response
[{:keys [status] :as response}]
(when-not (or (= 200 status)
(= 204 status))
(str/ffmt "unexpected-status:%" status)))
(defn interpret-exception
[cause]
(cond
(instance? javax.net.ssl.SSLHandshakeException cause)
"ssl-validation-error"
(instance? java.net.ConnectException cause)
"connection-error"
(instance? java.net.http.HttpConnectTimeoutException cause)
"timeout"
))
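
Small sanity examples of the error-code mapping above; these codes end up in the webhook's error_code column via update-webhook!, and unrecognized causes fall through to nil:

(comment
  (interpret-response {:status 204})                    ;; => nil (delivery ok)
  (interpret-response {:status 500})                    ;; => "unexpected-status:500"
  (interpret-exception (java.net.ConnectException.))    ;; => "connection-error"
  (interpret-exception (javax.net.ssl.SSLHandshakeException. "boom")))  ;; => "ssl-validation-error"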

View File

@@ -9,13 +9,15 @@
(:require
[app.common.exceptions :as ex]
[app.common.logging :as l]
[app.common.spec :as us]
[app.config :as cf]
[app.loggers.zmq.receiver :as-alias receiver]
[app.util.json :as json]
[app.util.time :as dt]
[clojure.core.async :as a]
[clojure.spec.alpha :as s]
[cuerdas.core :as str]
[integrant.core :as ig])
[integrant.core :as ig]
[promesa.exec :as px])
(:import
org.zeromq.SocketType
org.zeromq.ZMQ$Socket
@@ -24,38 +26,56 @@
(declare prepare)
(declare start-rcv-loop)
(s/def ::endpoint ::us/string)
(defmethod ig/pre-init-spec ::receiver [_]
(s/keys :opt-un [::endpoint]))
(defmethod ig/init-key ::receiver
[_ {:keys [endpoint] :as cfg}]
(l/info :msg "initializing ZMQ receiver" :bind endpoint)
(let [buffer (a/chan 1)
[_ cfg]
(let [uri (cf/get :loggers-zmq-uri)
buffer (a/chan 1)
output (a/chan 1 (comp (filter map?)
(keep prepare)))
mult (a/mult output)]
(when endpoint
(let [thread (Thread. #(start-rcv-loop {:out buffer :endpoint endpoint}))]
(.setDaemon thread false)
(.setName thread "penpot/zmq-logger-receiver")
(.start thread)))
mult (a/mult output)
thread (when uri
(px/thread
{:name "penpot/zmq-receiver"
:daemon false}
(l/info :hint "receiver started")
(try
(start-rcv-loop buffer uri)
(catch InterruptedException _
(l/debug :hint "receiver interrupted"))
(catch java.lang.IllegalStateException cause
(if (= "errno 4" (ex-message cause))
(l/debug :hint "receiver interrupted")
(l/error :hint "unhandled error" :cause cause)))
(catch Throwable cause
(l/error :hint "unhandled error" :cause cause))
(finally
(l/info :hint "receiver terminated")))))]
(a/pipe buffer output)
(with-meta
(fn [cmd ch]
(case cmd
:sub (a/tap mult ch)
:unsub (a/untap mult ch))
ch)
{::output output
::buffer buffer
::mult mult})))
(-> cfg
(assoc ::receiver/mult mult)
(assoc ::receiver/thread thread)
(assoc ::receiver/output output)
(assoc ::receiver/buffer buffer))))
(s/def ::receiver/mult some?)
(s/def ::receiver/thread #(instance? Thread %))
(s/def ::receiver/output some?)
(s/def ::receiver/buffer some?)
(s/def ::receiver
(s/keys :req [::receiver/mult
::receiver/thread
::receiver/output
::receiver/buffer]))
(defn sub!
[{:keys [::receiver/mult]} ch]
(a/tap mult ch))
(defmethod ig/halt-key! ::receiver
[_ f]
(a/close! (::buffer (meta f))))
[_ {:keys [::receiver/buffer ::receiver/thread]}]
(some-> thread px/interrupt!)
(some-> buffer a/close!))
(def ^:private json-mapper
(json/mapper
@@ -63,23 +83,23 @@
:decode-key-fn (comp keyword str/kebab)}))
(defn- start-rcv-loop
([] (start-rcv-loop nil))
([{:keys [out endpoint] :or {endpoint "tcp://localhost:5556"}}]
(let [out (or out (a/chan 1))
zctx (ZContext. 1)
socket (.. zctx (createSocket SocketType/SUB))]
(.. socket (connect ^String endpoint))
(.. socket (subscribe ""))
(.. socket (setReceiveTimeOut 5000))
(loop []
(let [msg (.recv ^ZMQ$Socket socket)
msg (ex/ignoring (json/read msg json-mapper))
msg (if (nil? msg) :empty msg)]
(if (a/>!! out msg)
(recur)
(do
(.close ^java.lang.AutoCloseable socket)
(.destroy ^ZContext zctx))))))))
[output endpoint]
(let [zctx (ZContext. 1)
socket (.. zctx (createSocket SocketType/SUB))]
(try
(.. socket (connect ^String endpoint))
(.. socket (subscribe ""))
(.. socket (setReceiveTimeOut 5000))
(loop []
(let [msg (.recv ^ZMQ$Socket socket)
msg (ex/ignoring (json/decode msg json-mapper))
msg (if (nil? msg) :empty msg)]
(when (a/>!! output msg)
(recur))))
(finally
(.close ^java.lang.AutoCloseable socket)
(.destroy ^ZContext zctx)))))
(s/def ::logger-name string?)
(s/def ::level string?)

View File

@@ -6,103 +6,227 @@
(ns app.main
(:require
[app.auth.oidc]
[app.auth.oidc :as-alias oidc]
[app.auth.oidc.providers :as-alias oidc.providers]
[app.common.logging :as l]
[app.config :as cf]
[app.db :as-alias db]
[app.http.client :as-alias http.client]
[app.http.session :as-alias http.session]
[app.loggers.audit :as-alias audit]
[app.loggers.audit.tasks :as-alias audit.tasks]
[app.loggers.webhooks :as-alias webhooks]
[app.loggers.zmq :as-alias lzmq]
[app.metrics :as-alias mtx]
[app.metrics.definition :as-alias mdef]
[app.redis :as-alias rds]
[app.storage :as-alias sto]
[app.util.time :as dt]
[app.worker :as-alias wrk]
[cuerdas.core :as str]
[integrant.core :as ig])
(:gen-class))
(def default-metrics
{:update-file-changes
{::mdef/name "penpot_rpc_update_file_changes_total"
::mdef/help "A total number of changes submitted to update-file."
::mdef/type :counter}
:update-file-bytes-processed
{::mdef/name "penpot_rpc_update_file_bytes_processed_total"
::mdef/help "A total number of bytes processed by update-file."
::mdef/type :counter}
:rpc-mutation-timing
{::mdef/name "penpot_rpc_mutation_timing"
::mdef/help "RPC mutation method call timing."
::mdef/labels ["name"]
::mdef/type :histogram}
:rpc-command-timing
{::mdef/name "penpot_rpc_command_timing"
::mdef/help "RPC command method call timing."
::mdef/labels ["name"]
::mdef/type :histogram}
:rpc-query-timing
{::mdef/name "penpot_rpc_query_timing"
::mdef/help "RPC query method call timing."
::mdef/labels ["name"]
::mdef/type :histogram}
:websocket-active-connections
{::mdef/name "penpot_websocket_active_connections"
::mdef/help "Active websocket connections gauge"
::mdef/type :gauge}
:websocket-messages-total
{::mdef/name "penpot_websocket_message_total"
::mdef/help "Counter of processed messages."
::mdef/labels ["op"]
::mdef/type :counter}
:websocket-session-timing
{::mdef/name "penpot_websocket_session_timing"
::mdef/help "Websocket session timing (seconds)."
::mdef/type :summary}
:session-update-total
{::mdef/name "penpot_http_session_update_total"
::mdef/help "A counter of session update batch events."
::mdef/type :counter}
:tasks-timing
{::mdef/name "penpot_tasks_timing"
::mdef/help "Background tasks timing (milliseconds)."
::mdef/labels ["name"]
::mdef/type :summary}
:redis-eval-timing
{::mdef/name "penpot_redis_eval_timing"
::mdef/help "Redis EVAL commands execution timings (ms)"
::mdef/labels ["name"]
::mdef/type :summary}
:rpc-climit-queue-size
{::mdef/name "penpot_rpc_climit_queue_size"
::mdef/help "Current number of queued submissions on the CLIMIT."
::mdef/labels ["name"]
::mdef/type :gauge}
:rpc-climit-concurrency
{::mdef/name "penpot_rpc_climit_concurrency"
::mdef/help "Current number of used concurrency capacity on the CLIMIT"
::mdef/labels ["name"]
::mdef/type :gauge}
:rpc-climit-timing
{::mdef/name "penpot_rpc_climit_timing"
::mdef/help "Summary of the time between queuing and executing on the CLIMIT"
::mdef/labels ["name"]
::mdef/type :summary}
:audit-http-handler-queue-size
{::mdef/name "penpot_audit_http_handler_queue_size"
::mdef/help "Current number of queued submissions on the audit log http handler"
::mdef/labels []
::mdef/type :gauge}
:audit-http-handler-concurrency
{::mdef/name "penpot_audit_http_handler_concurrency"
::mdef/help "Current number of used concurrency capacity on the audit log http handler"
::mdef/labels []
::mdef/type :gauge}
:audit-http-handler-timing
{::mdef/name "penpot_audit_http_handler_timing"
::mdef/help "Summary of the time between queuing and executing on the audit log http handler"
::mdef/labels []
::mdef/type :summary}
:executors-active-threads
{::mdef/name "penpot_executors_active_threads"
::mdef/help "Current number of threads available in the executor service."
::mdef/labels ["name"]
::mdef/type :gauge}
:executors-completed-tasks
{::mdef/name "penpot_executors_completed_tasks_total"
::mdef/help "Approximate number of completed tasks by the executor."
::mdef/labels ["name"]
::mdef/type :counter}
:executors-running-threads
{::mdef/name "penpot_executors_running_threads"
::mdef/help "Current number of threads with state RUNNING."
::mdef/labels ["name"]
::mdef/type :gauge}
:executors-queued-submissions
{::mdef/name "penpot_executors_queued_submissions"
::mdef/help "Current number of queued submissions."
::mdef/labels ["name"]
::mdef/type :gauge}})
(def system-config
{:app.db/pool
{::db/pool
{:uri (cf/get :database-uri)
:username (cf/get :database-username)
:password (cf/get :database-password)
:read-only (cf/get :database-readonly false)
:metrics (ig/ref :app.metrics/metrics)
:metrics (ig/ref ::mtx/metrics)
:migrations (ig/ref :app.migrations/all)
:name :main
:min-size (cf/get :database-min-pool-size 0)
:max-size (cf/get :database-max-pool-size 60)}
;; Default thread pool for IO operations
[::default :app.worker/executor]
{:parallelism (cf/get :default-executor-parallelism 70)}
::wrk/executor
{::wrk/parallelism (cf/get :default-executor-parallelism 100)}
;; Dedicated thread pool for backround tasks execution.
[::worker :app.worker/executor]
{:parallelism (cf/get :worker-executor-parallelism 20)}
::wrk/scheduled-executor
{::wrk/parallelism (cf/get :scheduled-executor-parallelism 20)}
:app.worker/scheduler
{:parallelism 1
:prefix :scheduler}
:app.worker/executors
{:default (ig/ref [::default :app.worker/executor])
:worker (ig/ref [::worker :app.worker/executor])}
:app.worker/executor-monitor
{:metrics (ig/ref :app.metrics/metrics)
:executors (ig/ref :app.worker/executors)}
::wrk/monitor
{::mtx/metrics (ig/ref ::mtx/metrics)
::wrk/name "default"
::wrk/executor (ig/ref ::wrk/executor)}
:app.migrations/migrations
{}
:app.metrics/metrics
{}
::mtx/metrics
{:default default-metrics}
:app.migrations/all
{:main (ig/ref :app.migrations/migrations)}
:app.redis/redis
{:uri (cf/get :redis-uri)
:metrics (ig/ref :app.metrics/metrics)}
::rds/redis
{::rds/uri (cf/get :redis-uri)
::mtx/metrics (ig/ref ::mtx/metrics)}
:app.msgbus/msgbus
{:backend (cf/get :msgbus-backend :redis)
:executor (ig/ref [::default :app.worker/executor])
:redis (ig/ref :app.redis/redis)}
:executor (ig/ref ::wrk/executor)
:redis (ig/ref ::rds/redis)}
:app.storage.tmp/cleaner
{:executor (ig/ref [::worker :app.worker/executor])
:scheduler (ig/ref :app.worker/scheduler)}
{::wrk/executor (ig/ref ::wrk/executor)
::wrk/scheduled-executor (ig/ref ::wrk/scheduled-executor)}
:app.storage/gc-deleted-task
{:pool (ig/ref :app.db/pool)
:storage (ig/ref :app.storage/storage)
:executor (ig/ref [::worker :app.worker/executor])}
::sto/gc-deleted-task
{:pool (ig/ref ::db/pool)
:storage (ig/ref ::sto/storage)
:executor (ig/ref ::wrk/executor)}
:app.storage/gc-touched-task
{:pool (ig/ref :app.db/pool)}
::sto/gc-touched-task
{:pool (ig/ref ::db/pool)}
:app.http/client
{:executor (ig/ref [::default :app.worker/executor])}
::http.client/client
{::wrk/executor (ig/ref ::wrk/executor)}
:app.http/session
{:store (ig/ref :app.http.session/store)}
:app.http.session/store
{:pool (ig/ref :app.db/pool)
:app.http.session/manager
{:pool (ig/ref ::db/pool)
:sprops (ig/ref :app.setup/props)
:executor (ig/ref [::default :app.worker/executor])}
:executor (ig/ref ::wrk/executor)}
:app.http.session/gc-task
{:pool (ig/ref :app.db/pool)
{:pool (ig/ref ::db/pool)
:max-age (cf/get :auth-token-cookie-max-age)}
:app.http.awsns/handler
{:sprops (ig/ref :app.setup/props)
:pool (ig/ref :app.db/pool)
:http-client (ig/ref :app.http/client)
:executor (ig/ref [::worker :app.worker/executor])}
{::props (ig/ref :app.setup/props)
::db/pool (ig/ref ::db/pool)
::http.client/client (ig/ref ::http.client/client)
::wrk/executor (ig/ref ::wrk/executor)}
:app.http/server
{:port (cf/get :http-server-port)
:host (cf/get :http-server-host)
:router (ig/ref :app.http/router)
:metrics (ig/ref :app.metrics/metrics)
:executor (ig/ref [::default :app.worker/executor])
:metrics (ig/ref ::mtx/metrics)
:executor (ig/ref ::wrk/executor)
:io-threads (cf/get :http-server-io-threads)
:max-body-size (cf/get :http-server-max-body-size)
:max-multipart-body-size (cf/get :http-server-max-multipart-body-size)}
@@ -121,117 +245,98 @@
:bind-password (cf/get :ldap-bind-password)
:enabled? (contains? cf/flags :login-with-ldap)}
:app.auth.oidc/google-provider
{:enabled? (contains? cf/flags :login-with-google)
:client-id (cf/get :google-client-id)
:client-secret (cf/get :google-client-secret)}
::oidc.providers/google
{}
:app.auth.oidc/github-provider
{:enabled? (contains? cf/flags :login-with-github)
:http-client (ig/ref :app.http/client)
:client-id (cf/get :github-client-id)
:client-secret (cf/get :github-client-secret)}
::oidc.providers/github
{::http.client/client (ig/ref ::http.client/client)}
:app.auth.oidc/gitlab-provider
{:enabled? (contains? cf/flags :login-with-gitlab)
:base-uri (cf/get :gitlab-base-uri "https://gitlab.com")
:client-id (cf/get :gitlab-client-id)
:client-secret (cf/get :gitlab-client-secret)}
::oidc.providers/gitlab
{}
:app.auth.oidc/generic-provider
{:enabled? (contains? cf/flags :login-with-oidc)
:http-client (ig/ref :app.http/client)
::oidc.providers/generic
{::http.client/client (ig/ref ::http.client/client)}
:client-id (cf/get :oidc-client-id)
:client-secret (cf/get :oidc-client-secret)
::oidc/routes
{::http.client/client (ig/ref ::http.client/client)
::db/pool (ig/ref ::db/pool)
::props (ig/ref :app.setup/props)
::wrk/executor (ig/ref ::wrk/executor)
::oidc/providers {:google (ig/ref ::oidc.providers/google)
:github (ig/ref ::oidc.providers/github)
:gitlab (ig/ref ::oidc.providers/gitlab)
:oidc (ig/ref ::oidc.providers/generic)}
::audit/collector (ig/ref ::audit/collector)
::http.session/session (ig/ref :app.http.session/manager)}
:base-uri (cf/get :oidc-base-uri)
:token-uri (cf/get :oidc-token-uri)
:auth-uri (cf/get :oidc-auth-uri)
:user-uri (cf/get :oidc-user-uri)
:scopes (cf/get :oidc-scopes)
:roles-attr (cf/get :oidc-roles-attr)
:roles (cf/get :oidc-roles)}
:app.auth.oidc/routes
{:providers {:google (ig/ref :app.auth.oidc/google-provider)
:github (ig/ref :app.auth.oidc/github-provider)
:gitlab (ig/ref :app.auth.oidc/gitlab-provider)
:oidc (ig/ref :app.auth.oidc/generic-provider)}
:sprops (ig/ref :app.setup/props)
:http-client (ig/ref :app.http/client)
:pool (ig/ref :app.db/pool)
:session (ig/ref :app.http/session)
:public-uri (cf/get :public-uri)
:executor (ig/ref [::default :app.worker/executor])}
;; TODO: revisit the dependencies of this service, looks they are too much unused of them
:app.http/router
{:assets (ig/ref :app.http.assets/handlers)
:feedback (ig/ref :app.http.feedback/handler)
:session (ig/ref :app.http/session)
:session (ig/ref :app.http.session/manager)
:awsns-handler (ig/ref :app.http.awsns/handler)
:debug-routes (ig/ref :app.http.debug/routes)
:oidc-routes (ig/ref :app.auth.oidc/routes)
:oidc-routes (ig/ref ::oidc/routes)
:ws (ig/ref :app.http.websocket/handler)
:metrics (ig/ref :app.metrics/metrics)
:metrics (ig/ref ::mtx/metrics)
:public-uri (cf/get :public-uri)
:storage (ig/ref :app.storage/storage)
:audit-handler (ig/ref :app.loggers.audit/http-handler)
:storage (ig/ref ::sto/storage)
:rpc-routes (ig/ref :app.rpc/routes)
:doc-routes (ig/ref :app.rpc.doc/routes)
:executor (ig/ref [::default :app.worker/executor])}
:executor (ig/ref ::wrk/executor)}
:app.http.debug/routes
{:pool (ig/ref :app.db/pool)
:executor (ig/ref [::worker :app.worker/executor])
:storage (ig/ref :app.storage/storage)
:session (ig/ref :app.http/session)}
{:pool (ig/ref ::db/pool)
:executor (ig/ref ::wrk/executor)
:storage (ig/ref ::sto/storage)
:session (ig/ref :app.http.session/manager)}
:app.http.websocket/handler
{:pool (ig/ref :app.db/pool)
:metrics (ig/ref :app.metrics/metrics)
{:pool (ig/ref ::db/pool)
:metrics (ig/ref ::mtx/metrics)
:msgbus (ig/ref :app.msgbus/msgbus)}
:app.http.assets/handlers
{:metrics (ig/ref :app.metrics/metrics)
{:metrics (ig/ref ::mtx/metrics)
:assets-path (cf/get :assets-path)
:storage (ig/ref :app.storage/storage)
:executor (ig/ref [::default :app.worker/executor])
:storage (ig/ref ::sto/storage)
:executor (ig/ref ::wrk/executor)
:cache-max-age (dt/duration {:hours 24})
:signature-max-age (dt/duration {:hours 24 :minutes 5})}
:app.http.feedback/handler
{:pool (ig/ref :app.db/pool)
:executor (ig/ref [::default :app.worker/executor])}
{:pool (ig/ref ::db/pool)
:executor (ig/ref ::wrk/executor)}
:app.rpc/semaphores
{:metrics (ig/ref :app.metrics/metrics)
:executor (ig/ref [::default :app.worker/executor])}
:app.rpc/climit
{:metrics (ig/ref ::mtx/metrics)
:executor (ig/ref ::wrk/executor)}
:app.rpc/rlimit
{:executor (ig/ref [::worker :app.worker/executor])
:scheduler (ig/ref :app.worker/scheduler)}
{:executor (ig/ref ::wrk/executor)
:scheduled-executor (ig/ref ::wrk/scheduled-executor)}
:app.rpc/methods
{:pool (ig/ref :app.db/pool)
:session (ig/ref :app.http/session)
:sprops (ig/ref :app.setup/props)
:metrics (ig/ref :app.metrics/metrics)
:storage (ig/ref :app.storage/storage)
:msgbus (ig/ref :app.msgbus/msgbus)
:public-uri (cf/get :public-uri)
:redis (ig/ref :app.redis/redis)
:audit (ig/ref :app.loggers.audit/collector)
:ldap (ig/ref :app.auth.ldap/provider)
:http-client (ig/ref :app.http/client)
:rlimit (ig/ref :app.rpc/rlimit)
:executors (ig/ref :app.worker/executors)
:executor (ig/ref [::default :app.worker/executor])
:templates (ig/ref :app.setup/builtin-templates)
:semaphores (ig/ref :app.rpc/semaphores)
{::audit/collector (ig/ref ::audit/collector)
::http.client/client (ig/ref ::http.client/client)
::db/pool (ig/ref ::db/pool)
::wrk/executor (ig/ref ::wrk/executor)
:pool (ig/ref ::db/pool)
:session (ig/ref :app.http.session/manager)
:sprops (ig/ref :app.setup/props)
:metrics (ig/ref ::mtx/metrics)
:storage (ig/ref ::sto/storage)
:msgbus (ig/ref :app.msgbus/msgbus)
:public-uri (cf/get :public-uri)
:redis (ig/ref ::rds/redis)
:ldap (ig/ref :app.auth.ldap/provider)
:http-client (ig/ref ::http.client/client)
:climit (ig/ref :app.rpc/climit)
:rlimit (ig/ref :app.rpc/rlimit)
:executor (ig/ref ::wrk/executor)
:templates (ig/ref :app.setup/builtin-templates)
}
:app.rpc.doc/routes
@@ -240,20 +345,25 @@
:app.rpc/routes
{:methods (ig/ref :app.rpc/methods)}
:app.worker/registry
{:metrics (ig/ref :app.metrics/metrics)
::wrk/registry
{:metrics (ig/ref ::mtx/metrics)
:tasks
{:sendmail (ig/ref :app.emails/handler)
:objects-gc (ig/ref :app.tasks.objects-gc/handler)
:file-gc (ig/ref :app.tasks.file-gc/handler)
:file-xlog-gc (ig/ref :app.tasks.file-xlog-gc/handler)
:storage-gc-deleted (ig/ref :app.storage/gc-deleted-task)
:storage-gc-touched (ig/ref :app.storage/gc-touched-task)
:storage-gc-deleted (ig/ref ::sto/gc-deleted-task)
:storage-gc-touched (ig/ref ::sto/gc-touched-task)
:tasks-gc (ig/ref :app.tasks.tasks-gc/handler)
:telemetry (ig/ref :app.tasks.telemetry/handler)
:session-gc (ig/ref :app.http.session/gc-task)
:audit-log-archive (ig/ref :app.loggers.audit/archive-task)
:audit-log-gc (ig/ref :app.loggers.audit/gc-task)}}
:audit-log-archive (ig/ref ::audit.tasks/archive)
:audit-log-gc (ig/ref ::audit.tasks/gc)
:process-webhook-event
(ig/ref ::webhooks/process-event-handler)
:run-webhook
(ig/ref ::webhooks/run-webhook-handler)}}
:app.emails/sendmail
@@ -268,78 +378,78 @@
:app.emails/handler
{:sendmail (ig/ref :app.emails/sendmail)
:metrics (ig/ref :app.metrics/metrics)}
:metrics (ig/ref ::mtx/metrics)}
:app.tasks.tasks-gc/handler
{:pool (ig/ref :app.db/pool)
{:pool (ig/ref ::db/pool)
:max-age cf/deletion-delay}
:app.tasks.objects-gc/handler
{:pool (ig/ref :app.db/pool)
:storage (ig/ref :app.storage/storage)}
{:pool (ig/ref ::db/pool)
:storage (ig/ref ::sto/storage)}
:app.tasks.file-gc/handler
{:pool (ig/ref :app.db/pool)}
{:pool (ig/ref ::db/pool)}
:app.tasks.file-xlog-gc/handler
{:pool (ig/ref :app.db/pool)}
{:pool (ig/ref ::db/pool)}
:app.tasks.telemetry/handler
{:pool (ig/ref :app.db/pool)
:version (:full cf/version)
:uri (cf/get :telemetry-uri)
:sprops (ig/ref :app.setup/props)
:http-client (ig/ref :app.http/client)}
{::db/pool (ig/ref ::db/pool)
::http.client/client (ig/ref ::http.client/client)
::props (ig/ref :app.setup/props)}
:app.srepl/server
{:port (cf/get :srepl-port)
:host (cf/get :srepl-host)}
:app.setup/builtin-templates
{:http-client (ig/ref :app.http/client)}
{::http.client/client (ig/ref ::http.client/client)}
:app.setup/props
{:pool (ig/ref :app.db/pool)
{:pool (ig/ref ::db/pool)
:key (cf/get :secret-key)}
:app.loggers.zmq/receiver
{:endpoint (cf/get :loggers-zmq-uri)}
::lzmq/receiver
{}
:app.loggers.audit/http-handler
{:pool (ig/ref :app.db/pool)
:executor (ig/ref [::default :app.worker/executor])}
::audit/collector
{::db/pool (ig/ref ::db/pool)
::wrk/executor (ig/ref ::wrk/executor)
::mtx/metrics (ig/ref ::mtx/metrics)}
:app.loggers.audit/collector
{:pool (ig/ref :app.db/pool)
:executor (ig/ref [::worker :app.worker/executor])}
::audit.tasks/archive
{::props (ig/ref :app.setup/props)
::db/pool (ig/ref ::db/pool)
::http.client/client (ig/ref ::http.client/client)}
:app.loggers.audit/archive-task
{:uri (cf/get :audit-log-archive-uri)
:sprops (ig/ref :app.setup/props)
:pool (ig/ref :app.db/pool)
:http-client (ig/ref :app.http/client)}
::audit.tasks/gc
{::db/pool (ig/ref ::db/pool)}
:app.loggers.audit/gc-task
{:pool (ig/ref :app.db/pool)}
::webhooks/process-event-handler
{::db/pool (ig/ref ::db/pool)
::http.client/client (ig/ref ::http.client/client)}
::webhooks/run-webhook-handler
{::db/pool (ig/ref ::db/pool)
::http.client/client (ig/ref ::http.client/client)}
:app.loggers.loki/reporter
{:uri (cf/get :loggers-loki-uri)
:receiver (ig/ref :app.loggers.zmq/receiver)
:http-client (ig/ref :app.http/client)}
{::lzmq/receiver (ig/ref ::lzmq/receiver)
::http.client/client (ig/ref ::http.client/client)}
:app.loggers.mattermost/reporter
{:uri (cf/get :error-report-webhook)
:receiver (ig/ref :app.loggers.zmq/receiver)
:http-client (ig/ref :app.http/client)}
{::lzmq/receiver (ig/ref ::lzmq/receiver)
::http.client/client (ig/ref ::http.client/client)}
:app.loggers.database/reporter
{:receiver (ig/ref :app.loggers.zmq/receiver)
:pool (ig/ref :app.db/pool)
:executor (ig/ref [::worker :app.worker/executor])}
:pool (ig/ref ::db/pool)
:executor (ig/ref ::wrk/executor)}
:app.storage/storage
{:pool (ig/ref :app.db/pool)
:executor (ig/ref [::default :app.worker/executor])
::sto/storage
{:pool (ig/ref ::db/pool)
:executor (ig/ref ::wrk/executor)
:backends
{:assets-s3 (ig/ref [::assets :app.storage.s3/backend])
@@ -353,7 +463,7 @@
{:region (cf/get :storage-assets-s3-region)
:endpoint (cf/get :storage-assets-s3-endpoint)
:bucket (cf/get :storage-assets-s3-bucket)
:executor (ig/ref [::default :app.worker/executor])}
:executor (ig/ref ::wrk/executor)}
[::assets :app.storage.fs/backend]
{:directory (cf/get :storage-assets-fs-directory)}
@@ -361,12 +471,11 @@
(def worker-config
{:app.worker/cron
{:executor (ig/ref [::worker :app.worker/executor])
:scheduler (ig/ref :app.worker/scheduler)
:tasks (ig/ref :app.worker/registry)
:pool (ig/ref :app.db/pool)
:entries
{::wrk/cron
{::wrk/scheduled-executor (ig/ref ::wrk/scheduled-executor)
::wrk/registry (ig/ref ::wrk/registry)
::db/pool (ig/ref ::db/pool)
::wrk/entries
[{:cron #app/cron "0 0 * * * ?" ;; hourly
:task :file-xlog-gc}
@@ -399,11 +508,27 @@
{:cron #app/cron "30 */5 * * * ?" ;; every 5m
:task :audit-log-gc})]}
:app.worker/worker
{:executor (ig/ref [::worker :app.worker/executor])
:tasks (ig/ref :app.worker/registry)
:metrics (ig/ref :app.metrics/metrics)
:pool (ig/ref :app.db/pool)}})
::wrk/dispatcher
{::rds/redis (ig/ref ::rds/redis)
::mtx/metrics (ig/ref ::mtx/metrics)
::db/pool (ig/ref ::db/pool)}
[::default ::wrk/worker]
{::wrk/parallelism (cf/get ::worker-default-parallelism 1)
::wrk/queue :default
::rds/redis (ig/ref ::rds/redis)
::wrk/registry (ig/ref ::wrk/registry)
::mtx/metrics (ig/ref ::mtx/metrics)
::db/pool (ig/ref ::db/pool)}
[::webhook ::wrk/worker]
{::wrk/parallelism (cf/get ::worker-webhook-parallelism 1)
::wrk/queue :webhooks
::rds/redis (ig/ref ::rds/redis)
::wrk/registry (ig/ref ::wrk/registry)
::mtx/metrics (ig/ref ::mtx/metrics)
::db/pool (ig/ref ::db/pool)}})
(def system nil)
@@ -417,7 +542,7 @@
(merge worker-config))
(ig/prep)
(ig/init))))
(l/info :msg "welcome to penpot"
(l/info :hint "welcome to penpot"
:flags (str/join "," (map name cf/flags))
:worker? (contains? cf/flags :backend-worker)
:version (:full cf/version)))
@@ -430,4 +555,9 @@
(defn -main
[& _args]
(start))
(try
(start)
(catch Throwable cause
(l/error :hint (ex-message cause)
:cause cause)
(System/exit -1))))
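The system map above is ordinary Integrant configuration: every top-level key names a component, ig/ref declares a dependency on another key, and ig/init instantiates the graph in dependency order through ig/init-key. A minimal, self-contained sketch of that mechanism; the ::pool and ::server keys here are hypothetical and are not Penpot components:

(ns example.system
  (:require [integrant.core :as ig]))

(def config
  {::pool   {:uri "jdbc:postgresql://localhost/demo"}
   ::server {:port 8080 :pool (ig/ref ::pool)}})

(defmethod ig/init-key ::pool [_ {:keys [uri]}]
  ;; A real component would open a connection pool here.
  {:uri uri})

(defmethod ig/init-key ::server [_ {:keys [port pool]}]
  ;; Receives the already initialized ::pool value through the ref.
  {:port port :pool pool})

(ig/init config)
;; => {::pool {:uri "..."} ::server {:port 8080 :pool {:uri "..."}}}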

View File

@@ -220,7 +220,7 @@
(ttf-or-otf->woff [data]
;; NOTE: foutput is not used directly, it represents the
;; default output of the exection of the underlying
;; default output of the execution of the underlying
;; command.
(let [finput (tmp/tempfile :prefix "penpot.font." :suffix "")
foutput (fs/path (str finput ".woff"))

View File

@@ -38,110 +38,6 @@
;; METRICS SERVICE PROVIDER
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(def default-metrics
{:update-file-changes
{::mdef/name "penpot_rpc_update_file_changes_total"
::mdef/help "A total number of changes submitted to update-file."
::mdef/type :counter}
:update-file-bytes-processed
{::mdef/name "penpot_rpc_update_file_bytes_processed_total"
::mdef/help "A total number of bytes processed by update-file."
::mdef/type :counter}
:rpc-mutation-timing
{::mdef/name "penpot_rpc_mutation_timing"
::mdef/help "RPC mutation method call timming."
::mdef/labels ["name"]
::mdef/type :histogram}
:rpc-command-timing
{::mdef/name "penpot_rpc_command_timing"
::mdef/help "RPC command method call timming."
::mdef/labels ["name"]
::mdef/type :histogram}
:rpc-query-timing
{::mdef/name "penpot_rpc_query_timing"
::mdef/help "RPC query method call timing."
::mdef/labels ["name"]
::mdef/type :histogram}
:websocket-active-connections
{::mdef/name "penpot_websocket_active_connections"
::mdef/help "Active websocket connections gauge"
::mdef/type :gauge}
:websocket-messages-total
{::mdef/name "penpot_websocket_message_total"
::mdef/help "Counter of processed messages."
::mdef/labels ["op"]
::mdef/type :counter}
:websocket-session-timing
{::mdef/name "penpot_websocket_session_timing"
::mdef/help "Websocket session timing (seconds)."
::mdef/type :summary}
:session-update-total
{::mdef/name "penpot_http_session_update_total"
::mdef/help "A counter of session update batch events."
::mdef/type :counter}
:tasks-timing
{::mdef/name "penpot_tasks_timing"
::mdef/help "Background tasks timing (milliseconds)."
::mdef/labels ["name"]
::mdef/type :summary}
:redis-eval-timing
{::mdef/name "penpot_redis_eval_timing"
::mdef/help "Redis EVAL commands execution timings (ms)"
::mdef/labels ["name"]
::mdef/type :summary}
:semaphore-queued-submissions
{::mdef/name "penpot_semaphore_queued_submissions"
::mdef/help "Current number of queued submissions on SEMAPHORE."
::mdef/labels ["name"]
::mdef/type :gauge}
:semaphore-used-permits
{::mdef/name "penpot_semaphore_used_permits"
::mdef/help "Current number of used permits on SEMAPHORE."
::mdef/labels ["name"]
::mdef/type :gauge}
:semaphore-timing
{::mdef/name "penpot_semaphore_timing"
::mdef/help "Total timing of SEMAPHORE."
::mdef/labels ["name"]
::mdef/type :summary}
:executors-active-threads
{::mdef/name "penpot_executors_active_threads"
::mdef/help "Current number of threads available in the executor service."
::mdef/labels ["name"]
::mdef/type :gauge}
:executors-completed-tasks
{::mdef/name "penpot_executors_completed_tasks_total"
::mdef/help "Aproximate number of completed tasks by the executor."
::mdef/labels ["name"]
::mdef/type :counter}
:executors-running-threads
{::mdef/name "penpot_executors_running_threads"
::mdef/help "Current number of threads with state RUNNING."
::mdef/labels ["name"]
::mdef/type :gauge}
:executors-queued-submissions
{::mdef/name "penpot_executors_queued_submissions"
::mdef/help "Current number of queued submissions."
::mdef/labels ["name"]
::mdef/type :gauge}})
(s/def ::mdef/name string?)
(s/def ::mdef/help string?)
(s/def ::mdef/labels (s/every string? :kind vector?))
@@ -169,8 +65,13 @@
::handler
::definitions]))
(s/def ::default ::definitions)
(defmethod ig/pre-init-spec ::metrics [_]
(s/keys :req-un [::default]))
(defmethod ig/init-key ::metrics
[_ _]
[_ cfg]
(l/info :action "initialize metrics")
(let [registry (create-registry)
definitions (reduce-kv (fn [res k v]
@@ -178,7 +79,7 @@
(create-collector)
(assoc res k)))
{}
default-metrics)]
(:default cfg))]
(us/verify! ::definitions definitions)
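With the hardcoded default-metrics map removed, the ::metrics component now receives its definitions through configuration: the pre-init spec requires a :default entry and init-key builds the collectors from (:default cfg). A hedged sketch of what that configuration entry could look like, reusing one of the removed definitions; ::mdef stands for the same definitions alias used in this file, and where main.clj actually supplies this map is not visible in the hunks shown here:

{:app.metrics/metrics
 {:default
  {:update-file-changes
   {::mdef/name "penpot_rpc_update_file_changes_total"
    ::mdef/help "A total number of changes submitted to update-file."
    ::mdef/type :counter}}}}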

View File

@@ -247,6 +247,30 @@
{:name "0079-mod-profile-table"
:fn (mg/resource "app/migrations/sql/0079-mod-profile-table.sql")}
{:name "0080-mod-index-names"
:fn (mg/resource "app/migrations/sql/0080-mod-index-names.sql")}
{:name "0081-add-deleted-at-index-to-file-table"
:fn (mg/resource "app/migrations/sql/0081-add-deleted-at-index-to-file-table.sql")}
{:name "0082-add-features-column-to-file-table"
:fn (mg/resource "app/migrations/sql/0082-add-features-column-to-file-table.sql")}
{:name "0083-add-file-data-fragment-table"
:fn (mg/resource "app/migrations/sql/0083-add-file-data-fragment-table.sql")}
{:name "0084-add-features-column-to-file-change-table"
:fn (mg/resource "app/migrations/sql/0084-add-features-column-to-file-change-table.sql")}
{:name "0085-add-webhook-table"
:fn (mg/resource "app/migrations/sql/0085-add-webhook-table.sql")}
{:name "0086-add-webhook-delivery-table"
:fn (mg/resource "app/migrations/sql/0086-add-webhook-delivery-table.sql")}
{:name "0087-mod-task-table"
:fn (mg/resource "app/migrations/sql/0087-mod-task-table.sql")}
])

View File

@@ -0,0 +1,11 @@
ALTER INDEX team_font_variant_deleted_at_idx
RENAME TO team_font_variant__deleted_at__idx;
ALTER INDEX team_deleted_at_idx
RENAME TO team__deleted_at__idx;
ALTER INDEX profile_deleted_at_idx
RENAME TO profile__deleted_at__idx;
ALTER INDEX project_deleted_at_idx
RENAME TO project__deleted_at__idx;

View File

@@ -0,0 +1,3 @@
CREATE INDEX file__deleted_at__idx
ON file (deleted_at, id)
WHERE deleted_at IS NOT NULL;

View File

@@ -0,0 +1,2 @@
ALTER TABLE file
ADD COLUMN features text[] DEFAULT NULL;

View File

@@ -0,0 +1,15 @@
CREATE TABLE file_data_fragment (
id uuid NOT NULL,
file_id uuid NOT NULL REFERENCES file(id) ON DELETE CASCADE DEFERRABLE,
created_at timestamptz NOT NULL DEFAULT now(),
metadata jsonb NULL,
content bytea NOT NULL,
PRIMARY KEY (file_id, id)
);
ALTER TABLE file_data_fragment
ALTER COLUMN metadata SET STORAGE external,
ALTER COLUMN content SET STORAGE external;

View File

@@ -0,0 +1,8 @@
ALTER TABLE file_change
ADD COLUMN features text[] DEFAULT NULL;
ALTER TABLE file_change
ALTER COLUMN features SET STORAGE external;
ALTER TABLE file
ALTER COLUMN features SET STORAGE external;

View File

@@ -0,0 +1,25 @@
CREATE TABLE webhook (
id uuid PRIMARY KEY,
team_id uuid NOT NULL REFERENCES team(id) ON DELETE CASCADE DEFERRABLE,
created_at timestamptz NOT NULL DEFAULT now(),
updated_at timestamptz NOT NULL DEFAULT now(),
uri text NOT NULL,
mtype text NOT NULL,
error_code text NULL,
error_count smallint DEFAULT 0,
is_active boolean DEFAULT true,
secret_key text NULL
);
ALTER TABLE webhook
ALTER COLUMN uri SET STORAGE external,
ALTER COLUMN mtype SET STORAGE external,
ALTER COLUMN error_code SET STORAGE external,
ALTER COLUMN secret_key SET STORAGE external;
CREATE INDEX webhook__team_id__idx ON webhook (team_id);
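For reference, a hedged sketch of creating a row in the new webhook table from Clojure, with column names taken from the CREATE TABLE above. The db/insert! helper is assumed here as a single-row analogue of the db/insert-multi! and db/update! helpers used elsewhere in this changeset; pool and team-id stand for an existing connection pool and team id:

(db/insert! pool :webhook
            {:id (uuid/next)
             :team-id team-id
             :uri "https://example.com/penpot-hook"
             :mtype "application/json"
             :is-active true})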

View File

@@ -0,0 +1,16 @@
CREATE TABLE webhook_delivery (
webhook_id uuid NOT NULL REFERENCES webhook(id) ON DELETE CASCADE DEFERRABLE,
created_at timestamptz NOT NULL DEFAULT now(),
error_code text NULL,
req_data jsonb NULL,
rsp_data jsonb NULL,
PRIMARY KEY (webhook_id, created_at)
);
ALTER TABLE webhook_delivery
ALTER COLUMN error_code SET STORAGE external,
ALTER COLUMN req_data SET STORAGE external,
ALTER COLUMN rsp_data SET STORAGE external;

View File

@@ -0,0 +1,9 @@
ALTER TABLE task
ADD COLUMN label text NULL;
ALTER TABLE task
ALTER COLUMN label SET STORAGE external;
CREATE INDEX task__label__idx
ON task (label, name, queue)
WHERE status = 'new';
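This partial index backs the new task deduplication by label: before enqueueing, the scheduler can cheaply check whether a pending task with the same label already exists. A hedged illustration of the query shape the index serves; the actual deduplication code lives in the worker namespace, which is not part of this diff, and label, task-name and queue are placeholder bindings:

(db/exec! pool
          ["select id from task
             where label = ? and name = ? and queue = ? and status = 'new'"
           label task-name queue])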

View File

@@ -20,7 +20,8 @@
[clojure.core.async :as a]
[clojure.spec.alpha :as s]
[integrant.core :as ig]
[promesa.core :as p]))
[promesa.core :as p]
[promesa.exec :as px]))
(set! *warn-on-reflection* true)
@@ -52,8 +53,8 @@
(s/def ::rcv-ch ::aa/channel)
(s/def ::pub-ch ::aa/channel)
(s/def ::state ::us/agent)
(s/def ::pconn ::redis/connection)
(s/def ::sconn ::redis/connection)
(s/def ::pconn ::redis/connection-holder)
(s/def ::sconn ::redis/connection-holder)
(s/def ::msgbus
(s/keys :req [::cmd-ch ::rcv-ch ::pub-ch ::state ::pconn ::sconn ::wrk/executor]))
@@ -122,8 +123,8 @@
(defn- redis-disconnect
[{:keys [::pconn ::sconn] :as cfg}]
(redis/close! pconn)
(redis/close! sconn))
(d/close! pconn)
(d/close! sconn))
(defn- conj-subscription
"A low level function that is responsible to create on-demand
@@ -138,7 +139,7 @@
(defn- disj-subscription
"A low level function responsible on removing subscriptions. The
subscription is trully removed from redis once no single local
subscription is truly removed from redis once no single local
subscription is look for it. Intended to be executed in agent."
[nsubs cfg topic chan]
(let [nsubs (disj nsubs chan)]
@@ -159,7 +160,7 @@
topics))))
(defn- unsubscribe-single-channel
"Auxiliar function responsible on removing a single local
"Auxiliary function responsible on removing a single local
subscription from the state."
[state cfg chan]
(let [topics (get-in state [:chans chan])
@@ -205,31 +206,33 @@
(when-let [closed (a/<! (send-to-topic topic message))]
(send-via executor state unsubscribe-channels cfg closed nil))))
]
(px/thread
{:name "penpot/msgbus-io-loop"}
(loop []
(let [[val port] (a/alts!! [pub-ch rcv-ch])]
(cond
(nil? val)
(do
(l/trace :hint "stopping io-loop, nil received")
(send-via executor state (fn [state]
(->> (vals state)
(mapcat identity)
(filter some?)
(run! a/close!))
nil)))
(a/go-loop []
(let [[val port] (a/alts! [pub-ch rcv-ch])]
(cond
(nil? val)
(do
(l/trace :hint "stoping io-loop, nil received")
(send-via executor state (fn [state]
(->> (vals state)
(mapcat identity)
(filter some?)
(run! a/close!))
nil)))
(= port rcv-ch)
(do
(a/<!! (process-incoming val))
(recur))
(= port rcv-ch)
(do
(a/<! (process-incoming val))
(recur))
(= port pub-ch)
(let [result (a/<! (redis-pub cfg val))]
(when (ex/exception? result)
(l/error :hint "unexpected error on publishing" :message val
:cause result))
(recur)))))))
(= port pub-ch)
(let [result (a/<!! (redis-pub cfg val))]
(when (ex/exception? result)
(l/error :hint "unexpected error on publishing"
:message val
:cause result))
(recur))))))))
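Both variants of the io-loop above follow the same select-and-terminate pattern: take from whichever channel is ready, keep looping while values arrive, and shut down once a take returns nil because a channel was closed. An isolated core.async sketch of that pattern, unrelated to any Penpot API:

(require '[clojure.core.async :as a])

(let [pub-ch (a/chan 8)
      rcv-ch (a/chan 8)]
  (a/thread
    (loop []
      (let [[val port] (a/alts!! [pub-ch rcv-ch])]
        (cond
          (nil? val)      (println "io-loop stopped")
          (= port rcv-ch) (do (println "incoming:" val) (recur))
          (= port pub-ch) (do (println "outgoing:" val) (recur))))))
  (a/>!! rcv-ch :hello)
  (a/>!! pub-ch :world)
  (a/close! rcv-ch))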
(defn- redis-pub
"Publish a message to the redis server. Asynchronous operation,

View File

@@ -21,13 +21,19 @@
[promesa.core :as p])
(:import
clojure.lang.IDeref
clojure.lang.MapEntry
io.lettuce.core.KeyValue
io.lettuce.core.RedisClient
io.lettuce.core.RedisCommandInterruptedException
io.lettuce.core.RedisCommandTimeoutException
io.lettuce.core.RedisException
io.lettuce.core.RedisURI
io.lettuce.core.ScriptOutputType
io.lettuce.core.api.StatefulConnection
io.lettuce.core.api.StatefulRedisConnection
io.lettuce.core.api.async.RedisAsyncCommands
io.lettuce.core.api.async.RedisScriptingAsyncCommands
io.lettuce.core.api.sync.RedisCommands
io.lettuce.core.codec.ByteArrayCodec
io.lettuce.core.codec.RedisCodec
io.lettuce.core.codec.StringCodec
@@ -45,13 +51,12 @@
(declare initialize-resources)
(declare shutdown-resources)
(declare connect)
(declare close!)
(declare connect*)
(s/def ::timer
#(instance? Timer %))
(s/def ::connection
(s/def ::default-connection
#(or (instance? StatefulRedisConnection %)
(and (instance? IDeref %)
(instance? StatefulRedisConnection (deref %)))))
@@ -61,6 +66,13 @@
(and (instance? IDeref %)
(instance? StatefulRedisPubSubConnection (deref %)))))
(s/def ::connection
(s/or :default ::default-connection
:pubsub ::pubsub-connection))
(s/def ::connection-holder
(s/keys :req [::connection]))
(s/def ::redis-uri
#(instance? RedisURI %))
@@ -75,32 +87,37 @@
(s/def ::connect? ::us/boolean)
(s/def ::io-threads ::us/integer)
(s/def ::worker-threads ::us/integer)
(s/def ::cache #(instance? clojure.lang.Atom %))
(s/def ::redis
(s/keys :req [::resources ::redis-uri ::timer ::mtx/metrics]
:opt [::connection]))
(defmethod ig/pre-init-spec ::redis [_]
(s/keys :req-un [::uri ::mtx/metrics]
:opt-un [::timeout
::connect?
::io-threads
::worker-threads]))
(s/keys :req [::resources
::redis-uri
::timer
::mtx/metrics]
:opt [::connection
::cache]))
(defmethod ig/prep-key ::redis
[_ cfg]
(let [runtime (Runtime/getRuntime)
cpus (.availableProcessors ^Runtime runtime)]
(merge {:timeout (dt/duration 5000)
:io-threads (max 3 cpus)
:worker-threads (max 3 cpus)}
(d/without-nils cfg))))
(merge {::timeout (dt/duration "10s")
::io-threads (max 3 cpus)
::worker-threads (max 3 cpus)}
(d/without-nils cfg))))
(defmethod ig/pre-init-spec ::redis [_]
(s/keys :req [::uri ::mtx/metrics]
:opt [::timeout
::connect?
::io-threads
::worker-threads]))
(defmethod ig/init-key ::redis
[_ {:keys [connect?] :as cfg}]
(let [cfg (initialize-resources cfg)]
(cond-> cfg
connect? (assoc ::connection (connect cfg)))))
[_ {:keys [::connect?] :as cfg}]
(let [state (initialize-resources cfg)]
(cond-> state
connect? (assoc ::connection (connect* cfg {})))))
(defmethod ig/halt-key! ::redis
[_ state]
@@ -114,7 +131,7 @@
(defn- initialize-resources
"Initialize redis connection resources"
[{:keys [uri io-threads worker-threads connect? metrics] :as cfg}]
[{:keys [::uri ::io-threads ::worker-threads ::connect?] :as cfg}]
(l/info :hint "initialize redis resources"
:uri uri
:io-threads io-threads
@@ -131,34 +148,32 @@
redis-uri (RedisURI/create ^String uri)]
(-> cfg
(assoc ::mtx/metrics metrics)
(assoc ::cache (atom {}))
(assoc ::resources resources)
(assoc ::timer timer)
(assoc ::redis-uri redis-uri)
(assoc ::resources resources))))
(assoc ::cache (atom {}))
(assoc ::redis-uri redis-uri))))
(defn- shutdown-resources
[{:keys [::resources ::cache ::timer]}]
(run! close! (vals @cache))
(run! d/close! (vals @cache))
(when resources
(.shutdown ^ClientResources resources))
(when timer
(.stop ^Timer timer)))
(defn connect
[{:keys [::resources ::redis-uri] :as cfg}
& {:keys [timeout codec type] :or {codec default-codec type :default}}]
(defn connect*
[{:keys [::resources ::redis-uri] :as state}
{:keys [timeout codec type]
:or {codec default-codec type :default}}]
(us/assert! ::resources resources)
(let [client (RedisClient/create ^ClientResources resources ^RedisURI redis-uri)
timeout (or timeout (:timeout cfg))
timeout (or timeout (::timeout state))
conn (case type
:default (.connect ^RedisClient client ^RedisCodec codec)
:pubsub (.connectPubSub ^RedisClient client ^RedisCodec codec))]
(.setTimeout ^StatefulConnection conn ^Duration timeout)
(reify
IDeref
(deref [_] conn)
@@ -168,53 +183,113 @@
(.close ^StatefulConnection conn)
(.shutdown ^RedisClient client)))))
(defn connect
[state & {:as opts}]
(let [connection (connect* state opts)]
(-> state
(assoc ::connection connection)
(dissoc ::cache)
(vary-meta assoc `d/close! (fn [_] (d/close! connection))))))
(defn get-or-connect
[{:keys [::cache] :as state} key options]
(assoc state ::connection
(or (get @cache key)
(-> (swap! cache (fn [cache]
(when-let [prev (get cache key)]
(close! prev))
(assoc cache key (connect state options))))
(get key)))))
(-> state
(assoc ::connection
(or (get @cache key)
(-> (swap! cache (fn [cache]
(when-let [prev (get cache key)]
(d/close! prev))
(assoc cache key (connect* state options))))
(get key))))
(dissoc ::cache)))
(defn add-listener!
[conn listener]
(us/assert! ::pubsub-connection @conn)
[{:keys [::connection] :as conn} listener]
(us/assert! ::connection-holder conn)
(us/assert! ::pubsub-connection connection)
(us/assert! ::pubsub-listener listener)
(.addListener ^StatefulRedisPubSubConnection @conn
(.addListener ^StatefulRedisPubSubConnection @connection
^RedisPubSubListener listener)
conn)
(defn publish!
[conn topic message]
[{:keys [::connection] :as conn} topic message]
(us/assert! ::us/string topic)
(us/assert! ::us/bytes message)
(us/assert! ::connection @conn)
(us/assert! ::connection-holder conn)
(us/assert! ::default-connection connection)
(let [pcomm (.async ^StatefulRedisConnection @conn)]
(let [pcomm (.async ^StatefulRedisConnection @connection)]
(.publish ^RedisAsyncCommands pcomm ^String topic ^bytes message)))
(defn subscribe!
"Blocking operation, intended to be used on a worker/agent thread."
[conn & topics]
(us/assert! ::pubsub-connection @conn)
(let [topics (into-array String (map str topics))
cmd (.sync ^StatefulRedisPubSubConnection @conn)]
(.subscribe ^RedisPubSubCommands cmd topics)))
"Blocking operation, intended to be used on a thread/agent thread."
[{:keys [::connection] :as conn} & topics]
(us/assert! ::connection-holder conn)
(us/assert! ::pubsub-connection connection)
(try
(let [topics (into-array String (map str topics))
cmd (.sync ^StatefulRedisPubSubConnection @connection)]
(.subscribe ^RedisPubSubCommands cmd topics))
(catch RedisCommandInterruptedException cause
(throw (InterruptedException. (ex-message cause))))))
(defn unsubscribe!
"Blocking operation, intended to be used on a worker/agent thread."
[conn & topics]
(us/assert! ::pubsub-connection @conn)
(let [topics (into-array String (map str topics))
cmd (.sync ^StatefulRedisPubSubConnection @conn)]
(.unsubscribe ^RedisPubSubCommands cmd topics)))
"Blocking operation, intended to be used on a thread/agent thread."
[{:keys [::connection] :as conn} & topics]
(us/assert! ::connection-holder conn)
(us/assert! ::pubsub-connection connection)
(try
(let [topics (into-array String (map str topics))
cmd (.sync ^StatefulRedisPubSubConnection @connection)]
(.unsubscribe ^RedisPubSubCommands cmd topics))
(catch RedisCommandInterruptedException cause
(throw (InterruptedException. (ex-message cause))))))
(defn rpush!
[{:keys [::connection] :as conn} key payload]
(us/assert! ::connection-holder conn)
(us/assert! (or (and (vector? payload)
(every? bytes? payload))
(bytes? payload)))
(try
(let [cmd (.sync ^StatefulRedisConnection @connection)
data (if (vector? payload) payload [payload])
vals (make-array (. Class (forName "[B")) (count data))]
(loop [i 0 xs (seq data)]
(when xs
(aset ^"[[B" vals i ^bytes (first xs))
(recur (inc i) (next xs))))
(.rpush ^RedisCommands cmd
^String key
^"[[B" vals))
(catch RedisCommandInterruptedException cause
(throw (InterruptedException. (ex-message cause))))))
(defn blpop!
[{:keys [::connection] :as conn} timeout & keys]
(us/assert! ::connection-holder conn)
(try
(let [keys (into-array Object (map str keys))
cmd (.sync ^StatefulRedisConnection @connection)
timeout (/ (double (inst-ms timeout)) 1000.0)]
(when-let [res (.blpop ^RedisCommands cmd
^double timeout
^"[Ljava.lang.String;" keys)]
(MapEntry/create
(.getKey ^KeyValue res)
(.getValue ^KeyValue res))))
(catch RedisCommandInterruptedException cause
(throw (InterruptedException. (ex-message cause))))))
(defn open?
[conn]
(.isOpen ^StatefulConnection @conn))
[{:keys [::connection] :as conn}]
(us/assert! ::connection-holder conn)
(us/assert! ::pubsub-connection connection)
(.isOpen ^StatefulConnection @connection))
(defn pubsub-listener
[& {:keys [on-message on-subscribe on-unsubscribe]}]
@@ -243,10 +318,6 @@
(when on-unsubscribe
(on-unsubscribe nil topic count)))))
(defn close!
[o]
(.close ^AutoCloseable o))
(def ^:private scripts-cache (atom {}))
(def noop-fn (constantly nil))
@@ -262,12 +333,12 @@
::rscript/vals]))
(defn eval!
[{:keys [::mtx/metrics] :as state} script]
(us/assert! ::rscript/script script)
[{:keys [::mtx/metrics ::connection] :as state} script]
(us/assert! ::redis state)
(us/assert! ::connection-holder state)
(us/assert! ::rscript/script script)
(let [rconn (-> state ::connection deref)
cmd (.async ^StatefulRedisConnection rconn)
(let [cmd (.async ^StatefulRedisConnection @connection)
keys (into-array String (map str (::rscript/keys script)))
vals (into-array String (map str (::rscript/vals script)))
sname (::rscript/name script)]
@@ -276,44 +347,52 @@
(if (instance? io.lettuce.core.RedisNoScriptException cause)
(do
(l/error :hint "no script found" :name sname :cause cause)
(-> (load-script)
(p/then eval-script)))
(->> (load-script)
(p/mapcat eval-script)))
(if-let [on-error (::rscript/on-error script)]
(on-error cause)
(p/rejected cause))))
(eval-script [sha]
(let [tpoint (dt/tpoint)]
(-> (.evalsha ^RedisScriptingAsyncCommands cmd
^String sha
^ScriptOutputType ScriptOutputType/MULTI
^"[Ljava.lang.String;" keys
^"[Ljava.lang.String;" vals)
(p/then (fn [result]
(let [elapsed (tpoint)]
(mtx/run! metrics {:id :redis-eval-timing
:labels [(name sname)]
:val (inst-ms elapsed)})
(l/trace :hint "eval script"
:name (name sname)
:sha sha
:params (str/join "," (::rscript/vals script))
:elapsed (dt/format-duration elapsed))
result)))
(p/catch on-error))))
(->> (.evalsha ^RedisScriptingAsyncCommands cmd
^String sha
^ScriptOutputType ScriptOutputType/MULTI
^"[Ljava.lang.String;" keys
^"[Ljava.lang.String;" vals)
(p/fmap (fn [result]
(let [elapsed (tpoint)]
(mtx/run! metrics {:id :redis-eval-timing
:labels [(name sname)]
:val (inst-ms elapsed)})
(l/trace :hint "eval script"
:name (name sname)
:sha sha
:params (str/join "," (::rscript/vals script))
:elapsed (dt/format-duration elapsed))
result)))
(p/merr on-error))))
(read-script []
(-> script ::rscript/path io/resource slurp))
(load-script []
(l/trace :hint "load script" :name sname)
(-> (.scriptLoad ^RedisScriptingAsyncCommands cmd
(->> (.scriptLoad ^RedisScriptingAsyncCommands cmd
^String (read-script))
(p/then (fn [sha]
(p/map (fn [sha]
(swap! scripts-cache assoc sname sha)
sha))))]
(if-let [sha (get @scripts-cache sname)]
(eval-script sha)
(-> (load-script)
(p/then eval-script))))))
(->> (load-script)
(p/mapcat eval-script))))))
(defn timeout-exception?
[cause]
(instance? RedisCommandTimeoutException cause))
(defn exception?
[cause]
(instance? RedisException cause))
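The new rpush!/blpop! helpers expose the Redis list primitives used by the worker queue on the same connection-holder API as pub/sub. A hedged usage sketch; redis stands for an initialized ::rds/redis system value, the cache key and queue name are illustrative, and get-or-connect and dt/duration are the functions shown in this diff:

(let [conn (get-or-connect redis :example/queue {:timeout (dt/duration "5s")})]
  ;; Push a payload (a byte array, or a vector of byte arrays) onto the list.
  (rpush! conn "penpot:demo-queue" (byte-array [1 2 3]))
  ;; Block for up to one second waiting for an element; returns a MapEntry
  ;; of [key value], or nil when the timeout expires.
  (blpop! conn (dt/duration "1s") "penpot:demo-queue"))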

View File

@@ -6,23 +6,33 @@
(ns app.rpc
(:require
[app.common.data :as d]
[app.common.exceptions :as ex]
[app.common.logging :as l]
[app.common.spec :as us]
[app.common.uuid :as uuid]
[app.db :as db]
[app.http :as-alias http]
[app.http.client :as-alias http.client]
[app.http.session :as-alias http.session]
[app.loggers.audit :as audit]
[app.loggers.webhooks :as-alias webhooks]
[app.metrics :as mtx]
[app.msgbus :as-alias mbus]
[app.rpc.climit :as climit]
[app.rpc.cond :as cond]
[app.rpc.helpers :as rph]
[app.rpc.retry :as retry]
[app.rpc.rlimit :as rlimit]
[app.rpc.semaphore :as-alias rsem]
[app.storage :as-alias sto]
[app.util.services :as sv]
[app.util.time :as ts]
[app.worker :as-alias wrk]
[clojure.spec.alpha :as s]
[integrant.core :as ig]
[promesa.core :as p]
[promesa.exec :as px]
[yetti.request :as yrq]
[yetti.response :as yrs]))
(defn- default-handler
@@ -31,23 +41,29 @@
(defn- handle-response-transformation
[response request mdata]
(let [response (if (sv/wrapped? response) @response response)]
(if-let [transform-fn (:transform-response mdata)]
(p/do (transform-fn request response))
(p/resolved response))))
(let [transform-fn (reduce (fn [res-fn transform-fn]
(fn [request response]
(p/then (res-fn request response) #(transform-fn request %))))
(constantly response)
(::response-transform-fns mdata))]
(transform-fn request response)))
(defn- handle-before-comple-hook
[response mdata]
(when-let [hook-fn (:before-complete mdata)]
(doseq [hook-fn (::before-complete-fns mdata)]
(ex/ignoring (hook-fn)))
response)
(defn- handle-response
[request result]
(let [mdata (meta result)]
(p/-> (yrs/response 200 result (::http/headers mdata {}))
(handle-response-transformation request mdata)
(handle-before-comple-hook mdata))))
(if (fn? result)
(p/wrap (result request))
(let [mdata (meta result)]
(p/-> (yrs/response {:status (::http/status mdata 200)
:headers (::http/headers mdata {})
:body (rph/unwrap result)})
(handle-response-transformation request mdata)
(handle-before-comple-hook mdata)))))
(defn- rpc-query-handler
"Ring handler that dispatches query requests and convert between
@@ -72,7 +88,7 @@
internal async flow into ring async flow."
[methods {:keys [profile-id session-id params] :as request} respond raise]
(let [type (keyword (:type params))
data (into {::request request} params)
data (into {::http/request request} params)
data (if profile-id
(assoc data :profile-id profile-id ::session-id session-id)
(dissoc data :profile-id))
@@ -90,18 +106,20 @@
internal async flow into ring async flow."
[methods {:keys [profile-id session-id params] :as request} respond raise]
(let [cmd (keyword (:command params))
data (into {::request request} params)
etag (yrq/get-header request "if-none-match")
data (into {::http/request request ::cond/key etag} params)
data (if profile-id
(assoc data :profile-id profile-id ::session-id session-id)
(dissoc data :profile-id))
method (get methods cmd default-handler)]
(-> (method data)
(p/then (partial handle-response request))
(p/then respond)
(p/catch (fn [cause]
(let [context {:profile-id profile-id}]
(raise (ex/wrap-with-context cause context))))))))
(binding [cond/*enabled* true]
(-> (method data)
(p/then (partial handle-response request))
(p/then respond)
(p/catch (fn [cause]
(let [context {:profile-id profile-id}]
(raise (ex/wrap-with-context cause context)))))))))
(defn- wrap-metrics
"Wrap service method with metrics measurement."
@@ -123,44 +141,69 @@
[{:keys [executor] :as cfg} f mdata]
(with-meta
(fn [cfg params]
(-> (px/submit! executor #(f cfg params))
(p/bind p/wrap)))
(->> (px/submit! executor (px/wrap-bindings #(f cfg params)))
(p/mapcat p/wrap)
(p/map rph/wrap)))
mdata))
(defn- wrap-audit
[{:keys [audit] :as cfg} f mdata]
(if audit
(with-meta
(fn [cfg {:keys [::request] :as params}]
(p/finally (f cfg params)
(fn [result _]
(when result
(let [resultm (meta result)
profile-id (or (::audit/profile-id resultm)
(:profile-id result)
(:profile-id params))
props (or (::audit/replace-props resultm)
(-> params
(merge (::audit/props resultm))
(dissoc :type)))]
(audit :cmd :submit
:type (or (::audit/type resultm)
[cfg f mdata]
(if-let [collector (::audit/collector cfg)]
(letfn [(handle-audit [params result]
(let [resultm (meta result)
request (::http/request params)
profile-id (or (::audit/profile-id resultm)
(:profile-id result)
(:profile-id params)
uuid/zero)
props (or (::audit/replace-props resultm)
(-> params
(d/without-qualified)
(merge (::audit/props resultm))
(dissoc :profile-id)
(dissoc :type)))
event {:type (or (::audit/type resultm)
(::type cfg))
:name (or (::audit/name resultm)
(::sv/name mdata))
:profile-id profile-id
:ip-addr (some-> request audit/parse-client-ip)
:props (dissoc props ::request)))))))
mdata)
:props props
::webhooks/batch-key
(or (::webhooks/batch-key mdata)
(::webhooks/batch-key resultm))
::webhooks/batch-timeout
(or (::webhooks/batch-timeout mdata)
(::webhooks/batch-timeout resultm))
::webhooks/event?
(or (::webhooks/event? mdata)
(::webhooks/event? resultm)
false)}]
(audit/submit! collector event)))
(handle-request [cfg params]
(->> (f cfg params)
(p/mcat (fn [result]
(->> (handle-audit params result)
(p/map (constantly result)))))))]
(if-not (::audit/skip mdata)
(with-meta handle-request mdata)
f))
f))
(defn- wrap
[cfg f mdata]
(let [f (as-> f $
(wrap-dispatch cfg $ mdata)
(wrap-metrics cfg $ mdata)
(cond/wrap cfg $ mdata)
(retry/wrap-retry cfg $ mdata)
(rsem/wrap cfg $ mdata)
(wrap-metrics cfg $ mdata)
(climit/wrap cfg $ mdata)
(rlimit/wrap cfg $ mdata)
(wrap-audit cfg $ mdata))
@@ -172,6 +215,7 @@
(fn [{:keys [::request] :as params}]
;; Raise authentication error when rpc method requires auth but
;; no profile-id is found in the request.
(p/do!
(if (and auth? (not (uuid? (:profile-id params))))
(ex/raise :type :authentication
@@ -179,7 +223,6 @@
:hint "authentication required for this endpoint")
(let [params (us/conform spec (dissoc params ::request))]
(f cfg (assoc params ::request request))))))
mdata)))
(defn- process-method
@@ -224,32 +267,40 @@
'app.rpc.commands.comments
'app.rpc.commands.management
'app.rpc.commands.verify-token
'app.rpc.commands.search
'app.rpc.commands.auth
'app.rpc.commands.ldap
'app.rpc.commands.demo
'app.rpc.commands.files)
'app.rpc.commands.webhooks
'app.rpc.commands.audit
'app.rpc.commands.files
'app.rpc.commands.files.update
'app.rpc.commands.files.create
'app.rpc.commands.files.temp)
(map (partial process-method cfg))
(into {}))))
(s/def ::audit (s/nilable fn?))
(s/def ::http-client fn?)
(s/def ::ldap (s/nilable map?))
(s/def ::msgbus ::mbus/msgbus)
(s/def ::climit (s/nilable ::climit/climit))
(s/def ::rlimit (s/nilable ::rlimit/rlimit))
(s/def ::public-uri ::us/not-empty-string)
(s/def ::session map?)
(s/def ::storage some?)
(s/def ::sprops map?)
(defmethod ig/pre-init-spec ::methods [_]
(s/keys :req-un [::storage
::session
(s/keys :req [::audit/collector
::http.client/client
::db/pool
::wrk/executor]
:req-un [::sto/storage
::http.session/session
::sprops
::audit
::public-uri
::msgbus
::http-client
::rsem/semaphores
::rlimit/rlimit
::rlimit
::climit
::wrk/executor
::mtx/metrics
::db/pool
::ldap]))
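wrap-audit above is also the entry point for webhooks: when the method metadata (or the result metadata) carries ::webhooks/event?, the audit event is flagged for webhook delivery, optionally with ::webhooks/batch-key and ::webhooks/batch-timeout for batched delivery. A hedged sketch of a service method opting in, modeled on the create-comment change further down; the method name and body are hypothetical:

(sv/defmethod ::create-something
  {::doc/added "1.17"
   ::webhooks/event? true}
  [{:keys [pool] :as cfg} params]
  ;; Perform the work; the successful result is what wrap-audit records
  ;; and, because ::webhooks/event? is true, what the webhook worker will
  ;; later deliver to the registered endpoints.
  nil)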

View File

@@ -0,0 +1,204 @@
;; This Source Code Form is subject to the terms of the Mozilla Public
;; License, v. 2.0. If a copy of the MPL was not distributed with this
;; file, You can obtain one at http://mozilla.org/MPL/2.0/.
;;
;; Copyright (c) KALEIDOS INC
(ns app.rpc.climit
"Concurrencly limiter for RPC."
(:require
[app.common.data :as d]
[app.common.exceptions :as ex]
[app.common.logging :as l]
[app.common.spec :as us]
[app.config :as cf]
[app.metrics :as mtx]
[app.rpc :as-alias rpc]
[app.util.services :as-alias sv]
[app.util.time :as dt]
[app.worker :as-alias wrk]
[clojure.edn :as edn]
[clojure.spec.alpha :as s]
[datoteka.fs :as fs]
[integrant.core :as ig]
[promesa.core :as p]
[promesa.exec :as px]
[promesa.exec.bulkhead :as pxb])
(:import
com.github.benmanes.caffeine.cache.Cache
com.github.benmanes.caffeine.cache.CacheLoader
com.github.benmanes.caffeine.cache.Caffeine
com.github.benmanes.caffeine.cache.RemovalListener))
(defn- capacity-exception?
[o]
(and (ex/ex-info? o)
(let [data (ex-data o)]
(and (= :bulkhead-error (:type data))
(= :capacity-limit-reached (:code data))))))
(defn invoke!
[limiter f]
(->> (px/submit! limiter f)
(p/hcat (fn [result cause]
(cond
(capacity-exception? cause)
(p/rejected
(ex/error :type :internal
:code :concurrency-limit-reached
:queue (-> limiter meta :bkey name)
:cause cause))
(some? cause)
(p/rejected cause)
:else
(p/resolved result))))))
(defn- create-limiter
[{:keys [executor metrics concurrency queue-size bkey skey]}]
(let [labels (into-array String [(name bkey)])
on-queue (fn [instance]
(l/trace :hint "enqueued"
:key (name bkey)
:skey (str skey)
:queue-size (get instance ::pxb/current-queue-size)
:concurrency (get instance ::pxb/current-concurrency))
(mtx/run! metrics
:id :rpc-climit-queue-size
:val (get instance ::pxb/current-queue-size)
:labels labels)
(mtx/run! metrics
:id :rpc-climit-concurrency
:val (get instance ::pxb/current-concurrency)
:labels labels))
on-run (fn [instance task]
(let [elapsed (- (inst-ms (dt/now))
(inst-ms task))]
(l/trace :hint "execute"
:key (name bkey)
:skey (str skey)
:elapsed (str elapsed "ms"))
(mtx/run! metrics
:id :rpc-climit-timing
:val elapsed
:labels labels)
(mtx/run! metrics
:id :rpc-climit-queue-size
:val (get instance ::pxb/current-queue-size)
:labels labels)
(mtx/run! metrics
:id :rpc-climit-concurrency
:val (get instance ::pxb/current-concurrency)
:labels labels)))
options {:executor executor
:concurrency concurrency
:queue-size (or queue-size Integer/MAX_VALUE)
:on-queue on-queue
:on-run on-run}]
(-> (pxb/create options)
(vary-meta assoc :bkey bkey :skey skey))))
(defn- create-cache
[{:keys [executor] :as params} config]
(let [listener (reify RemovalListener
(onRemoval [_ key _val cause]
(l/trace :hint "cache: remove" :key key :reason (str cause))))
loader (reify CacheLoader
(load [_ key]
(let [[bkey skey] key]
(when-let [config (get config bkey)]
(-> (merge params config)
(assoc :bkey bkey)
(assoc :skey skey)
(create-limiter))))))]
(.. (Caffeine/newBuilder)
(weakValues)
(executor executor)
(removalListener listener)
(build loader))))
(defprotocol IConcurrencyManager)
(s/def ::concurrency ::us/integer)
(s/def ::queue-size ::us/integer)
(s/def ::config
(s/map-of keyword?
(s/keys :req-un [::concurrency]
:opt-un [::queue-size])))
(defmethod ig/prep-key ::rpc/climit
[_ cfg]
(merge {:path (cf/get :rpc-climit-config)}
(d/without-nils cfg)))
(defmethod ig/pre-init-spec ::rpc/climit [_]
(s/keys :req-un [::wrk/executor ::mtx/metrics ::fs/path]))
(defmethod ig/init-key ::rpc/climit
[_ {:keys [path] :as params}]
(when (contains? cf/flags :rpc-climit)
(if-let [config (some->> path slurp edn/read-string)]
(do
(l/info :hint "initializing concurrency limit" :config (str path))
(us/verify! ::config config)
(let [cache (create-cache params config)]
^{::cache cache}
(reify
IConcurrencyManager
clojure.lang.IDeref
(deref [_] config)
clojure.lang.ILookup
(valAt [_ key]
(let [key (if (vector? key) key [key])]
(.get ^Cache cache key))))))
(l/warn :hint "unable to load configuration" :config (str path)))))
(s/def ::climit #(satisfies? IConcurrencyManager %))
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; PUBLIC API
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(defmacro with-dispatch
[lim & body]
`(if ~lim
(invoke! ~lim (^:once fn [] (p/wrap (do ~@body))))
(p/wrap (do ~@body))))
(defn wrap
[{:keys [climit]} f {:keys [::queue ::key-fn] :as mdata}]
(if (and (some? climit)
(some? queue))
(if-let [config (get @climit queue)]
(do
(l/debug :hint "wrap: instrumenting method"
:limit-name (name queue)
:service-name (::sv/name mdata)
:queue-size (or (:queue-size config) Integer/MAX_VALUE)
:concurrency (:concurrency config)
:keyed? (some? key-fn))
(if (some? key-fn)
(fn [cfg params]
(let [key [queue (key-fn params)]
lim (get climit key)]
(invoke! lim (partial f cfg params))))
(let [lim (get climit queue)]
(fn [cfg params]
(invoke! lim (partial f cfg params))))))
(do
(l/warn :hint "wrap: no config found"
:queue (name queue)
:service (::sv/name mdata))
f))
f))
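The limiter reads its per-queue settings from the EDN file referenced by the :rpc-climit-config option; ::config above requires a map from queue keyword to a :concurrency entry with an optional :queue-size. A hedged example of such a file, using the two queues declared later in this changeset; the concrete numbers are illustrative only:

{:auth              {:concurrency 4 :queue-size 128}
 :push-audit-events {:concurrency 1}}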

View File

@@ -0,0 +1,86 @@
;; This Source Code Form is subject to the terms of the Mozilla Public
;; License, v. 2.0. If a copy of the MPL was not distributed with this
;; file, You can obtain one at http://mozilla.org/MPL/2.0/.
;;
;; Copyright (c) KALEIDOS INC
(ns app.rpc.commands.audit
"Audit Log related RPC methods"
(:require
[app.common.data :as d]
[app.common.logging :as l]
[app.common.spec :as us]
[app.common.uuid :as uuid]
[app.config :as cf]
[app.db :as db]
[app.http :as-alias http]
[app.loggers.audit :as audit]
[app.rpc.climit :as-alias climit]
[app.rpc.doc :as-alias doc]
[app.rpc.helpers :as rph]
[app.util.services :as sv]
[app.util.time :as dt]
[app.worker :as wrk]
[clojure.spec.alpha :as s]
[promesa.core :as p]
[promesa.exec :as px]))
(defn- event->row [event]
[(uuid/next)
(:name event)
(:source event)
(:type event)
(:timestamp event)
(:profile-id event)
(db/inet (:ip-addr event))
(db/tjson (:props event))
(db/tjson (d/without-nils (:context event)))])
(def ^:private event-columns
[:id :name :source :type :tracked-at
:profile-id :ip-addr :props :context])
(defn- handle-events
[{:keys [::db/pool]} {:keys [profile-id events ::http/request] :as params}]
(let [ip-addr (audit/parse-client-ip request)
xform (comp
(map #(assoc % :profile-id profile-id))
(map #(assoc % :ip-addr ip-addr))
(map #(assoc % :source "frontend"))
(filter :profile-id)
(map event->row))
events (sequence xform events)]
(when (seq events)
(db/insert-multi! pool :audit-log event-columns events))))
(s/def ::profile-id ::us/uuid)
(s/def ::name ::us/string)
(s/def ::type ::us/string)
(s/def ::props (s/map-of ::us/keyword any?))
(s/def ::timestamp dt/instant?)
(s/def ::context (s/map-of ::us/keyword any?))
(s/def ::event
(s/keys :req-un [::type ::name ::props ::timestamp]
:opt-un [::context]))
(s/def ::events (s/every ::event))
(s/def ::push-audit-events
(s/keys :req-un [::events ::profile-id]))
(sv/defmethod ::push-audit-events
{::climit/queue :push-audit-events
::climit/key-fn :profile-id
::audit/skip true
::doc/added "1.17"}
[{:keys [::db/pool ::wrk/executor] :as cfg} params]
(if (or (db/read-only? pool)
(not (contains? cf/flags :audit-log)))
(do
(l/warn :hint "audit: http handler disabled or db is read-only")
(rph/wrap nil))
(->> (px/submit! executor #(handle-events cfg params))
(p/fmap (constantly nil)))))
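For reference, a hedged example of a params map that conforms to the ::push-audit-events spec above; the event name, props, uuids and context are illustrative:

{:profile-id #uuid "00000000-0000-0000-0000-000000000001"
 :events [{:type "action"
           :name "open-file"
           :props {:file-id #uuid "00000000-0000-0000-0000-000000000002"}
           :timestamp (dt/now)
           :context {:os "linux"}}]}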

View File

@@ -13,11 +13,13 @@
[app.config :as cf]
[app.db :as db]
[app.emails :as eml]
[app.http.session :as session]
[app.loggers.audit :as audit]
[app.rpc.climit :as climit]
[app.rpc.doc :as-alias doc]
[app.rpc.helpers :as rph]
[app.rpc.mutations.teams :as teams]
[app.rpc.queries.profile :as profile]
[app.rpc.semaphore :as rsem]
[app.tokens :as tokens]
[app.util.services :as sv]
[app.util.time :as dt]
@@ -127,16 +129,16 @@
(tokens/verify sprops {:token token :iss :team-invitation}))
;; If the invitation member-id does not match the profile-id, we just proceed to ignore the
;; invitation because invitations matches exactly; and user can't loging with other email and
;; invitation because invitations matches exactly; and user can't login with other email and
;; accept invitation with other email
response (if (and (some? invitation) (= (:id profile) (:member-id invitation)))
{:invitation-token (:invitation-token params)}
profile)]
(with-meta response
{:transform-response ((:create session) (:id profile))
::audit/props (audit/profile->props profile)
::audit/profile-id (:id profile)})))))
(-> response
(rph/with-transform (session/create-fn session (:id profile)))
(rph/with-meta {::audit/props (audit/profile->props profile)
::audit/profile-id (:id profile)}))))))
(s/def ::login-with-password
(s/keys :req-un [::email ::password]
@@ -145,7 +147,7 @@
(sv/defmethod ::login-with-password
"Performs authentication using penpot password."
{:auth false
::rsem/queue :auth
::climit/queue :auth
::doc/added "1.15"}
[cfg params]
(login-with-password cfg params))
@@ -160,8 +162,7 @@
{:auth false
::doc/added "1.15"}
[{:keys [session] :as cfg} _]
(with-meta {}
{:transform-response (:delete session)}))
(rph/with-transform {} (session/delete-fn session)))
;; ---- COMMAND: Recover Profile
@@ -186,7 +187,7 @@
(sv/defmethod ::recover-profile
{:auth false
::rsem/queue :auth
::climit/queue :auth
::doc/added "1.15"}
[cfg params]
(recover-profile cfg params))
@@ -375,8 +376,6 @@
(create-profile conn)
(create-profile-relations conn)
(profile/decode-profile-row)))
audit-fn (:audit cfg)
invitation (when-let [token (:invitation-token params)]
(tokens/verify sprops {:token token :iss :team-invitation}))]
@@ -385,10 +384,11 @@
;; accordingly.
(when-let [id (:profile-id claims)]
(db/update! conn :profile {:modified-at (dt/now)} {:id id})
(audit-fn :cmd :submit
:type "fact"
:name "register-profile-retry"
:profile-id id))
(when-let [collector (::audit/collector cfg)]
(audit/submit! collector
{:type "fact"
:name "register-profile-retry"
:profile-id id})))
(cond
;; If invitation token comes in params, this is because the
@@ -401,33 +401,33 @@
(let [claims (assoc invitation :member-id (:id profile))
token (tokens/generate sprops claims)
resp {:invitation-token token}]
(with-meta resp
{:transform-response ((:create session) (:id profile))
::audit/replace-props (audit/profile->props profile)
::audit/profile-id (:id profile)}))
(-> resp
(rph/with-transform (session/create-fn session (:id profile)))
(rph/with-meta {::audit/replace-props (audit/profile->props profile)
::audit/profile-id (:id profile)})))
;; If auth backend is different from "penpot" means user is
;; registering using third party auth mechanism; in this case
;; we need to mark this session as logged.
(not= "penpot" (:auth-backend profile))
(with-meta (profile/strip-private-attrs profile)
{:transform-response ((:create session) (:id profile))
::audit/replace-props (audit/profile->props profile)
::audit/profile-id (:id profile)})
(-> (profile/strip-private-attrs profile)
(rph/with-transform (session/create-fn session (:id profile)))
(rph/with-meta {::audit/replace-props (audit/profile->props profile)
::audit/profile-id (:id profile)}))
;; If the `:enable-insecure-register` flag is set, we proceed
;; to sign in the user directly, without email verification.
(true? is-active)
(with-meta (profile/strip-private-attrs profile)
{:transform-response ((:create session) (:id profile))
::audit/replace-props (audit/profile->props profile)
::audit/profile-id (:id profile)})
(-> (profile/strip-private-attrs profile)
(rph/with-transform (session/create-fn session (:id profile)))
(rph/with-meta {::audit/replace-props (audit/profile->props profile)
::audit/profile-id (:id profile)}))
;; In all other cases, send a verification email.
:else
(do
(send-email-verification! conn sprops profile)
(with-meta profile
(rph/with-meta profile
{::audit/replace-props (audit/profile->props profile)
::audit/profile-id (:id profile)})))))
@@ -436,7 +436,7 @@
(sv/defmethod ::register-profile
{:auth false
::rsem/queue :auth
::climit/queue :auth
::doc/added "1.15"}
[{:keys [pool] :as cfg} params]
(db/with-atomic [conn pool]

View File

@@ -16,14 +16,15 @@
[app.config :as cf]
[app.db :as db]
[app.media :as media]
[app.rpc.commands.files :as files]
[app.rpc.doc :as-alias doc]
[app.rpc.queries.files :as files]
[app.rpc.queries.projects :as projects]
[app.storage :as sto]
[app.storage.tmp :as tmp]
[app.tasks.file-gc]
[app.util.blob :as blob]
[app.util.fressian :as fres]
[app.util.pointer-map :as pmap]
[app.util.services :as sv]
[app.util.time :as dt]
[clojure.spec.alpha :as s]
@@ -269,7 +270,7 @@
(when (not= readed# expected#)
(ex/raise :type :validation
:code :unexpected-label
:hint (format "unxpected label found: %s, expected: %s" readed# expected#)))))
:hint (format "unexpected label found: %s, expected: %s" readed# expected#)))))
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; API
@@ -289,9 +290,11 @@
(defn- retrieve-file
[pool file-id]
(->> (db/query pool :file {:id file-id})
(map files/decode-row)
(first)))
(with-open [conn (db/open pool)]
(binding [pmap/*load-fn* (partial files/load-pointer conn file-id)]
(some-> (db/get* conn :file {:id file-id})
(files/decode-row)
(update :data files/process-pointers deref)))))
(def ^:private sql:file-media-objects
"SELECT * FROM file_media_object WHERE id = ANY(?)")
@@ -367,7 +370,7 @@
(def ^:dynamic *state*)
(def ^:dynamic *options*)
;; --- EXPORT WRITTER
;; --- EXPORT WRITER
(defn- embed-file-assets
[data conn file-id]
@@ -397,8 +400,8 @@
form))
(process-group-of-assets [data [lib-id items]]
;; NOTE: there are a posibility that shape refers to a not
;; existing file because the file was removed. In this
;; NOTE: there is a possibility that shape refers to a
;; non-existent file because the file was removed. In this
;; case we just ignore the asset.
(if-let [lib (retrieve-file conn lib-id)]
(reduce (partial process-asset lib) data items)
@@ -434,14 +437,14 @@
:opt [::include-libraries? ::embed-assets?]))
(defn write-export!
"Do the exportation of a speficied file in custom penpot binary
"Do the exportation of a specified file in custom penpot binary
format. There are some options available to customize the output:
`::include-libraries?`: additionaly to the specified file, all the
`::include-libraries?`: additionally to the specified file, all the
linked libraries also will be included (including transitive
dependencies).
`::embed-assets?`: instead of including the libraryes, embedd in the
`::embed-assets?`: instead of including the libraries, embed in the
same file library all assets used from external libraries."
[{:keys [::include-libraries? ::embed-assets?] :as options}]
(us/assert! ::write-export-options options)
@@ -557,7 +560,7 @@
format. There are some options to customize the importation
behavior:
`::overwrite?`: if true, instead of creating new files and remaping id references,
`::overwrite?`: if true, instead of creating new files and remapping id references,
it reuses all ids and updates existing objects; defaults to `false`.
`::migrate?`: if true, applies the migration before persisting the
@@ -622,7 +625,7 @@
(l/debug :hint "update media references" ::l/async false)
(vswap! *state* update :media into (map #(update % :id lookup-index)) media')
(l/debug :hint "procesing file" :file-id file-id ::l/async false)
(l/debug :hint "processing file" :file-id file-id ::l/async false)
(let [file-id' (lookup-index file-id)
data (-> (:data file)
@@ -807,7 +810,7 @@
cs (volatile! nil)]
(try
(l/info :hint "start exportation" :export-id id)
(with-open [output (io/output-stream output)]
(with-open [^AutoCloseable output (io/output-stream output)]
(binding [*position* (atom 0)]
(write-export! (assoc cfg ::output output))))
@@ -830,7 +833,7 @@
(defn export-to-tmpfile!
[cfg]
(let [path (tmp/tempfile :prefix "penpot.export.")]
(with-open [output (io/output-stream path)]
(with-open [^AutoCloseable output (io/output-stream path)]
(export! cfg output)
path)))
@@ -842,7 +845,7 @@
(try
(l/info :hint "start importation" :import-id id)
(binding [*position* (atom 0)]
(with-open [input (io/input-stream input)]
(with-open [^AutoCloseable input (io/input-stream input)]
(read-import! (assoc cfg ::input input))))
(catch Throwable cause
@@ -870,7 +873,7 @@
{::doc/added "1.15"}
[{:keys [pool] :as cfg} {:keys [profile-id file-id include-libraries? embed-assets?] :as params}]
(files/check-read-permissions! pool profile-id file-id)
(let [resp (reify yrs/StreamableResponseBody
(let [body (reify yrs/StreamableResponseBody
(-write-body-to-stream [_ _ output-stream]
(-> cfg
(assoc ::file-ids [file-id])
@@ -878,11 +881,8 @@
(assoc ::include-libraries? include-libraries?)
(export! output-stream))))]
(with-meta (sv/wrap nil)
{:transform-response (fn [_ response]
(-> response
(assoc :body resp)
(assoc :headers {"content-type" "application/octet-stream"})))})))
(fn [_]
(yrs/response 200 body {"content-type" "application/octet-stream"}))))
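Returning a function from an RPC method is the new escape hatch that replaces the old :transform-response trick: handle-response (in the rpc namespace earlier in this diff) detects fn? results and invokes them with the request, so the method can build the raw response itself; that is how export-binfile streams its body here. A minimal hedged sketch of the same pattern with a hypothetical method:

(sv/defmethod ::download-something
  {::doc/added "1.17"}
  [cfg params]
  (fn [_request]
    (yrs/response 200 "raw body" {"content-type" "text/plain"})))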
(s/def ::file ::media/upload)
(s/def ::import-binfile

View File

@@ -10,8 +10,9 @@
[app.common.geom.point :as gpt]
[app.common.spec :as us]
[app.db :as db]
[app.loggers.webhooks :as-alias webhooks]
[app.rpc.commands.files :as files]
[app.rpc.doc :as-alias doc]
[app.rpc.queries.files :as files]
[app.rpc.queries.teams :as teams]
[app.rpc.retry :as retry]
[app.util.blob :as blob]
@@ -43,6 +44,7 @@
#(or (:file-id %) (:team-id %))))
(sv/defmethod ::get-comment-threads
{::doc/added "1.15"}
[{:keys [pool] :as cfg} params]
(with-open [conn (db/open pool)]
(retrieve-comment-threads conn params)))
@@ -245,7 +247,8 @@
(sv/defmethod ::create-comment-thread
{::retry/max-retries 3
::retry/matches retry/conflict-db-insert?
::doc/added "1.15"}
::doc/added "1.15"
::webhooks/event? true}
[{:keys [pool] :as cfg} {:keys [profile-id file-id share-id] :as params}]
(db/with-atomic [conn pool]
(files/check-comment-permissions! conn profile-id file-id share-id)
@@ -364,7 +367,8 @@
:opt-un [::share-id]))
(sv/defmethod ::create-comment
{::doc/added "1.15"}
{::doc/added "1.15"
::webhooks/event? true}
[{:keys [pool] :as cfg} params]
(db/with-atomic [conn pool]
(create-comment conn params)))
@@ -483,7 +487,8 @@
(s/keys :req-un [::profile-id ::id]))
(sv/defmethod ::delete-comment
{::doc/added "1.15"}
{::doc/added "1.15"
::webhooks/event? true}
[{:keys [pool] :as cfg} {:keys [profile-id id] :as params}]
(db/with-atomic [conn pool]
(let [comment (db/get-by-id conn :comment id {:for-update true})]
@@ -529,4 +534,3 @@
:frame-id frame-id}
{:id (:id thread)})
nil)))

View File

@@ -24,11 +24,11 @@
(sv/defmethod ::create-demo-profile
"A command that is responsible of creating a demo purpose
profile. It only works if the `demo-users` flag is inabled in the
profile. It only works if the `demo-users` flag is enabled in the
configuration."
{:auth false
::doc/added "1.15"
::doc/changes ["1.15" "This methos is migrated from mutations to commands."]}
::doc/changes ["1.15" "This method is migrated from mutations to commands."]}
[{:keys [pool] :as cfg} _]
(let [id (uuid/next)
sem (System/currentTimeMillis)

View File

@@ -6,20 +6,367 @@
(ns app.rpc.commands.files
(:require
[app.common.data :as d]
[app.common.data.macros :as dm]
[app.common.exceptions :as ex]
[app.common.geom.shapes :as gsh]
[app.common.pages.helpers :as cph]
[app.common.pages.migrations :as pmg]
[app.common.spec :as us]
[app.common.types.file :as ctf]
[app.common.types.shape-tree :as ctt]
[app.db :as db]
[app.db.sql :as sql]
[app.loggers.webhooks :as-alias webhooks]
[app.rpc.commands.files.thumbnails :as-alias thumbs]
[app.rpc.cond :as-alias cond]
[app.rpc.doc :as-alias doc]
[app.rpc.queries.files :as files]
[app.rpc.helpers :as rph]
[app.rpc.permissions :as perms]
[app.rpc.queries.projects :as projects]
[app.rpc.queries.share-link :refer [retrieve-share-link]]
[app.rpc.queries.teams :as teams]
[app.util.blob :as blob]
[app.util.pointer-map :as pmap]
[app.util.services :as sv]
[clojure.spec.alpha :as s]))
[app.util.time :as dt]
[clojure.set :as set]
[clojure.spec.alpha :as s]
[cuerdas.core :as str]))
;; --- FEATURES
(def supported-features
#{"storage/objects-map"
"storage/pointer-map"
"components/v2"})
(def default-features #{})
;; --- SPECS
(s/def ::features ::us/set-of-strings)
(s/def ::file-id ::us/uuid)
(s/def ::frame-id ::us/uuid)
(s/def ::id ::us/uuid)
(s/def ::is-shared ::us/boolean)
(s/def ::name ::us/string)
(s/def ::profile-id ::us/uuid)
(s/def ::project-id ::us/uuid)
(s/def ::search-term ::us/string)
(s/def ::team-id ::us/uuid)
;; --- HELPERS
(def long-cache-duration
(dt/duration {:days 7}))
(defn decode-row
[{:keys [data changes features] :as row}]
(when row
(cond-> row
features (assoc :features (db/decode-pgarray features #{}))
changes (assoc :changes (blob/decode changes))
data (assoc :data (blob/decode data)))))
;; --- FILE PERMISSIONS
(def ^:private sql:file-permissions
"select fpr.is_owner,
fpr.is_admin,
fpr.can_edit
from file_profile_rel as fpr
where fpr.file_id = ?
and fpr.profile_id = ?
union all
select tpr.is_owner,
tpr.is_admin,
tpr.can_edit
from team_profile_rel as tpr
inner join project as p on (p.team_id = tpr.team_id)
inner join file as f on (p.id = f.project_id)
where f.id = ?
and tpr.profile_id = ?
union all
select ppr.is_owner,
ppr.is_admin,
ppr.can_edit
from project_profile_rel as ppr
inner join file as f on (f.project_id = ppr.project_id)
where f.id = ?
and ppr.profile_id = ?")
(defn get-file-permissions
[conn profile-id file-id]
(when (and profile-id file-id)
(db/exec! conn [sql:file-permissions
file-id profile-id
file-id profile-id
file-id profile-id])))
(defn get-permissions
([conn profile-id file-id]
(let [rows (get-file-permissions conn profile-id file-id)
is-owner (boolean (some :is-owner rows))
is-admin (boolean (some :is-admin rows))
can-edit (boolean (some :can-edit rows))]
(when (seq rows)
{:type :membership
:is-owner is-owner
:is-admin (or is-owner is-admin)
:can-edit (or is-owner is-admin can-edit)
:can-read true
:is-logged (some? profile-id)})))
([conn profile-id file-id share-id]
(let [perms (get-permissions conn profile-id file-id)
ldata (retrieve-share-link conn file-id share-id)]
;; NOTE: in the future, when the share-link becomes more powerful and
;; allows us to specify which parts of the app are available, we
;; will probably need to tweak this function in order to expose
;; these flags to the frontend.
(cond
(some? perms) perms
(some? ldata) {:type :share-link
:can-read true
:pages (:pages ldata)
:is-logged (some? profile-id)
:who-comment (:who-comment ldata)
:who-inspect (:who-inspect ldata)}))))
(def has-edit-permissions?
(perms/make-edition-predicate-fn get-permissions))
(def has-read-permissions?
(perms/make-read-predicate-fn get-permissions))
(def has-comment-permissions?
(perms/make-comment-predicate-fn get-permissions))
(def check-edition-permissions!
(perms/make-check-fn has-edit-permissions?))
(def check-read-permissions!
(perms/make-check-fn has-read-permissions?))
;; A user has comment permissions if she has read permissions, or comment permissions
(defn check-comment-permissions!
[conn profile-id file-id share-id]
(let [can-read (has-read-permissions? conn profile-id file-id)
can-comment (has-comment-permissions? conn profile-id file-id share-id)]
(when-not (or can-read can-comment)
(ex/raise :type :not-found
:code :object-not-found
:hint "not found"))))
;; --- HELPERS
(defn get-team-id
[conn project-id]
(:team-id (db/get-by-id conn :project project-id {:columns [:team-id]})))
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; FEATURES: pointer-map
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(defn check-features-compatibility!
[features]
(let [not-supported (set/difference features supported-features)]
(when (seq not-supported)
(ex/raise :type :restriction
:code :features-not-supported
:feature (first not-supported)
:hint (format "features %s not supported" (str/join "," not-supported))))
features))
(defn load-pointer
[conn file-id id]
(let [row (db/get conn :file-data-fragment
{:id id :file-id file-id}
{:columns [:content]
:check-deleted? false})]
(blob/decode (:content row))))
(defn persist-pointers!
[conn file-id]
(doseq [[id item] @pmap/*tracked*]
(when (pmap/modified? item)
(let [content (-> item deref blob/encode)]
(db/insert! conn :file-data-fragment
{:id id
:file-id file-id
:content content})))))
(defn process-pointers
[file update-fn]
(update file :data (fn resolve-fn [data]
(cond-> data
(contains? data :pages-index)
(update :pages-index resolve-fn)
:always
(update-vals (fn [val]
(if (pmap/pointer-map? val)
(update-fn val)
val)))))))
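;; A purely illustrative, hypothetical sketch of how the pointer-map helpers
;; above are meant to be used together; `conn` and `file-id` are made-up
;; locals, only the calling pattern is taken from the surrounding code.
(comment
  ;; Reading: bind the loader so that dereferencing a pointer-map value
  ;; lazily fetches its fragment from file_data_fragment, then force all
  ;; pointers in the file data.
  (binding [pmap/*load-fn* (partial load-pointer conn file-id)]
    (-> (db/get-by-id conn :file file-id)
        (decode-row)
        (process-pointers deref)))

  ;; Writing: track modified pointer maps while mutating the data, then
  ;; flush them back to the database in the same transaction.
  (binding [pmap/*tracked* (atom {})]
    ;; ... modify the file data here ...
    (persist-pointers! conn file-id)))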
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; QUERY COMMANDS
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; --- Query: File Libraries used by a File
(defn handle-file-features
[{:keys [features] :as file} client-features]
(when (and (contains? features "components/v2")
(not (contains? client-features "components/v2")))
(ex/raise :type :restriction
:code :feature-mismatch
:feature "components/v2"
:hint "file has 'components/v2' feature enabled but frontend didn't specifies it"))
(declare retrieve-has-file-libraries)
(cond-> file
(and (contains? client-features "components/v2")
(not (contains? features "components/v2")))
(update :data ctf/migrate-to-components-v2)
(and (contains? features "storage/pointer-map")
(not (contains? client-features "storage/pointer-map")))
(process-pointers deref)))
;; --- COMMAND QUERY: get-file (by id)
(defn get-file
[conn id client-features]
;; here we check if client requested features are supported
(check-features-compatibility! client-features)
(binding [pmap/*load-fn* (partial load-pointer conn id)]
(-> (db/get-by-id conn :file id)
(decode-row)
(pmg/migrate-file)
(handle-file-features client-features))))
(defn get-minimal-file
[{:keys [pool] :as cfg} id]
(db/get pool :file {:id id} {:columns [:id :modified-at :revn]}))
(defn get-file-etag
[{:keys [modified-at revn]}]
(str (dt/format-instant modified-at :iso) "-" revn))
(s/def ::get-file
(s/keys :req-un [::profile-id ::id]
:opt-un [::features]))
(sv/defmethod ::get-file
"Retrieve a file by its ID. Only authenticated users."
{::doc/added "1.17"
::cond/get-object #(get-minimal-file %1 (:id %2))
::cond/key-fn get-file-etag}
[{:keys [pool] :as cfg} {:keys [profile-id id features] :as params}]
(with-open [conn (db/open pool)]
(let [perms (get-permissions conn profile-id id)]
(check-read-permissions! perms)
(let [file (-> (get-file conn id features)
(assoc :permissions perms))]
(vary-meta file assoc ::cond/key (get-file-etag file))))))
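;; A brief, hypothetical illustration of the key shape produced by
;; get-file-etag above and consumed by the ::cond/* metadata; the literal
;; value is only an approximation of what dt/format-instant returns.
(comment
  (get-file-etag {:modified-at (dt/now) :revn 42})
  ;; => e.g. "2022-12-14T15:56:01.000Z-42"
  )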
;; --- COMMAND QUERY: get-file-fragment (by id)
(defn- get-file-fragment
[conn file-id fragment-id]
(some-> (db/get conn :file-data-fragment {:file-id file-id :id fragment-id})
(update :content blob/decode)))
(s/def ::share-id ::us/uuid)
(s/def ::fragment-id ::us/uuid)
(s/def ::get-file-fragment
(s/keys :req-un [::file-id ::fragment-id]
:opt-un [::share-id ::profile-id]))
(sv/defmethod ::get-file-fragment
"Retrieve a file by its ID. Only authenticated users."
{::doc/added "1.17"
:auth false}
[{:keys [pool] :as cfg} {:keys [profile-id file-id fragment-id share-id] :as params}]
(with-open [conn (db/open pool)]
(let [perms (get-permissions conn profile-id file-id share-id)]
(check-read-permissions! perms)
(-> (get-file-fragment conn file-id fragment-id)
(rph/with-http-cache long-cache-duration)))))
;; --- COMMAND QUERY: get-file-object-thumbnails
(defn get-object-thumbnails
([conn file-id]
(let [sql (str/concat
"select object_id, data "
" from file_object_thumbnail"
" where file_id=?")]
(->> (db/exec! conn [sql file-id])
(d/index-by :object-id :data))))
([conn file-id object-ids]
(let [sql (str/concat
"select object_id, data "
" from file_object_thumbnail"
" where file_id=? and object_id = ANY(?)")
ids (db/create-array conn "text" (seq object-ids))]
(->> (db/exec! conn [sql file-id ids])
(d/index-by :object-id :data)))))
(s/def ::get-file-object-thumbnails
(s/keys :req-un [::profile-id ::file-id]))
(sv/defmethod ::get-file-object-thumbnails
"Retrieve a file object thumbnails."
{::doc/added "1.17"
::cond/get-object #(get-minimal-file %1 (:file-id %2))
::cond/reuse-key? true
::cond/key-fn get-file-etag}
[{:keys [pool] :as cfg} {:keys [profile-id file-id] :as params}]
(with-open [conn (db/open pool)]
(check-read-permissions! conn profile-id file-id)
(get-object-thumbnails conn file-id)))
;; --- COMMAND QUERY: get-project-files
(def ^:private sql:project-files
"select f.id,
f.project_id,
f.created_at,
f.modified_at,
f.name,
f.revn,
f.is_shared
from file as f
where f.project_id = ?
and f.deleted_at is null
order by f.modified_at desc")
(s/def ::get-project-files
(s/keys :req-un [::profile-id ::project-id]))
(defn get-project-files
[conn project-id]
(db/exec! conn [sql:project-files project-id]))
(sv/defmethod ::get-project-files
"Get all files for the specified project."
{::doc/added "1.17"}
[{:keys [pool] :as cfg} {:keys [profile-id project-id] :as params}]
(with-open [conn (db/open pool)]
(projects/check-read-permissions! conn profile-id project-id)
(get-project-files conn project-id)))
;; --- COMMAND QUERY: has-file-libraries
(declare get-has-file-libraries)
(s/def ::file-id ::us/uuid)
(s/def ::profile-id ::us/uuid)
@@ -32,8 +379,8 @@
{::doc/added "1.15.1"}
[{:keys [pool] :as cfg} {:keys [profile-id file-id] :as params}]
(with-open [conn (db/open pool)]
(files/check-read-permissions! pool profile-id file-id)
(retrieve-has-file-libraries conn params)))
(check-read-permissions! pool profile-id file-id)
(get-has-file-libraries conn params)))
(def ^:private sql:has-file-libraries
"SELECT COUNT(*) > 0 AS has_libraries
@@ -43,8 +390,599 @@
AND (fl.deleted_at IS NULL OR
fl.deleted_at > now())")
(defn- retrieve-has-file-libraries
(defn- get-has-file-libraries
[conn {:keys [file-id]}]
(let [row (db/exec-one! conn [sql:has-file-libraries file-id])]
(:has-libraries row)))
;; --- QUERY COMMAND: get-page
(defn- prune-objects
"Given the page data and the object-id returns the page data with all
other not needed objects removed from the `:objects` data
structure."
[{:keys [objects] :as page} object-id]
(let [objects (cph/get-children-with-self objects object-id)]
(assoc page :objects (d/index-by :id objects))))
(defn- prune-thumbnails
"Given the page data, removes the `:thumbnail` prop from all
shapes."
[page]
(update page :objects update-vals #(dissoc % :thumbnail)))
(defn get-page
[conn {:keys [file-id page-id object-id features]}]
(let [file (get-file conn file-id features)
page-id (or page-id (-> file :data :pages first))
page (dm/get-in file [:data :pages-index page-id])]
(cond-> (prune-thumbnails page)
(uuid? object-id)
(prune-objects object-id))))
(s/def ::page-id ::us/uuid)
(s/def ::object-id ::us/uuid)
(s/def ::get-page
(s/and
(s/keys :req-un [::profile-id ::file-id]
:opt-un [::page-id ::object-id ::features])
(fn [obj]
(if (contains? obj :object-id)
(contains? obj :page-id)
true))))
(sv/defmethod ::get-page
"Retrieves the page data from file and returns it. If no page-id is
specified, the first page will be returned. If object-id is
specified, only that object and its children will be returned in the
page objects data structure.
If you specify the object-id, the page-id parameter becomes
mandatory.
Mainly used for rendering purposes."
{::doc/added "1.17"}
[{:keys [pool] :as cfg} {:keys [profile-id file-id] :as params}]
(with-open [conn (db/open pool)]
(check-read-permissions! conn profile-id file-id)
(get-page conn params)))
;; --- COMMAND QUERY: get-team-shared-files
(def ^:private sql:team-shared-files
"select f.id,
f.revn,
f.data,
f.project_id,
f.created_at,
f.modified_at,
f.name,
f.is_shared
from file as f
inner join project as p on (p.id = f.project_id)
where f.is_shared = true
and f.deleted_at is null
and p.deleted_at is null
and p.team_id = ?
order by f.modified_at desc")
(defn get-team-shared-files
[conn {:keys [team-id] :as params}]
(let [assets-sample
(fn [assets limit]
(let [sorted-assets (->> (vals assets)
(sort-by #(str/lower (:name %))))]
{:count (count sorted-assets)
:sample (into [] (take limit sorted-assets))}))
library-summary
(fn [data]
{:components (assets-sample (:components data) 4)
:colors (assets-sample (:colors data) 3)
:typographies (assets-sample (:typographies data) 3)})
xform (comp
(map decode-row)
(map #(assoc % :library-summary (library-summary (:data %))))
(map #(dissoc % :data)))]
(into #{} xform (db/exec! conn [sql:team-shared-files team-id]))))
(s/def ::get-team-shared-files
(s/keys :req-un [::profile-id ::team-id]))
(sv/defmethod ::get-team-shared-files
"Get all file (libraries) for the specified team."
{::doc/added "1.17"}
[{:keys [pool] :as cfg} params]
(with-open [conn (db/open pool)]
(get-team-shared-files conn params)))
;; --- COMMAND QUERY: get-file-libraries
(def ^:private sql:file-libraries
"WITH RECURSIVE libs AS (
SELECT fl.*, flr.synced_at
FROM file AS fl
JOIN file_library_rel AS flr ON (flr.library_file_id = fl.id)
WHERE flr.file_id = ?::uuid
UNION
SELECT fl.*, flr.synced_at
FROM file AS fl
JOIN file_library_rel AS flr ON (flr.library_file_id = fl.id)
JOIN libs AS l ON (flr.file_id = l.id)
)
SELECT l.id,
l.data,
l.features,
l.project_id,
l.created_at,
l.modified_at,
l.deleted_at,
l.name,
l.revn,
l.synced_at
FROM libs AS l
WHERE l.deleted_at IS NULL OR l.deleted_at > now();")
(defn get-file-libraries
[conn file-id client-features]
(check-features-compatibility! client-features)
(->> (db/exec! conn [sql:file-libraries file-id])
(mapv (fn [{:keys [id] :as row}]
(binding [pmap/*load-fn* (partial load-pointer conn id)]
(-> (decode-row row)
(assoc :is-indirect false)
(update :data dissoc :pages-index)
(handle-file-features client-features)))))))
(s/def ::get-file-libraries
(s/keys :req-un [::profile-id ::file-id]
:opt-un [::features]))
(sv/defmethod ::get-file-libraries
"Get libraries used by the specified file."
{::doc/added "1.17"}
[{:keys [pool] :as cfg} {:keys [profile-id file-id features] :as params}]
(with-open [conn (db/open pool)]
(check-read-permissions! conn profile-id file-id)
(get-file-libraries conn file-id features)))
;; --- COMMAND QUERY: Files that use this File library
(def ^:private sql:library-using-files
"SELECT f.id,
f.name
FROM file_library_rel AS flr
JOIN file AS f ON (f.id = flr.file_id)
WHERE flr.library_file_id = ?
AND (f.deleted_at IS NULL OR f.deleted_at > now())")
(defn get-library-file-references
[conn file-id]
(db/exec! conn [sql:library-using-files file-id]))
(s/def ::get-library-file-references
(s/keys :req-un [::profile-id ::file-id]))
(sv/defmethod ::get-library-file-references
"Returns all the file references that use specified file (library) id."
{::doc/added "1.17"}
[{:keys [pool] :as cfg} {:keys [profile-id file-id] :as params}]
(with-open [conn (db/open pool)]
(check-read-permissions! conn profile-id file-id)
(get-library-file-references conn file-id)))
;; --- COMMAND QUERY: get-team-recent-files
(def sql:team-recent-files
"with recent_files as (
select f.id,
f.revn,
f.project_id,
f.created_at,
f.modified_at,
f.name,
f.is_shared,
row_number() over w as row_num
from file as f
join project as p on (p.id = f.project_id)
where p.team_id = ?
and p.deleted_at is null
and f.deleted_at is null
window w as (partition by f.project_id order by f.modified_at desc)
order by f.modified_at desc
)
select * from recent_files where row_num <= 10;")
(defn get-team-recent-files
[conn team-id]
(db/exec! conn [sql:team-recent-files team-id]))
(s/def ::get-team-recent-files
(s/keys :req-un [::profile-id ::team-id]))
(sv/defmethod ::get-team-recent-files
{::doc/added "1.17"}
[{:keys [pool] :as cfg} {:keys [profile-id team-id]}]
(with-open [conn (db/open pool)]
(teams/check-read-permissions! conn profile-id team-id)
(get-team-recent-files conn team-id)))
;; --- COMMAND QUERY: get-file-thumbnail
(defn get-file-thumbnail
[conn file-id revn]
(let [sql (sql/select :file-thumbnail
(cond-> {:file-id file-id}
revn (assoc :revn revn))
{:limit 1
:order-by [[:revn :desc]]})
row (db/exec-one! conn sql)]
(when-not row
(ex/raise :type :not-found
:code :file-thumbnail-not-found))
{:data (:data row)
:props (some-> (:props row) db/decode-transit-pgobject)
:revn (:revn row)
:file-id (:file-id row)}))
(s/def ::revn ::us/integer)
(s/def ::get-file-thumbnail
(s/keys :req-un [::profile-id ::file-id]
:opt-un [::revn]))
(sv/defmethod ::get-file-thumbnail
{::doc/added "1.17"}
[{:keys [pool]} {:keys [profile-id file-id revn]}]
(with-open [conn (db/open pool)]
(check-read-permissions! conn profile-id file-id)
(-> (get-file-thumbnail conn file-id revn)
(rph/with-http-cache long-cache-duration))))
;; --- COMMAND QUERY: get-file-data-for-thumbnail
(defn get-file-data-for-thumbnail
[conn {:keys [data id] :as file}]
(letfn [;; function responsible for finding the frame marked to be
;; used as thumbnail; the returned frame always has
;; the :page-id set to the page it belongs to.
(get-thumbnail-frame [data]
(d/seek :use-for-thumbnail?
(for [page (-> data :pages-index vals)
frame (-> page :objects ctt/get-frames)]
(assoc frame :page-id (:id page)))))
;; function responsible for filtering all unneeded shapes out of
;; the objects data structure when a concrete frame is provided.
;; If there is no frame, the objects are returned untouched.
(filter-objects [objects frame-id]
(d/index-by :id (cph/get-children-with-self objects frame-id)))
;; function responsible for assoc'ing the available thumbnails
;; to frames and removing all children shapes from objects when
;; a thumbnail is available
(assoc-thumbnails [objects page-id thumbnails]
(loop [objects objects
frames (filter cph/frame-shape? (vals objects))]
(if-let [frame (-> frames first)]
(let [frame-id (:id frame)
object-id (str page-id frame-id)
frame (if-let [thumb (get thumbnails object-id)]
(assoc frame :thumbnail thumb :shapes [])
(dissoc frame :thumbnail))
children-ids
(cph/get-children-ids objects frame-id)
bounds
(when (:show-content frame)
(gsh/selection-rect (concat [frame] (->> children-ids (map (d/getf objects))))))
frame
(cond-> frame
(some? bounds)
(assoc :children-bounds bounds))]
(if (:thumbnail frame)
(recur (-> objects
(assoc frame-id frame)
(d/without-keys children-ids))
(rest frames))
(recur (assoc objects frame-id frame)
(rest frames))))
objects)))]
(let [frame (get-thumbnail-frame data)
frame-id (:id frame)
page-id (or (:page-id frame)
(-> data :pages first))
page (dm/get-in data [:pages-index page-id])
frame-ids (if (some? frame) (list frame-id) (map :id (ctt/get-frames (:objects page))))
obj-ids (map #(str page-id %) frame-ids)
thumbs (get-object-thumbnails conn id obj-ids)]
(cond-> page
;; If we have a frame, we need to specify it at the page level
;; and remove all other unrelated objects.
(some? frame-id)
(-> (assoc :thumbnail-frame-id frame-id)
(update :objects filter-objects frame-id))
;; Assoc the available thumbnails and prune non-visible shapes
;; to avoid transferring unnecessary data.
:always
(update :objects assoc-thumbnails page-id thumbs)))))
(s/def ::get-file-data-for-thumbnail
(s/keys :req-un [::profile-id ::file-id]
:opt-un [::features]))
(sv/defmethod ::get-file-data-for-thumbnail
"Retrieves the data for generate the thumbnail of the file. Used
mainly for render thumbnails on dashboard."
{::doc/added "1.17"}
[{:keys [pool] :as cfg} {:keys [profile-id file-id features] :as props}]
(with-open [conn (db/open pool)]
(check-read-permissions! conn profile-id file-id)
(let [file (get-file conn file-id features)]
{:file-id file-id
:revn (:revn file)
:page (get-file-data-for-thumbnail conn file)})))
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; MUTATION COMMANDS
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; --- MUTATION COMMAND: rename-file
(defn rename-file
[conn {:keys [id name] :as params}]
(-> (db/update! conn :file
{:name name
:modified-at (dt/now)}
{:id id})
(select-keys [:id :name :created-at :modified-at])))
(s/def ::rename-file
(s/keys :req-un [::profile-id ::name ::id]))
(sv/defmethod ::rename-file
{::doc/added "1.17"
::webhooks/event? true}
[{:keys [pool] :as cfg} {:keys [id profile-id] :as params}]
(db/with-atomic [conn pool]
(check-edition-permissions! conn profile-id id)
(rename-file conn params)))
;; --- MUTATION COMMAND: set-file-shared
(defn unlink-files
[conn {:keys [id] :as params}]
(db/delete! conn :file-library-rel {:library-file-id id}))
(defn set-file-shared
[conn {:keys [id is-shared] :as params}]
(-> (db/update! conn :file
{:is-shared is-shared}
{:id id})
(select-keys [:id :name :is-shared])))
(defn absorb-library
"Find all files using a shared library, and absorb all library assets
into the file local libraries"
[conn {:keys [id] :as params}]
(let [library (db/get-by-id conn :file id)]
(when (:is-shared library)
(let [ldata (-> library decode-row pmg/migrate-file :data)]
(->> (db/query conn :file-library-rel {:library-file-id id})
(map :file-id)
(keep #(db/get-by-id conn :file % {:check-deleted? false}))
(map decode-row)
(map pmg/migrate-file)
(run! (fn [{:keys [id data revn] :as file}]
(let [data (ctf/absorb-assets data ldata)]
(db/update! conn :file
{:revn (inc revn)
:data (blob/encode data)
:modified-at (dt/now)}
{:id id})))))))))
(s/def ::set-file-shared
(s/keys :req-un [::profile-id ::id ::is-shared]))
(sv/defmethod ::set-file-shared
{::doc/added "1.17"
::webhooks/event? true}
[{:keys [pool] :as cfg} {:keys [id profile-id is-shared] :as params}]
(db/with-atomic [conn pool]
(check-edition-permissions! conn profile-id id)
(when-not is-shared
(absorb-library conn params)
(unlink-files conn params))
(set-file-shared conn params)))
;; --- MUTATION COMMAND: delete-file
(defn mark-file-deleted
[conn {:keys [id] :as params}]
(db/update! conn :file
{:deleted-at (dt/now)}
{:id id})
nil)
(s/def ::delete-file
(s/keys :req-un [::id ::profile-id]))
(sv/defmethod ::delete-file
{::doc/added "1.17"
::webhooks/event? true}
[{:keys [pool] :as cfg} {:keys [id profile-id] :as params}]
(db/with-atomic [conn pool]
(check-edition-permissions! conn profile-id id)
(absorb-library conn params)
(mark-file-deleted conn params)))
;; --- MUTATION COMMAND: link-file-to-library
(def sql:link-file-to-library
"insert into file_library_rel (file_id, library_file_id)
values (?, ?)
on conflict do nothing;")
(defn link-file-to-library
[conn {:keys [file-id library-id] :as params}]
(db/exec-one! conn [sql:link-file-to-library file-id library-id]))
(s/def ::link-file-to-library
(s/keys :req-un [::profile-id ::file-id ::library-id]))
(sv/defmethod ::link-file-to-library
{::doc/added "1.17"
::webhooks/event? true}
[{:keys [pool] :as cfg} {:keys [profile-id file-id library-id] :as params}]
(when (= file-id library-id)
(ex/raise :type :validation
:code :invalid-library
:hint "A file cannot be linked to itself"))
(db/with-atomic [conn pool]
(check-edition-permissions! conn profile-id file-id)
(check-edition-permissions! conn profile-id library-id)
(link-file-to-library conn params)))
;; --- MUTATION COMMAND: unlink-file-from-library
(defn unlink-file-from-library
[conn {:keys [file-id library-id] :as params}]
(db/delete! conn :file-library-rel
{:file-id file-id
:library-file-id library-id}))
(s/def ::unlink-file-from-library
(s/keys :req-un [::profile-id ::file-id ::library-id]))
(sv/defmethod ::unlink-file-from-library
{::doc/added "1.17"
::webhooks/event? true}
[{:keys [pool] :as cfg} {:keys [profile-id file-id] :as params}]
(db/with-atomic [conn pool]
(check-edition-permissions! conn profile-id file-id)
(unlink-file-from-library conn params)))
;; --- MUTATION COMMAND: update-sync
(defn update-sync
[conn {:keys [file-id library-id] :as params}]
(db/update! conn :file-library-rel
{:synced-at (dt/now)}
{:file-id file-id
:library-file-id library-id}))
(s/def ::update-file-library-sync-status
(s/keys :req-un [::profile-id ::file-id ::library-id]))
;; TODO: improve naming
(sv/defmethod ::update-file-library-sync-status
"Update the synchronization statos of a file->library link"
{::doc/added "1.17"}
[{:keys [pool] :as cfg} {:keys [profile-id file-id] :as params}]
(db/with-atomic [conn pool]
(check-edition-permissions! conn profile-id file-id)
(update-sync conn params)))
;; --- MUTATION COMMAND: ignore-sync
(defn ignore-sync
[conn {:keys [file-id date] :as params}]
(db/update! conn :file
{:ignore-sync-until date}
{:id file-id}))
(s/def ::ignore-file-library-sync-status
(s/keys :req-un [::profile-id ::file-id ::date]))
;; TODO: improve naming
(sv/defmethod ::ignore-file-library-sync-status
"Ignore updates in linked files"
{::doc/added "1.17"}
[{:keys [pool] :as cfg} {:keys [profile-id file-id] :as params}]
(db/with-atomic [conn pool]
(check-edition-permissions! conn profile-id file-id)
(ignore-sync conn params)))
;; --- MUTATION COMMAND: upsert-file-object-thumbnail
(def sql:upsert-object-thumbnail
"insert into file_object_thumbnail(file_id, object_id, data)
values (?, ?, ?)
on conflict(file_id, object_id) do
update set data = ?;")
(defn upsert-file-object-thumbnail!
[conn {:keys [file-id object-id data]}]
(if data
(db/exec-one! conn [sql:upsert-object-thumbnail file-id object-id data data])
(db/delete! conn :file-object-thumbnail {:file-id file-id :object-id object-id})))
(s/def ::data (s/nilable ::us/string))
(s/def ::thumbs/object-id ::us/string)
(s/def ::upsert-file-object-thumbnail
(s/keys :req-un [::profile-id ::file-id ::thumbs/object-id ::data]))
(sv/defmethod ::upsert-file-object-thumbnail
{::doc/added "1.17"}
[{:keys [pool] :as cfg} {:keys [profile-id file-id] :as params}]
(db/with-atomic [conn pool]
(check-edition-permissions! conn profile-id file-id)
(upsert-file-object-thumbnail! conn params)
nil))
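;; A hypothetical usage sketch of upsert-file-object-thumbnail! above;
;; `conn`, `file-id`, `page-id` and `frame-id` are made-up locals. The
;; object-id is the page-id and frame-id concatenated (see
;; get-file-data-for-thumbnail), and passing a nil :data deletes the row.
(comment
  ;; create or replace the thumbnail of a frame
  (upsert-file-object-thumbnail! conn {:file-id file-id
                                       :object-id (str page-id frame-id)
                                       :data "<svg ...>"})
  ;; remove it again
  (upsert-file-object-thumbnail! conn {:file-id file-id
                                       :object-id (str page-id frame-id)
                                       :data nil}))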
;; --- MUTATION COMMAND: upsert-file-thumbnail
(def sql:upsert-file-thumbnail
"insert into file_thumbnail (file_id, revn, data, props)
values (?, ?, ?, ?::jsonb)
on conflict(file_id, revn) do
update set data = ?, props=?, updated_at=now();")
(defn upsert-file-thumbnail
[conn {:keys [file-id revn data props]}]
(let [props (db/tjson (or props {}))]
(db/exec-one! conn [sql:upsert-file-thumbnail
file-id revn data props data props])))
(s/def ::revn ::us/integer)
(s/def ::props map?)
(s/def ::upsert-file-thumbnail
(s/keys :req-un [::profile-id ::file-id ::revn ::data ::props]))
(sv/defmethod ::upsert-file-thumbnail
"Creates or updates the file thumbnail. Mainly used for paint the
grid thumbnails."
{::doc/added "1.17"}
[{:keys [pool] :as cfg} {:keys [profile-id file-id] :as params}]
(db/with-atomic [conn pool]
(check-edition-permissions! conn profile-id file-id)
(upsert-file-thumbnail conn params)
nil))

View File

@@ -0,0 +1,87 @@
;; This Source Code Form is subject to the terms of the Mozilla Public
;; License, v. 2.0. If a copy of the MPL was not distributed with this
;; file, You can obtain one at http://mozilla.org/MPL/2.0/.
;;
;; Copyright (c) KALEIDOS INC
(ns app.rpc.commands.files.create
(:require
[app.common.data :as d]
[app.common.files.features :as ffeat]
[app.common.types.file :as ctf]
[app.common.uuid :as uuid]
[app.db :as db]
[app.loggers.audit :as-alias audit]
[app.loggers.webhooks :as-alias webhooks]
[app.rpc.commands.files :as files]
[app.rpc.doc :as-alias doc]
[app.rpc.permissions :as perms]
[app.rpc.queries.projects :as proj]
[app.util.blob :as blob]
[app.util.objects-map :as omap]
[app.util.pointer-map :as pmap]
[app.util.services :as sv]
[clojure.spec.alpha :as s]))
(defn create-file-role!
[conn {:keys [file-id profile-id role]}]
(let [params {:file-id file-id
:profile-id profile-id}]
(->> (perms/assign-role-flags params role)
(db/insert! conn :file-profile-rel))))
(defn create-file
[conn {:keys [id name project-id is-shared data revn
modified-at deleted-at create-page
ignore-sync-until features]
:or {is-shared false revn 0 create-page true}
:as params}]
(let [id (or id (:id data) (uuid/next))
features (-> (into files/default-features features)
(files/check-features-compatibility!))
data (or data
(binding [ffeat/*current* features
ffeat/*wrap-with-objects-map-fn* (if (features "storate/objects-map") omap/wrap identity)
ffeat/*wrap-with-pointer-map-fn* (if (features "storage/pointer-map") pmap/wrap identity)]
(if create-page
(ctf/make-file-data id)
(ctf/make-file-data id nil))))
features (db/create-array conn "text" features)
file (db/insert! conn :file
(d/without-nils
{:id id
:project-id project-id
:name name
:revn revn
:is-shared is-shared
:data (blob/encode data)
:features features
:ignore-sync-until ignore-sync-until
:modified-at modified-at
:deleted-at deleted-at}))]
(->> (assoc params :file-id id :role :owner)
(create-file-role! conn))
(files/decode-row file)))
(s/def ::create-file
(s/keys :req-un [::files/profile-id
::files/name
::files/project-id]
:opt-un [::files/id
::files/is-shared
::files/features]))
(sv/defmethod ::create-file
{::doc/added "1.17"
::webhooks/event? true}
[{:keys [pool] :as cfg} {:keys [profile-id project-id] :as params}]
(db/with-atomic [conn pool]
(proj/check-edition-permissions! conn profile-id project-id)
(let [team-id (files/get-team-id conn project-id)]
(-> (create-file conn params)
(vary-meta assoc ::audit/props {:team-id team-id})))))

View File

@@ -0,0 +1,106 @@
;; This Source Code Form is subject to the terms of the Mozilla Public
;; License, v. 2.0. If a copy of the MPL was not distributed with this
;; file, You can obtain one at http://mozilla.org/MPL/2.0/.
;;
;; Copyright (c) KALEIDOS INC
(ns app.rpc.commands.files.temp
(:require
[app.common.exceptions :as ex]
[app.common.pages :as cp]
[app.common.spec :as us]
[app.common.uuid :as uuid]
[app.db :as db]
[app.rpc.commands.files :as files]
[app.rpc.commands.files.create :as files.create]
[app.rpc.commands.files.update :as files.update]
[app.rpc.doc :as-alias doc]
[app.rpc.queries.projects :as proj]
[app.util.blob :as blob]
[app.util.services :as sv]
[app.util.time :as dt]
[clojure.spec.alpha :as s]))
;; --- MUTATION COMMAND: create-temp-file
(s/def ::create-page ::us/boolean)
(s/def ::create-temp-file
(s/keys :req-un [::files/profile-id
::files/name
::files/project-id]
:opt-un [::files/id
::files/is-shared
::files/features
::create-page]))
(sv/defmethod ::create-temp-file
{::doc/added "1.17"}
[{:keys [pool] :as cfg} {:keys [profile-id project-id] :as params}]
(db/with-atomic [conn pool]
(proj/check-edition-permissions! conn profile-id project-id)
(files.create/create-file conn (assoc params :deleted-at (dt/in-future {:days 1})))))
;; --- MUTATION COMMAND: update-temp-file
(defn update-temp-file
[conn {:keys [profile-id session-id id revn changes] :as params}]
(db/insert! conn :file-change
{:id (uuid/next)
:session-id session-id
:profile-id profile-id
:created-at (dt/now)
:file-id id
:revn revn
:data nil
:changes (blob/encode changes)}))
(s/def ::update-temp-file
(s/keys :req-un [::files.update/changes
::files.update/revn
::files.update/session-id
::files/id]))
(sv/defmethod ::update-temp-file
{::doc/added "1.17"}
[{:keys [pool] :as cfg} params]
(db/with-atomic [conn pool]
(update-temp-file conn params)
nil))
;; --- MUTATION COMMAND: persist-temp-file
(defn persist-temp-file
[conn {:keys [id] :as params}]
(let [file (db/get-by-id conn :file id)
revs (db/query conn :file-change
{:file-id id}
{:order-by [[:revn :asc]]})
revn (count revs)]
(when (nil? (:deleted-at file))
(ex/raise :type :validation
:code :cant-persist-already-persisted-file))
(loop [revs (seq revs)
data (blob/decode (:data file))]
(if-let [rev (first revs)]
(recur (rest revs)
(->> rev :changes blob/decode (cp/process-changes data)))
(db/update! conn :file
{:deleted-at nil
:revn revn
:data (blob/encode data)}
{:id id})))
nil))
(s/def ::persist-temp-file
(s/keys :req-un [::files/id
::files/profile-id]))
(sv/defmethod ::persist-temp-file
{::doc/added "1.17"}
[{:keys [pool] :as cfg} {:keys [id profile-id] :as params}]
(db/with-atomic [conn pool]
(files/check-edition-permissions! conn profile-id id)
(persist-temp-file conn params)))

View File

@@ -0,0 +1,304 @@
;; This Source Code Form is subject to the terms of the Mozilla Public
;; License, v. 2.0. If a copy of the MPL was not distributed with this
;; file, You can obtain one at http://mozilla.org/MPL/2.0/.
;;
;; Copyright (c) KALEIDOS INC
(ns app.rpc.commands.files.update
(:require
[app.common.exceptions :as ex]
[app.common.files.features :as ffeat]
[app.common.logging :as l]
[app.common.pages :as cp]
[app.common.pages.migrations :as pmg]
[app.common.spec :as us]
[app.common.types.file :as ctf]
[app.common.uuid :as uuid]
[app.config :as cf]
[app.db :as db]
[app.loggers.audit :as audit]
[app.loggers.webhooks :as-alias webhooks]
[app.metrics :as mtx]
[app.msgbus :as mbus]
[app.rpc.climit :as-alias climit]
[app.rpc.commands.files :as files]
[app.rpc.doc :as-alias doc]
[app.rpc.helpers :as rph]
[app.util.blob :as blob]
[app.util.objects-map :as omap]
[app.util.pointer-map :as pmap]
[app.util.services :as sv]
[app.util.time :as dt]
[clojure.spec.alpha :as s]))
;; --- SPECS
(s/def ::changes
(s/coll-of map? :kind vector?))
(s/def ::hint-origin ::us/keyword)
(s/def ::hint-events
(s/every ::us/keyword :kind vector?))
(s/def ::change-with-metadata
(s/keys :req-un [::changes]
:opt-un [::hint-origin
::hint-events]))
(s/def ::changes-with-metadata
(s/every ::change-with-metadata :kind vector?))
(s/def ::session-id ::us/uuid)
(s/def ::revn ::us/integer)
(s/def ::update-file
(s/and
(s/keys :req-un [::files/id ::files/profile-id ::session-id ::revn]
:opt-un [::changes ::changes-with-metadata ::features])
(fn [o]
(or (contains? o :changes)
(contains? o :changes-with-metadata)))))
;; --- HELPERS
;; File changes that affect the library and must be notified
;; to all clients using it.
(def ^:private library-change-types
#{:add-color :mod-color :del-color
:add-media :mod-media :del-media
:add-component :mod-component :del-component
:add-typography :mod-typography :del-typography})
(def ^:private file-change-types
#{:add-obj :mod-obj :del-obj
:reg-objects :mov-objects})
(defn- library-change?
[{:keys [type] :as change}]
(or (contains? library-change-types type)
(and (contains? file-change-types type)
(some? (:component-id change)))))
(def ^:private sql:get-file
"SELECT f.*, p.team_id
FROM file AS f
JOIN project AS p ON (p.id = f.project_id)
WHERE f.id = ?
AND (f.deleted_at IS NULL OR
f.deleted_at > now())
FOR KEY SHARE")
(defn get-file
[conn id]
(let [file (db/exec-one! conn [sql:get-file id])]
(when-not file
(ex/raise :type :not-found
:code :object-not-found
:hint (format "file with id '%s' does not exists" id)))
(update file :features db/decode-pgarray #{})))
(defn- wrap-with-pointer-map-context
[f]
(fn [{:keys [conn] :as cfg} {:keys [id] :as file}]
(binding [pmap/*tracked* (atom {})
pmap/*load-fn* (partial files/load-pointer conn id)
ffeat/*wrap-with-pointer-map-fn* pmap/wrap]
(let [result (f cfg file)]
(files/persist-pointers! conn id)
result))))
(defn- wrap-with-objects-map-context
[f]
(fn [cfg file]
(binding [ffeat/*wrap-with-objects-map-fn* omap/wrap]
(f cfg file))))
(declare get-lagged-changes)
(declare send-notifications!)
(declare update-file)
(declare update-file*)
(declare take-snapshot?)
;; If features are specified in the params and the final feature
;; set is different from the persisted one, update it in the
;; database.
(defn webhook-batch-keyfn
[props]
(str "rpc:update-file:" (:id props)))
(sv/defmethod ::update-file
{::climit/queue :update-file
::climit/key-fn :id
::webhooks/event? true
::webhooks/batch-timeout (dt/duration "2s")
::webhooks/batch-key webhook-batch-keyfn
::doc/added "1.17"}
[{:keys [pool] :as cfg} {:keys [id profile-id] :as params}]
(db/with-atomic [conn pool]
(files/check-edition-permissions! conn profile-id id)
(db/xact-lock! conn id)
(let [cfg (assoc cfg :conn conn)
tpoint (dt/tpoint)]
(-> (update-file cfg params)
(rph/with-defer #(let [elapsed (tpoint)]
(l/trace :hint "update-file" :time (dt/format-duration elapsed))))))))
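;; A small illustration of the batch key defined above. The combination of
;; ::webhooks/batch-key and ::webhooks/batch-timeout suggests that
;; update-file events sharing the same key within the 2s window are grouped
;; into a single webhook delivery, but that behavior lives in the webhooks
;; worker, not here.
(comment
  (webhook-batch-keyfn {:id #uuid "11111111-2222-3333-4444-555555555555"})
  ;; => "rpc:update-file:11111111-2222-3333-4444-555555555555"
  )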
(defn update-file
[{:keys [conn metrics] :as cfg} {:keys [id profile-id changes changes-with-metadata] :as params}]
(let [file (get-file conn id)
features (->> (concat (:features file)
(:features params))
(into files/default-features)
(files/check-features-compatibility!))]
(files/check-edition-permissions! conn profile-id (:id file))
(binding [ffeat/*current* features
ffeat/*previous* (:features file)]
(let [update-fn (cond-> update-file*
(contains? features "storage/pointer-map")
(wrap-with-pointer-map-context)
(contains? features "storage/objects-map")
(wrap-with-objects-map-context))
file (assoc file :features features)
changes (if changes-with-metadata
(->> changes-with-metadata (mapcat :changes) vec)
(vec changes))
params (assoc params :file file :changes changes)]
(mtx/run! metrics {:id :update-file-changes :inc (count changes)})
(when (not= features (:features file))
(let [features (db/create-array conn "text" features)]
(db/update! conn :file
{:features features}
{:id id})))
(-> (update-fn cfg params)
(vary-meta assoc ::audit/replace-props
{:id (:id file)
:name (:name file)
:features (:features file)
:project-id (:project-id file)
:team-id (:team-id file)}))))))
(defn- update-file*
[{:keys [conn] :as cfg} {:keys [file changes session-id profile-id] :as params}]
(when (> (:revn params)
(:revn file))
(ex/raise :type :validation
:code :revn-conflict
:hint "The incoming revision number is greater that stored version."
:context {:incoming-revn (:revn params)
:stored-revn (:revn file)}))
(let [ts (dt/now)
file (-> file
(update :revn inc)
(update :data (fn [data]
(cond-> data
:always
(-> (blob/decode)
(assoc :id (:id file))
(pmg/migrate-data))
(and (contains? ffeat/*current* "components/v2")
(not (contains? ffeat/*previous* "components/v2")))
(ctf/migrate-to-components-v2)
:always
(-> (cp/process-changes changes)
(blob/encode))))))]
(db/insert! conn :file-change
{:id (uuid/next)
:session-id session-id
:profile-id profile-id
:created-at ts
:file-id (:id file)
:revn (:revn file)
:features (db/create-array conn "text" (:features file))
:data (when (take-snapshot? file)
(:data file))
:changes (blob/encode changes)})
(db/update! conn :file
{:revn (:revn file)
:data (:data file)
:data-backend nil
:modified-at ts
:has-media-trimmed false}
{:id (:id file)})
(db/update! conn :project
{:modified-at ts}
{:id (:project-id file)})
(let [params (assoc params :file file)]
;; Send asynchronous notifications
(send-notifications! cfg params)
;; Retrieve and return lagged data
(get-lagged-changes conn params))))
(defn- take-snapshot?
"Defines the rule when file `data` snapshot should be saved."
[{:keys [revn modified-at] :as file}]
(let [freq (or (cf/get :file-change-snapshot-every) 20)
timeout (or (cf/get :file-change-snapshot-timeout)
(dt/duration {:hours 1}))]
(or (= 1 freq)
(zero? (mod revn freq))
(> (inst-ms (dt/diff modified-at (dt/now)))
(inst-ms timeout)))))
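;; A hypothetical reading of the snapshot rule above with the default
;; configuration (a snapshot every 20 revisions, or when the previous
;; modification is older than one hour).
(comment
  (take-snapshot? {:revn 40 :modified-at (dt/now)})  ;; => true, 40 is a multiple of 20
  (take-snapshot? {:revn 41 :modified-at (dt/now)})  ;; => false, recent and not a multiple
  )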
(def ^:private
sql:lagged-changes
"select s.id, s.revn, s.file_id,
s.session_id, s.changes
from file_change as s
where s.file_id = ?
and s.revn > ?
order by s.created_at asc")
(defn- get-lagged-changes
[conn params]
(->> (db/exec! conn [sql:lagged-changes (:id params) (:revn params)])
(into [] (comp (map files/decode-row)
(map (fn [row]
(cond-> row
(= (:revn row) (:revn (:file params)))
(assoc :changes []))))))))
(defn- send-notifications!
[{:keys [conn] :as cfg} {:keys [file changes session-id] :as params}]
(let [lchanges (filter library-change? changes)
msgbus (:msgbus cfg)]
;; Asynchronously publish message to the msgbus
(mbus/pub! msgbus
:topic (:id file)
:message {:type :file-change
:profile-id (:profile-id params)
:file-id (:id file)
:session-id (:session-id params)
:revn (:revn file)
:changes changes})
(when (and (:is-shared file) (seq lchanges))
(let [team-id (or (:team-id file)
(files/get-team-id conn (:project-id file)))]
;; Asynchronously publish message to the msgbus
(mbus/pub! msgbus
:topic team-id
:message {:type :library-change
:profile-id (:profile-id params)
:file-id (:id file)
:session-id session-id
:revn (:revn file)
:modified-at (dt/now)
:changes lchanges})))))

View File

@@ -10,9 +10,11 @@
[app.common.exceptions :as ex]
[app.common.spec :as us]
[app.db :as db]
[app.http.session :as session]
[app.loggers.audit :as-alias audit]
[app.rpc.commands.auth :as cmd.auth]
[app.rpc.doc :as-alias doc]
[app.rpc.helpers :as rph]
[app.rpc.queries.profile :as profile]
[app.util.services :as sv]
[clojure.spec.alpha :as s]))
@@ -61,15 +63,16 @@
:member-id (:id profile)
:member-email (:email profile))
token (tokens :generate claims)]
(with-meta {:invitation-token token}
{:transform-response ((:create session) (:id profile))
::audit/props (:props profile)
::audit/profile-id (:id profile)}))
(with-meta profile
{:transform-response ((:create session) (:id profile))
::audit/props (:props profile)
::audit/profile-id (:id profile)})))))
(-> {:invitation-token token}
(rph/with-transform (session/create-fn session (:id profile)))
(rph/with-meta {::audit/props (:props profile)
::audit/profile-id (:id profile)})))
(-> profile
(rph/with-transform (session/create-fn session (:id profile)))
(rph/with-meta {::audit/props (:props profile)
::audit/profile-id (:id profile)}))))))
(defn- login-or-register
[{:keys [pool] :as cfg} info]

View File

@@ -14,11 +14,13 @@
[app.common.uuid :as uuid]
[app.db :as db]
[app.rpc.commands.binfile :as binfile]
[app.rpc.commands.files :as files]
[app.rpc.doc :as-alias doc]
[app.rpc.mutations.projects :refer [create-project-role create-project]]
[app.rpc.queries.projects :as proj]
[app.rpc.queries.teams :as teams]
[app.util.blob :as blob]
[app.util.pointer-map :as pmap]
[app.util.services :as sv]
[app.util.time :as dt]
[clojure.spec.alpha :as s]
@@ -53,7 +55,7 @@
(assoc key (get index (get item key) (get item key)))))
(defn- process-file
[file index]
[conn {:keys [id] :as file} index]
(letfn [(process-form [form]
(cond-> form
;; Relink library items
@@ -97,18 +99,25 @@
res)))
media
media))]
(update file :data
(fn [data]
(-> data
(blob/decode)
(assoc :id (:id file))
(pmg/migrate-data)
(update :pages-index relink-shapes)
(update :components relink-shapes)
(update :media relink-media)
(d/without-nils)
(blob/encode))))))
(-> file
(update :id #(get index %))
(update :data
(fn [data]
(binding [pmap/*load-fn* (partial files/load-pointer conn id)
pmap/*tracked* (atom {})]
(let [file-id (get index id)
data (-> data
(blob/decode)
(assoc :id file-id)
(pmg/migrate-data)
(update :pages-index relink-shapes)
(update :components relink-shapes)
(update :media relink-media)
(d/without-nils)
(files/process-pointers pmap/clone)
(blob/encode))]
(files/persist-pointers! conn file-id)
data)))))))
(def sql:retrieve-used-libraries
"select flr.*
@@ -166,9 +175,9 @@
file (-> file
(assoc :created-at now)
(assoc :modified-at now)
(assoc :ignore-sync-until ignore)
(update :id #(get index %))
(process-file index))]
(assoc :ignore-sync-until ignore))
file (process-file conn file index)]
(db/insert! conn :file file)
(db/insert! conn :file-profile-rel
@@ -194,7 +203,8 @@
(proj/check-edition-permissions! conn profile-id (:project-id file))
(db/exec-one! conn ["SET CONSTRAINTS ALL DEFERRED"])
(-> (duplicate-file* conn params {:reset-shared-flag true})
(update :data blob/decode))))
(update :data blob/decode)
(update :features db/decode-pgarray #{}))))
;; --- COMMAND: Duplicate Project

View File

@@ -0,0 +1,68 @@
;; This Source Code Form is subject to the terms of the Mozilla Public
;; License, v. 2.0. If a copy of the MPL was not distributed with this
;; file, You can obtain one at http://mozilla.org/MPL/2.0/.
;;
;; Copyright (c) KALEIDOS INC
(ns app.rpc.commands.search
(:require
[app.common.spec :as us]
[app.db :as db]
[app.rpc.doc :as-alias doc]
[app.util.services :as sv]
[clojure.spec.alpha :as s]))
(def ^:private sql:search-files
"with projects as (
select p.*
from project as p
inner join team_profile_rel as tpr on (tpr.team_id = p.team_id)
where tpr.profile_id = ?
and p.team_id = ?
and (p.deleted_at is null or p.deleted_at > now())
and (tpr.is_admin = true or
tpr.is_owner = true or
tpr.can_edit = true)
union
select p.*
from project as p
inner join project_profile_rel as ppr on (ppr.project_id = p.id)
where ppr.profile_id = ?
and p.team_id = ?
and (p.deleted_at is null or p.deleted_at > now())
and (ppr.is_admin = true or
ppr.is_owner = true or
ppr.can_edit = true)
)
select distinct
f.id,
f.project_id,
f.created_at,
f.modified_at,
f.name,
f.is_shared
from file as f
inner join projects as pr on (f.project_id = pr.id)
where f.name ilike ('%' || ? || '%')
order by f.created_at asc")
(defn search-files
[conn {:keys [profile-id team-id search-term] :as params}]
(db/exec! conn [sql:search-files
profile-id team-id
profile-id team-id
search-term]))
(s/def ::profile-id ::us/uuid)
(s/def ::team-id ::us/uuid)
(s/def ::search-term ::us/string)
(s/def ::search-files
(s/keys :req-un [::profile-id ::team-id]
:opt-un [::search-term]))
(sv/defmethod ::search-files
{::doc/added "1.17"}
[{:keys [pool]} {:keys [search-term] :as params}]
(when search-term
(search-files pool params)))

View File

@@ -9,8 +9,10 @@
[app.common.exceptions :as ex]
[app.common.spec :as us]
[app.db :as db]
[app.http.session :as session]
[app.loggers.audit :as audit]
[app.rpc.doc :as-alias doc]
[app.rpc.helpers :as rph]
[app.rpc.mutations.teams :as teams]
[app.rpc.queries.profile :as profile]
[app.tokens :as tokens]
@@ -46,7 +48,7 @@
{:email email}
{:id profile-id})
(with-meta claims
(rph/with-meta claims
{::audit/name "update-profile-email"
::audit/props {:email email}
::audit/profile-id profile-id}))
@@ -66,11 +68,11 @@
{:is-active true}
{:id (:id profile)}))
(with-meta claims
{:transform-response ((:create session) profile-id)
::audit/name "verify-profile-email"
::audit/props (audit/profile->props profile)
::audit/profile-id (:id profile)})))
(-> claims
(rph/with-transform (session/create-fn session profile-id))
(rph/with-meta {::audit/name "verify-profile-email"
::audit/props (audit/profile->props profile)
::audit/profile-id (:id profile)}))))
(defmethod process-token :auth
[{:keys [conn] :as cfg} _params {:keys [profile-id] :as claims}]
@@ -146,14 +148,13 @@
;; proceed with accepting the invitation and joining the
;; current profile to the invited team.
(let [profile (accept-invitation cfg claims invitation profile)]
(with-meta
(assoc claims :state :created)
{::audit/name "accept-team-invitation"
::audit/props (merge
(audit/profile->props profile)
{:team-id (:team-id claims)
:role (:role claims)})
::audit/profile-id profile-id}))
(-> (assoc claims :state :created)
(rph/with-meta {::audit/name "accept-team-invitation"
::audit/props (merge
(audit/profile->props profile)
{:team-id (:team-id claims)
:role (:role claims)})
::audit/profile-id profile-id})))
(ex/raise :type :validation
:code :invalid-token
@@ -169,15 +170,14 @@
{:email member-email})
{:columns [:id :email]})]
(let [profile (accept-invitation cfg claims invitation member)]
(with-meta
(assoc claims :state :created)
{:transform-response ((:create session) (:id profile))
::audit/name "accept-team-invitation"
::audit/props (merge
(audit/profile->props profile)
{:team-id (:team-id claims)
:role (:role claims)})
::audit/profile-id member-id}))
(-> (assoc claims :state :created)
(rph/with-transform (session/create-fn session (:id profile)))
(rph/with-meta {::audit/name "accept-team-invitation"
::audit/props (merge
(audit/profile->props profile)
{:team-id (:team-id claims)
:role (:role claims)})
::audit/profile-id member-id})))
{:invitation-token token
:iss :team-invitation

View File

@@ -0,0 +1,88 @@
;; This Source Code Form is subject to the terms of the Mozilla Public
;; License, v. 2.0. If a copy of the MPL was not distributed with this
;; file, You can obtain one at http://mozilla.org/MPL/2.0/.
;;
;; Copyright (c) KALEIDOS INC
(ns app.rpc.commands.viewer
(:require
[app.common.exceptions :as ex]
[app.db :as db]
[app.rpc.commands.comments :as comments]
[app.rpc.commands.files :as files]
[app.rpc.cond :as-alias cond]
[app.rpc.doc :as-alias doc]
[app.rpc.queries.share-link :as slnk]
[app.util.services :as sv]
[clojure.spec.alpha :as s]))
;; --- Query: View Only Bundle
(defn- get-project
[conn id]
(db/get-by-id conn :project id {:columns [:id :name :team-id]}))
(defn- get-bundle
[conn file-id profile-id features]
(let [file (files/get-file conn file-id features)
project (get-project conn (:project-id file))
libs (files/get-file-libraries conn file-id features)
users (comments/get-file-comments-users conn file-id profile-id)
links (->> (db/query conn :share-link {:file-id file-id})
(mapv slnk/decode-share-link-row))
fonts (db/query conn :team-font-variant
{:team-id (:team-id project)
:deleted-at nil})]
{:file file
:users users
:fonts fonts
:project project
:share-links links
:libraries libs}))
(defn- remove-not-allowed-pages
[data allowed]
(-> data
(update :pages (fn [pages] (filterv #(contains? allowed %) pages)))
(update :pages-index select-keys allowed)))
(defn get-view-only-bundle
[conn {:keys [profile-id file-id share-id features] :as params}]
(let [perms (files/get-permissions conn profile-id file-id share-id)
bundle (-> (get-bundle conn file-id profile-id features)
(assoc :permissions perms))]
;; When we have neither profile nor share, we just return a not
;; found response to the user.
(when-not perms
(ex/raise :type :not-found
:code :object-not-found
:hint "object not found"))
(update bundle :file
(fn [file]
(cond-> file
(= :share-link (:type perms))
(update :data remove-not-allowed-pages (:pages perms))
:always
(update :data select-keys [:id :options :pages :pages-index]))))))
(s/def ::get-view-only-bundle
(s/keys :req-un [::files/file-id]
:opt-un [::files/profile-id
::files/share-id
::files/features]))
(sv/defmethod ::get-view-only-bundle
{:auth false
::cond/get-object #(files/get-minimal-file %1 (:file-id %2))
::cond/key-fn files/get-file-etag
::cond/reuse-key? true
::doc/added "1.17"}
[{:keys [pool]} params]
(with-open [conn (db/open pool)]
(get-view-only-bundle conn params)))

View File

@@ -0,0 +1,143 @@
;; This Source Code Form is subject to the terms of the Mozilla Public
;; License, v. 2.0. If a copy of the MPL was not distributed with this
;; file, You can obtain one at http://mozilla.org/MPL/2.0/.
;;
;; Copyright (c) KALEIDOS INC
(ns app.rpc.commands.webhooks
(:require
[app.common.exceptions :as ex]
[app.common.spec :as us]
[app.common.uuid :as uuid]
[app.db :as db]
[app.http.client :as http]
[app.loggers.webhooks :as webhooks]
[app.rpc.doc :as-alias doc]
[app.rpc.queries.teams :refer [check-edition-permissions! check-read-permissions!]]
[app.util.services :as sv]
[app.util.time :as dt]
[app.worker :as-alias wrk]
[clojure.spec.alpha :as s]
[cuerdas.core :as str]
[promesa.core :as p]))
;; --- Mutation: Create Webhook
(s/def ::profile-id ::us/uuid)
(s/def ::team-id ::us/uuid)
(s/def ::uri ::us/not-empty-string)
(s/def ::is-active ::us/boolean)
(s/def ::mtype
#{"application/json"
"application/x-www-form-urlencoded"
"application/transit+json"})
(s/def ::create-webhook
(s/keys :req-un [::profile-id ::team-id ::uri ::mtype]
:opt-un [::is-active]))
;; NOTE: for now the quote is hardcoded, but this needs to be solved in
;; a more universal way to properly handle object quotes
(def max-hooks-for-team 8)
(defn- validate-webhook!
[cfg whook params]
(letfn [(handle-exception [exception]
(if-let [hint (webhooks/interpret-exception exception)]
(ex/raise :type :validation
:code :webhook-validation
:hint hint)
(ex/raise :type :internal
:code :webhook-validation
:cause exception)))
(handle-response [response]
(when-let [hint (webhooks/interpret-response response)]
(ex/raise :type :validation
:code :webhook-validation
:hint hint)))]
(if (not= (:uri whook) (:uri params))
(->> (http/req! cfg {:method :head
:uri (:uri params)
:timeout (dt/duration "3s")})
(p/hmap (fn [response exception]
(if exception
(handle-exception exception)
(handle-response response)))))
(p/resolved nil))))
(defn- validate-quotes!
[{:keys [::db/pool]} {:keys [team-id]}]
(let [sql ["select count(*) as total from webhook where team_id = ?" team-id]
total (:total (db/exec-one! pool sql))]
(when (>= total max-hooks-for-team)
(ex/raise :type :restriction
:code :webhooks-quote-reached
:hint (str/ffmt "can't create more than % webhooks per team" max-hooks-for-team)))))
(defn- insert-webhook!
[{:keys [::db/pool]} {:keys [team-id uri mtype is-active] :as params}]
(db/insert! pool :webhook
{:id (uuid/next)
:team-id team-id
:uri uri
:is-active is-active
:mtype mtype}))
(defn- update-webhook!
[{:keys [::db/pool] :as cfg} {:keys [id] :as wook} {:keys [uri mtype is-active] :as params}]
(db/update! pool :webhook
{:uri uri
:is-active is-active
:mtype mtype
:error-code nil
:error-count 0}
{:id id}))
(sv/defmethod ::create-webhook
{::doc/added "1.17"}
[{:keys [::db/pool ::wrk/executor] :as cfg} {:keys [profile-id team-id] :as params}]
(check-edition-permissions! pool profile-id team-id)
(->> (validate-quotes! cfg params)
(p/fmap executor (fn [_] (validate-webhook! cfg nil params)))
(p/fmap executor (fn [_] (insert-webhook! cfg params)))))
(s/def ::update-webhook
(s/keys :req-un [::id ::uri ::mtype ::is-active]))
(sv/defmethod ::update-webhook
{::doc/added "1.17"}
[{:keys [::db/pool ::wrk/executor] :as cfg} {:keys [id profile-id] :as params}]
(let [whook (db/get pool :webhook {:id id})]
(check-edition-permissions! pool profile-id (:team-id whook))
(->> (validate-webhook! cfg whook params)
(p/fmap executor (fn [_] (update-webhook! cfg whook params))))))
(s/def ::delete-webhook
(s/keys :req-un [::profile-id ::id]))
(sv/defmethod ::delete-webhook
{::doc/added "1.17"}
[{:keys [::db/pool] :as cfg} {:keys [profile-id id]}]
(db/with-atomic [conn pool]
(let [whook (db/get conn :webhook {:id id})]
(check-edition-permissions! conn profile-id (:team-id whook))
(db/delete! conn :webhook {:id id})
nil)))
;; --- Query: Webhooks
(s/def ::team-id ::us/uuid)
(s/def ::get-webhooks
(s/keys :req-un [::profile-id ::team-id]))
(def sql:get-webhooks
"select id, uri, mtype, is_active, error_code, error_count
from webhook where team_id = ? order by uri")
(sv/defmethod ::get-webhooks
[{:keys [pool] :as cfg} {:keys [profile-id team-id]}]
(with-open [conn (db/open pool)]
(check-read-permissions! conn profile-id team-id)
(db/exec! conn [sql:get-webhooks team-id])))

View File

@@ -0,0 +1,67 @@
;; This Source Code Form is subject to the terms of the Mozilla Public
;; License, v. 2.0. If a copy of the MPL was not distributed with this
;; file, You can obtain one at http://mozilla.org/MPL/2.0/.
;;
;; Copyright (c) KALEIDOS INC
(ns app.rpc.cond
"Conditional loading middleware.
A middleware that mainly consists of wrapping an RPC method with
conditional logic. It expects some metadata to be set on the RPC
method that enables this middleware to retrieve the necessary data
to process the conditional logic:
- `::get-object` => should be a function that retrieves the minimum version
of the object that will be used to calculate the KEY (an etag in terms of
the HTTP protocol).
- `::key-fn` a function used to generate a string representation
of the object. This function can be applied to the object returned by
`get-object`, but also to the RPC return value (in case you don't provide
a precalculated key under the `::key` metadata prop).
- `::reuse-key?` enables reusing the key calculated the first time; useful
when the target object is not retrieved by the RPC (typical when retrieving
dependent objects).
"
(:require
[app.common.logging :as l]
[app.rpc.helpers :as rph]
[app.util.services :as-alias sv]
[promesa.core :as p]
[promesa.exec :as px]
[yetti.response :as yrs]))
(def
^{:dynamic true
:doc "Runtime flag for enable/disable conditional processing of RPC methods."}
*enabled* false)
(defn- fmt-key
[s]
(when s
(str "W/\"" s "\"")))
(defn wrap
[{:keys [executor]} f {:keys [::get-object ::key-fn ::reuse-key?] :as mdata}]
(if (and (ifn? get-object) (ifn? key-fn))
(do
(l/debug :hint "instrumenting method" :service (::sv/name mdata))
(fn [cfg {:keys [::key] :as params}]
(if *enabled*
(->> (if (or key reuse-key?)
(->> (px/submit! executor (partial get-object cfg params))
(p/map key-fn)
(p/map fmt-key))
(p/resolved nil))
(p/mapcat (fn [key']
(if (and (some? key)
(= key key'))
(p/resolved (fn [_] (yrs/response 304)))
(->> (f cfg params)
(p/map (fn [result]
(->> (or (and reuse-key? key')
(-> result meta ::key fmt-key)
(-> result key-fn fmt-key))
(rph/with-header result "etag")))))))))
(f cfg params))))
f))
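;; A usage sketch (adapted from the ::get-file command elsewhere in this
;; changeset) showing the metadata this middleware consumes; the
;; get-minimal-file and get-file-etag helpers live in app.rpc.commands.files.
(comment
  (sv/defmethod ::get-file
    {::get-object #(files/get-minimal-file %1 (:id %2))
     ::key-fn files/get-file-etag}
    [cfg params]
    ...)
  ;; When *enabled* is true and the request carries the previously returned
  ;; key under ::key, an unchanged key short-circuits into a 304 response;
  ;; otherwise the method runs and the calculated key is attached as a weak
  ;; "etag" header on the result.
  )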

View File

@@ -6,11 +6,72 @@
(ns app.rpc.helpers
"General purpose RPC helpers."
(:require [app.common.data.macros :as dm]))
(:refer-clojure :exclude [with-meta])
(:require
[app.common.data.macros :as dm]
[app.http :as-alias http]
[app.rpc :as-alias rpc]))
(defn http-cache
[{:keys [max-age]}]
(fn [_ response]
(let [exp (if (integer? max-age) max-age (inst-ms max-age))
val (dm/fmt "max-age=%" (int (/ exp 1000.0)))]
(update response :headers assoc "cache-control" val))))
;; A utility wrapper object for wrapping service responses that do not
;; implement the IObj interface, making it possible to attach metadata to
;; them.
(deftype MetadataWrapper [obj ^:unsynchronized-mutable metadata]
clojure.lang.IDeref
(deref [_] obj)
clojure.lang.IObj
(withMeta [_ meta]
(MetadataWrapper. obj meta))
(meta [_] metadata))
(defn wrap
"Conditionally wrap a value into MetadataWrapper instance. If the
object already implements IObj interface it will be returned as is."
([] (wrap nil))
([o]
(if (instance? clojure.lang.IObj o)
o
(MetadataWrapper. o {})))
([o m]
(MetadataWrapper. o m)))
(defn wrapped?
[o]
(instance? MetadataWrapper o))
(defn unwrap
[o]
(if (wrapped? o) @o o))
(defn with-header
"Add a http header to the RPC result."
[mdw key val]
(vary-meta mdw update ::http/headers assoc key val))
(defn with-transform
"Adds a http response transform to the RPC result."
[mdw transform-fn]
(vary-meta mdw update ::rpc/response-transform-fns conj transform-fn))
(defn with-defer
"Defer execution of the function until request is finished."
[mdw hook-fn]
(vary-meta mdw update ::rpc/before-complete-fns conj hook-fn))
(defn with-meta
[mdw mdata]
(vary-meta mdw merge mdata))
(defn assoc-meta
[mdw k v]
(vary-meta mdw assoc k v))
(defn with-http-cache
[mdw max-age]
(vary-meta mdw update ::rpc/response-transform-fns conj
(fn [_ response]
(let [exp (if (integer? max-age) max-age (inst-ms max-age))
val (dm/fmt "max-age=%" (int (/ exp 1000.0)))]
(update response :headers assoc "cache-control" val)))))
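;; Illustrative usage sketch (not part of this changeset): the helpers above
;; compose via `->`; the header name, max-age and hook below are made up for
;; the example only.
(comment
  (-> (rph/wrap)                                   ;; wrap nil so metadata can be attached
      (rph/with-header "x-example" "1")            ;; add a plain response header
      (rph/with-http-cache 3600000)                ;; emits cache-control: max-age=3600
      (rph/with-defer #(println "runs after the response is complete"))))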

View File

@@ -10,8 +10,8 @@
[app.common.spec :as us]
[app.db :as db]
[app.rpc.commands.comments :as cmd.comments]
[app.rpc.commands.files :as cmd.files]
[app.rpc.doc :as-alias doc]
[app.rpc.queries.files :as files]
[app.rpc.retry :as retry]
[app.util.services :as sv]
[clojure.spec.alpha :as s]))
@@ -27,7 +27,7 @@
::doc/deprecated "1.15"}
[{:keys [pool] :as cfg} {:keys [profile-id file-id share-id] :as params}]
(db/with-atomic [conn pool]
(files/check-comment-permissions! conn profile-id file-id share-id)
(cmd.files/check-comment-permissions! conn profile-id file-id share-id)
(cmd.comments/create-comment-thread conn params)))
;; --- Mutation: Update Comment Thread Status
@@ -44,7 +44,7 @@
(db/with-atomic [conn pool]
(let [cthr (db/get-by-id conn :comment-thread id {:for-update true})]
(when-not cthr (ex/raise :type :not-found))
(files/check-comment-permissions! conn profile-id (:file-id cthr) share-id)
(cmd.files/check-comment-permissions! conn profile-id (:file-id cthr) share-id)
(cmd.comments/upsert-comment-thread-status! conn profile-id (:id cthr)))))
@@ -61,7 +61,7 @@
(when-not thread
(ex/raise :type :not-found))
(files/check-comment-permissions! conn profile-id (:file-id thread) share-id)
(cmd.files/check-comment-permissions! conn profile-id (:file-id thread) share-id)
(db/update! conn :comment-thread
{:is-resolved is-resolved}
{:id id})

View File

@@ -6,571 +6,231 @@
(ns app.rpc.mutations.files
(:require
[app.common.data :as d]
[app.common.exceptions :as ex]
[app.common.pages :as cp]
[app.common.pages.migrations :as pmg]
[app.common.logging :as l]
[app.common.spec :as us]
[app.common.types.file :as ctf]
[app.common.uuid :as uuid]
[app.config :as cf]
[app.db :as db]
[app.loggers.audit :as audit]
[app.metrics :as mtx]
[app.msgbus :as mbus]
[app.rpc.permissions :as perms]
[app.rpc.queries.files :as files]
[app.rpc.climit :as-alias climit]
[app.rpc.commands.files :as cmd.files]
[app.rpc.commands.files.create :as cmd.files.create]
[app.rpc.commands.files.temp :as cmd.files.temp]
[app.rpc.commands.files.update :as cmd.files.update]
[app.rpc.doc :as-alias doc]
[app.rpc.helpers :as rph]
[app.rpc.queries.projects :as proj]
[app.rpc.semaphore :as rsem]
[app.storage.impl :as simpl]
[app.util.blob :as blob]
[app.util.services :as sv]
[app.util.time :as dt]
[clojure.spec.alpha :as s]
[promesa.core :as p]))
(declare create-file)
(declare retrieve-team-id)
;; --- Helpers & Specs
(s/def ::frame-id ::us/uuid)
(s/def ::file-id ::us/uuid)
(s/def ::id ::us/uuid)
(s/def ::name ::us/string)
(s/def ::profile-id ::us/uuid)
(s/def ::project-id ::us/uuid)
(s/def ::url ::us/url)
[clojure.spec.alpha :as s]))
;; --- Mutation: Create File
(s/def ::is-shared ::us/boolean)
(s/def ::create-file
(s/keys :req-un [::profile-id ::name ::project-id]
:opt-un [::id ::is-shared ::components-v2]))
(s/def ::create-file ::cmd.files.create/create-file)
(sv/defmethod ::create-file
[{:keys [pool] :as cfg} {:keys [profile-id project-id] :as params}]
{::doc/added "1.0"
::doc/deprecated "1.17"}
[{:keys [pool] :as cfg} {:keys [profile-id project-id features components-v2] :as params}]
(db/with-atomic [conn pool]
(let [team-id (retrieve-team-id conn project-id)]
(proj/check-edition-permissions! conn profile-id project-id)
(with-meta
(create-file conn params)
{::audit/props {:team-id team-id}}))))
(proj/check-edition-permissions! conn profile-id project-id)
(let [team-id (cmd.files/get-team-id conn project-id)
features (cond-> (or features #{})
;; BACKWARD COMPATIBILITY with the components-v2 param
components-v2 (conj "components/v2"))
params (assoc params :features features)]
(-> (cmd.files.create/create-file conn params)
(vary-meta assoc ::audit/props {:team-id team-id})))))
(defn create-file-role
[conn {:keys [file-id profile-id role]}]
(let [params {:file-id file-id
:profile-id profile-id}]
(->> (perms/assign-role-flags params role)
(db/insert! conn :file-profile-rel))))
(defn create-file
[conn {:keys [id name project-id is-shared data revn
modified-at deleted-at ignore-sync-until
components-v2]
:or {is-shared false revn 0}
:as params}]
(let [id (or id (:id data) (uuid/next))
data (or data (ctf/make-file-data id components-v2))
file (db/insert! conn :file
(d/without-nils
{:id id
:project-id project-id
:name name
:revn revn
:is-shared is-shared
:data (blob/encode data)
:ignore-sync-until ignore-sync-until
:modified-at modified-at
:deleted-at deleted-at}))]
(->> (assoc params :file-id id :role :owner)
(create-file-role conn))
(assoc file :data data)))
;; --- Mutation: Rename File
(declare rename-file)
(s/def ::rename-file
(s/keys :req-un [::profile-id ::name ::id]))
(s/def ::rename-file ::cmd.files/rename-file)
(sv/defmethod ::rename-file
{::doc/added "1.0"
::doc/deprecated "1.17"}
[{:keys [pool] :as cfg} {:keys [id profile-id] :as params}]
(db/with-atomic [conn pool]
(files/check-edition-permissions! conn profile-id id)
(rename-file conn params)))
(defn- rename-file
[conn {:keys [id name] :as params}]
(db/update! conn :file
{:name name}
{:id id}))
(cmd.files/check-edition-permissions! conn profile-id id)
(cmd.files/rename-file conn params)))
;; --- Mutation: Set File shared
(declare set-file-shared)
(declare unlink-files)
(declare absorb-library)
(s/def ::set-file-shared
(s/keys :req-un [::profile-id ::id ::is-shared]))
(s/def ::set-file-shared ::cmd.files/set-file-shared)
(sv/defmethod ::set-file-shared
{::doc/added "1.2"
::doc/deprecated "1.17"}
[{:keys [pool] :as cfg} {:keys [id profile-id is-shared] :as params}]
(db/with-atomic [conn pool]
(files/check-edition-permissions! conn profile-id id)
(cmd.files/check-edition-permissions! conn profile-id id)
(when-not is-shared
(absorb-library conn params)
(unlink-files conn params))
(set-file-shared conn params)))
(defn- unlink-files
[conn {:keys [id] :as params}]
(db/delete! conn :file-library-rel {:library-file-id id}))
(defn- set-file-shared
[conn {:keys [id is-shared] :as params}]
(db/update! conn :file
{:is-shared is-shared}
{:id id}))
(cmd.files/absorb-library conn params)
(cmd.files/unlink-files conn params))
(cmd.files/set-file-shared conn params)))
;; --- Mutation: Delete File
(declare mark-file-deleted)
(s/def ::delete-file
(s/keys :req-un [::id ::profile-id]))
(s/def ::delete-file ::cmd.files/delete-file)
(sv/defmethod ::delete-file
{::doc/added "1.0"
::doc/deprecated "1.17"}
[{:keys [pool] :as cfg} {:keys [id profile-id] :as params}]
(db/with-atomic [conn pool]
(files/check-edition-permissions! conn profile-id id)
(absorb-library conn params)
(mark-file-deleted conn params)))
(defn mark-file-deleted
[conn {:keys [id] :as params}]
(db/update! conn :file
{:deleted-at (dt/now)}
{:id id})
nil)
(defn absorb-library
"Find all files using a shared library, and absorb all library assets
into the files' local libraries"
[conn {:keys [id] :as params}]
(let [library (db/get-by-id conn :file id)]
(when (:is-shared library)
(let [ldata (-> library files/decode-row pmg/migrate-file :data)]
(->> (db/query conn :file-library-rel {:library-file-id id})
(keep (fn [{:keys [file-id]}]
(some->> (db/get-by-id conn :file file-id {:check-not-found false})
(files/decode-row)
(pmg/migrate-file))))
(run! (fn [{:keys [id data revn] :as file}]
(let [data (ctf/absorb-assets data ldata)]
(db/update! conn :file
{:revn (inc revn)
:data (blob/encode data)
:modified-at (dt/now)}
{:id id})))))))))
(cmd.files/check-edition-permissions! conn profile-id id)
(cmd.files/absorb-library conn params)
(cmd.files/mark-file-deleted conn params)))
;; --- Mutation: Link file to library
(declare link-file-to-library)
(s/def ::link-file-to-library
(s/keys :req-un [::profile-id ::file-id ::library-id]))
(s/def ::link-file-to-library ::cmd.files/link-file-to-library)
(sv/defmethod ::link-file-to-library
{::doc/added "1.3"
::doc/deprecated "1.17"}
[{:keys [pool] :as cfg} {:keys [profile-id file-id library-id] :as params}]
(when (= file-id library-id)
(ex/raise :type :validation
:code :invalid-library
:hint "A file cannot be linked to itself"))
(db/with-atomic [conn pool]
(files/check-edition-permissions! conn profile-id file-id)
(files/check-edition-permissions! conn profile-id library-id)
(link-file-to-library conn params)))
(def sql:link-file-to-library
"insert into file_library_rel (file_id, library_file_id)
values (?, ?)
on conflict do nothing;")
(defn- link-file-to-library
[conn {:keys [file-id library-id] :as params}]
(db/exec-one! conn [sql:link-file-to-library file-id library-id]))
(cmd.files/check-edition-permissions! conn profile-id file-id)
(cmd.files/check-edition-permissions! conn profile-id library-id)
(cmd.files/link-file-to-library conn params)))
;; --- Mutation: Unlink file from library
(declare unlink-file-from-library)
(s/def ::unlink-file-from-library
(s/keys :req-un [::profile-id ::file-id ::library-id]))
(s/def ::unlink-file-from-library ::cmd.files/unlink-file-from-library)
(sv/defmethod ::unlink-file-from-library
{::doc/added "1.3"
::doc/deprecated "1.17"}
[{:keys [pool] :as cfg} {:keys [profile-id file-id] :as params}]
(db/with-atomic [conn pool]
(files/check-edition-permissions! conn profile-id file-id)
(unlink-file-from-library conn params)))
(defn- unlink-file-from-library
[conn {:keys [file-id library-id] :as params}]
(db/delete! conn :file-library-rel
{:file-id file-id
:library-file-id library-id}))
(cmd.files/check-edition-permissions! conn profile-id file-id)
(cmd.files/unlink-file-from-library conn params)))
;; --- Mutation: Update synchronization status of a link
(declare update-sync)
(s/def ::update-sync
(s/keys :req-un [::profile-id ::file-id ::library-id]))
(s/def ::update-sync ::cmd.files/update-file-library-sync-status)
(sv/defmethod ::update-sync
{::doc/added "1.10"
::doc/deprecated "1.17"}
[{:keys [pool] :as cfg} {:keys [profile-id file-id] :as params}]
(db/with-atomic [conn pool]
(files/check-edition-permissions! conn profile-id file-id)
(update-sync conn params)))
(cmd.files/check-edition-permissions! conn profile-id file-id)
(cmd.files/update-sync conn params)))
(defn- update-sync
[conn {:keys [file-id library-id] :as params}]
(db/update! conn :file-library-rel
{:synced-at (dt/now)}
{:file-id file-id
:library-file-id library-id}))
;; --- Mutation: Ignore updates in linked files
(declare ignore-sync)
(s/def ::ignore-sync
(s/keys :req-un [::profile-id ::file-id ::date]))
(s/def ::ignore-sync ::cmd.files/ignore-file-library-sync-status)
(sv/defmethod ::ignore-sync
{::doc/added "1.10"
::doc/deprecated "1.17"}
[{:keys [pool] :as cfg} {:keys [profile-id file-id] :as params}]
(db/with-atomic [conn pool]
(files/check-edition-permissions! conn profile-id file-id)
(ignore-sync conn params)))
(defn- ignore-sync
[conn {:keys [file-id date] :as params}]
(db/update! conn :file
{:ignore-sync-until date}
{:id file-id}))
(cmd.files/check-edition-permissions! conn profile-id file-id)
(cmd.files/ignore-sync conn params)))
;; --- MUTATION: update-file
;; A generic, changes-based (granular) file update method.
;; File changes that affect the library must be notified
;; to all clients using it.
(defn library-change?
[change]
(or (#{:add-color :mod-color :del-color
:add-media :mod-media :del-media
:add-component :mod-component :del-component
:add-typography :mod-typography :del-typography} (:type change))
(and (#{:add-obj :mod-obj :del-obj
:reg-objects :mov-objects} (:type change))
(some? (:component-id change)))))
(declare insert-change)
(declare retrieve-lagged-changes)
(declare send-notifications)
(declare update-file)
(s/def ::changes
(s/coll-of map? :kind vector?))
(s/def ::hint-origin ::us/keyword)
(s/def ::hint-events
(s/every ::us/keyword :kind vector?))
(s/def ::change-with-metadata
(s/keys :req-un [::changes]
:opt-un [::hint-origin
::hint-events]))
(s/def ::changes-with-metadata
(s/every ::change-with-metadata :kind vector?))
(s/def ::session-id ::us/uuid)
(s/def ::revn ::us/integer)
(s/def ::components-v2 ::us/boolean)
(s/def ::update-file
(s/and
(s/keys :req-un [::id ::session-id ::profile-id ::revn]
:opt-un [::changes ::changes-with-metadata ::components-v2])
(fn [o]
(or (contains? o :changes)
(contains? o :changes-with-metadata)))))
(s/and ::cmd.files.update/update-file
(s/keys :opt-un [::components-v2])))
(sv/defmethod ::update-file
{::rsem/queue :update-file}
[{:keys [pool] :as cfg} {:keys [id profile-id] :as params}]
{::climit/queue :update-file
::climit/key-fn :id
::doc/added "1.0"
::doc/deprecated "1.17"}
[{:keys [pool] :as cfg} {:keys [id profile-id features components-v2] :as params}]
(db/with-atomic [conn pool]
(db/xact-lock! conn id)
(let [{:keys [id] :as file} (db/get-by-id conn :file id {:for-key-share true})
team-id (retrieve-team-id conn (:project-id file))]
(files/check-edition-permissions! conn profile-id id)
(with-meta
(update-file (assoc cfg :conn conn)
(assoc params :file file))
{::audit/props {:project-id (:project-id file)
:team-id team-id}}))))
(cmd.files/check-edition-permissions! conn profile-id id)
(defn- take-snapshot?
"Defines the rule when file `data` snapshot should be saved."
[{:keys [revn modified-at] :as file}]
(let [freq (or (cf/get :file-change-snapshot-every) 20)
timeout (or (cf/get :file-change-snapshot-timeout)
(dt/duration {:hours 1}))]
(or (= 1 freq)
(zero? (mod revn freq))
(> (inst-ms (dt/diff modified-at (dt/now)))
(inst-ms timeout)))))
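;; With the defaults above this stores a full `data` snapshot on every 20th
;; revision (always, if the configured frequency is 1), or whenever more than
;; one hour has passed since the file was last modified.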
(let [;; BACKWARD COMPATIBILITY with the components-v2 parameter
features (cond-> (or features #{})
components-v2 (conj "components/v2"))
tpoint (dt/tpoint)
params (assoc params :features features)
cfg (assoc cfg :conn conn)]
(defn- delete-from-storage
[{:keys [storage] :as cfg} file]
(p/do
(when-let [backend (simpl/resolve-backend storage (:data-backend file))]
(simpl/del-object backend file))))
(defn- update-file
[{:keys [conn metrics] :as cfg}
{:keys [file changes changes-with-metadata session-id profile-id components-v2] :as params}]
(when (> (:revn params)
(:revn file))
(ex/raise :type :validation
:code :revn-conflict
:hint "The incoming revision number is greater that stored version."
:context {:incoming-revn (:revn params)
:stored-revn (:revn file)}))
(let [changes (if changes-with-metadata
(mapcat :changes changes-with-metadata)
changes)
changes (vec changes)
;; Trace the number of changes processed
_ (mtx/run! metrics {:id :update-file-changes :inc (count changes)})
ts (dt/now)
file (-> file
(update :revn inc)
(update :data (fn [data]
;; Trace the length of bytes of processed data
(mtx/run! metrics {:id :update-file-bytes-processed :inc (alength data)})
(cond-> data
:always
(-> (blob/decode)
(assoc :id (:id file))
(pmg/migrate-data))
components-v2
(ctf/migrate-to-components-v2)
:always
(-> (cp/process-changes changes)
(blob/encode))))))]
;; Insert change to the xlog
(db/insert! conn :file-change
{:id (uuid/next)
:session-id session-id
:profile-id profile-id
:created-at ts
:file-id (:id file)
:revn (:revn file)
:data (when (take-snapshot? file)
(:data file))
:changes (blob/encode changes)})
;; Update file
(db/update! conn :file
{:revn (:revn file)
:data (:data file)
:data-backend nil
:modified-at ts
:has-media-trimmed false}
{:id (:id file)})
;; We need to delete the data from external storage backend
(when-not (nil? (:data-backend file))
@(delete-from-storage cfg file))
(db/update! conn :project
{:modified-at ts}
{:id (:project-id file)})
(let [params (assoc params :file file :changes changes)]
;; Send asynchronous notifications
(send-notifications cfg params)
;; Retrieve and return lagged data
(retrieve-lagged-changes conn params))))
(def ^:private
sql:lagged-changes
"select s.id, s.revn, s.file_id,
s.session_id, s.changes
from file_change as s
where s.file_id = ?
and s.revn > ?
order by s.created_at asc")
(defn- retrieve-lagged-changes
[conn params]
(->> (db/exec! conn [sql:lagged-changes (:id params) (:revn params)])
(into [] (comp (map files/decode-row)
(map (fn [row]
(cond-> row
(= (:revn row) (:revn (:file params)))
(assoc :changes []))))))))
(defn- send-notifications
[{:keys [conn] :as cfg} {:keys [file changes session-id] :as params}]
(let [lchanges (filter library-change? changes)
msgbus (:msgbus cfg)]
;; Asynchronously publish message to the msgbus
(mbus/pub! msgbus
:topic (:id file)
:message {:type :file-change
:profile-id (:profile-id params)
:file-id (:id file)
:session-id (:session-id params)
:revn (:revn file)
:changes changes})
(when (and (:is-shared file) (seq lchanges))
(let [team-id (retrieve-team-id conn (:project-id file))]
;; Asynchronously publish message to the msgbus
(mbus/pub! msgbus
:topic team-id
:message {:type :library-change
:profile-id (:profile-id params)
:file-id (:id file)
:session-id session-id
:revn (:revn file)
:modified-at (dt/now)
:changes lchanges})))))
(defn- retrieve-team-id
[conn project-id]
(:team-id (db/get-by-id conn :project project-id {:columns [:team-id]})))
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; TEMPORARY FILES (behaves differently)
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(s/def ::create-temp-file ::create-file)
(sv/defmethod ::create-temp-file
[{:keys [pool] :as cfg} {:keys [profile-id project-id] :as params}]
(db/with-atomic [conn pool]
(proj/check-edition-permissions! conn profile-id project-id)
(create-file conn (assoc params :deleted-at (dt/in-future {:days 1})))))
(s/def ::update-temp-file
(s/keys :req-un [::changes ::revn ::session-id ::id]))
(sv/defmethod ::update-temp-file
[{:keys [pool] :as cfg} {:keys [profile-id session-id id revn changes] :as params}]
(db/with-atomic [conn pool]
(db/insert! conn :file-change
{:id (uuid/next)
:session-id session-id
:profile-id profile-id
:created-at (dt/now)
:file-id id
:revn revn
:data nil
:changes (blob/encode changes)})
nil))
(s/def ::persist-temp-file
(s/keys :req-un [::id ::profile-id]))
(sv/defmethod ::persist-temp-file
[{:keys [pool] :as cfg} {:keys [id profile-id] :as params}]
(db/with-atomic [conn pool]
(files/check-edition-permissions! conn profile-id id)
(let [file (db/get-by-id conn :file id)
revs (db/query conn :file-change
{:file-id id}
{:order-by [[:revn :asc]]})
revn (count revs)]
(when (nil? (:deleted-at file))
(ex/raise :type :validation
:code :cant-persist-already-persisted-file))
(loop [revs (seq revs)
data (blob/decode (:data file))]
(if-let [rev (first revs)]
(recur (rest revs)
(->> rev :changes blob/decode (cp/process-changes data)))
(db/update! conn :file
{:deleted-at nil
:revn revn
:data (blob/encode data)}
{:id id})))
nil)))
(-> (cmd.files.update/update-file cfg params)
(rph/with-defer #(let [elapsed (tpoint)]
(l/trace :hint "update-file" :time (dt/format-duration elapsed))))))))
;; --- Mutation: upsert object thumbnail
(def sql:upsert-object-thumbnail
"insert into file_object_thumbnail(file_id, object_id, data)
values (?, ?, ?)
on conflict(file_id, object_id) do
update set data = ?;")
(s/def ::data (s/nilable ::us/string))
(s/def ::object-id ::us/string)
(s/def ::upsert-file-object-thumbnail
(s/keys :req-un [::profile-id ::file-id ::object-id ::data]))
(s/def ::upsert-file-object-thumbnail ::cmd.files/upsert-file-object-thumbnail)
(sv/defmethod ::upsert-file-object-thumbnail
[{:keys [pool] :as cfg} {:keys [profile-id file-id object-id data]}]
{::doc/added "1.13"
::doc/deprecated "1.17"}
[{:keys [pool] :as cfg} {:keys [profile-id file-id] :as params}]
(db/with-atomic [conn pool]
(files/check-edition-permissions! conn profile-id file-id)
(if data
(db/exec-one! conn [sql:upsert-object-thumbnail file-id object-id data data])
(db/delete! conn :file-object-thumbnail {:file-id file-id :object-id object-id}))
(cmd.files/check-edition-permissions! conn profile-id file-id)
(cmd.files/upsert-file-object-thumbnail! conn params)
nil))
;; --- Mutation: upsert file thumbnail
(def sql:upsert-file-thumbnail
"insert into file_thumbnail (file_id, revn, data, props)
values (?, ?, ?, ?::jsonb)
on conflict(file_id, revn) do
update set data = ?, props=?, updated_at=now();")
(s/def ::revn ::us/integer)
(s/def ::props map?)
(s/def ::upsert-file-thumbnail
(s/keys :req-un [::profile-id ::file-id ::revn ::data ::props]))
(s/def ::upsert-file-thumbnail ::cmd.files/upsert-file-thumbnail)
(sv/defmethod ::upsert-file-thumbnail
"Creates or updates the file thumbnail. Mainly used for paint the
grid thumbnals."
[{:keys [pool] :as cfg} {:keys [profile-id file-id revn data props]}]
grid thumbnails."
{::doc/added "1.13"
::doc/deprecated "1.17"}
[{:keys [pool] :as cfg} {:keys [profile-id file-id] :as params}]
(db/with-atomic [conn pool]
(files/check-edition-permissions! conn profile-id file-id)
(let [props (db/tjson (or props {}))]
(db/exec-one! conn [sql:upsert-file-thumbnail
file-id revn data props data props])
nil)))
(cmd.files/check-edition-permissions! conn profile-id file-id)
(cmd.files/upsert-file-thumbnail conn params)
nil))
;; --- MUTATION COMMAND: create-temp-file
(s/def ::create-temp-file ::cmd.files.temp/create-temp-file)
(sv/defmethod ::create-temp-file
{::doc/added "1.7"
::doc/deprecated "1.17"}
[{:keys [pool] :as cfg} {:keys [profile-id project-id] :as params}]
(db/with-atomic [conn pool]
(proj/check-edition-permissions! conn profile-id project-id)
(cmd.files.create/create-file conn (assoc params :deleted-at (dt/in-future {:days 1})))))
;; --- MUTATION COMMAND: update-temp-file
(s/def ::update-temp-file ::cmd.files.temp/update-temp-file)
(sv/defmethod ::update-temp-file
{::doc/added "1.7"
::doc/deprecated "1.17"}
[{:keys [pool] :as cfg} params]
(db/with-atomic [conn pool]
(cmd.files.temp/update-temp-file conn params)
nil))
;; --- MUTATION COMMAND: persist-temp-file
(s/def ::persist-temp-file ::cmd.files.temp/persist-temp-file)
(sv/defmethod ::persist-temp-file
{::doc/added "1.7"
::doc/deprecated "1.17"}
[{:keys [pool] :as cfg} {:keys [id profile-id] :as params}]
(db/with-atomic [conn pool]
(cmd.files/check-edition-permissions! conn profile-id id)
(cmd.files.temp/persist-temp-file conn params)))

View File

@@ -11,15 +11,19 @@
[app.common.spec :as us]
[app.common.uuid :as uuid]
[app.db :as db]
[app.loggers.audit :as-alias audit]
[app.loggers.webhooks :as-alias webhooks]
[app.media :as media]
[app.rpc.climit :as-alias climit]
[app.rpc.doc :as-alias doc]
[app.rpc.helpers :as rph]
[app.rpc.queries.teams :as teams]
[app.rpc.semaphore :as rsem]
[app.storage :as sto]
[app.util.services :as sv]
[app.util.time :as dt]
[clojure.spec.alpha :as s]
[promesa.core :as p]))
[promesa.core :as p]
[promesa.exec :as px]))
(declare create-font-variant)
@@ -40,21 +44,23 @@
::font-id ::font-family ::font-weight ::font-style]))
(sv/defmethod ::create-font-variant
{::doc/added "1.3"
::webhooks/event? true}
[{:keys [pool] :as cfg} {:keys [team-id profile-id] :as params}]
(let [cfg (update cfg :storage media/configure-assets-storage)]
(teams/check-edition-permissions! pool profile-id team-id)
(create-font-variant cfg params)))
(defn create-font-variant
[{:keys [storage pool executor semaphores] :as cfg} {:keys [data] :as params}]
[{:keys [storage pool executor climit] :as cfg} {:keys [data] :as params}]
(letfn [(generate-fonts [data]
(rsem/with-dispatch (:process-font semaphores)
(climit/with-dispatch (:process-font climit)
(media/run {:cmd :generate-fonts :input data})))
;; Function responsible for calculating the cryptographic hash of
;; the provided data.
(calculate-hash [data]
(rsem/with-dispatch (:process-font semaphores)
(px/with-dispatch executor
(sto/calculate-hash data)))
(validate-data [data]
@@ -103,28 +109,29 @@
:ttf-file-id (:id ttf)}))
]
(-> (generate-fonts data)
(p/then validate-data)
(p/then persist-fonts executor)
(p/then insert-into-db executor))))
(->> (generate-fonts data)
(p/map validate-data)
(p/mcat executor persist-fonts)
(p/map executor insert-into-db)
(p/map (fn [result]
(let [params (update params :data (comp vec keys))]
(rph/with-meta result {::audit/replace-props params})))))))
;; --- UPDATE FONT FAMILY
(s/def ::update-font
(s/keys :req-un [::profile-id ::team-id ::id ::name]))
(def sql:update-font
"update team_font_variant
set font_family = ?
where team_id = ?
and font_id = ?")
(sv/defmethod ::update-font
{::doc/added "1.3"
::webhooks/event? true}
[{:keys [pool] :as cfg} {:keys [team-id profile-id id name] :as params}]
(db/with-atomic [conn pool]
(teams/check-edition-permissions! conn profile-id team-id)
(db/exec-one! conn [sql:update-font name team-id id])
nil))
(db/update! conn :team-font-variant
{:font-family name}
{:font-id id
:team-id team-id})))
;; --- DELETE FONT
@@ -132,10 +139,11 @@
(s/keys :req-un [::profile-id ::team-id ::id]))
(sv/defmethod ::delete-font
{::doc/added "1.3"
::webhooks/event? true}
[{:keys [pool] :as cfg} {:keys [id team-id profile-id] :as params}]
(db/with-atomic [conn pool]
(teams/check-edition-permissions! conn profile-id team-id)
(db/update! conn :team-font-variant
{:deleted-at (dt/now)}
{:font-id id :team-id team-id})
@@ -147,7 +155,8 @@
(s/keys :req-un [::profile-id ::team-id ::id]))
(sv/defmethod ::delete-font-variant
{::doc/added "1.3"}
{::doc/added "1.3"
::webhooks/event? true}
[{:keys [pool] :as cfg} {:keys [id team-id profile-id] :as params}]
(db/with-atomic [conn pool]
(teams/check-edition-permissions! conn profile-id team-id)

View File

@@ -13,9 +13,10 @@
[app.common.uuid :as uuid]
[app.config :as cf]
[app.db :as db]
[app.http.client :as http]
[app.media :as media]
[app.rpc.climit :as climit]
[app.rpc.queries.teams :as teams]
[app.rpc.semaphore :as rsem]
[app.storage :as sto]
[app.storage.tmp :as tmp]
[app.util.services :as sv]
@@ -23,7 +24,8 @@
[clojure.spec.alpha :as s]
[cuerdas.core :as str]
[datoteka.io :as io]
[promesa.core :as p]))
[promesa.core :as p]
[promesa.exec :as px]))
(def default-max-file-size (* 1024 1024 10)) ; 10 MiB
@@ -97,32 +99,32 @@
;; something fails, all leaked (already created storage objects) will
;; be eventually marked as deleted by the touched-gc task.
;;
;; The touched-gc task, performs periodic analisis of all touched
;; The touched-gc task, performs periodic analysis of all touched
;; storage objects and checks their references. This is the reason why
;; `reference` metadata exists: it indicates the name of the table
;; which holds the reference to the storage object (a kind of
;; inverse, soft referential integrity).
(defn create-file-media-object
[{:keys [storage pool semaphores] :as cfg}
[{:keys [storage pool climit executor] :as cfg}
{:keys [id file-id is-local name content] :as params}]
(letfn [;; Function responsible for retrieving the file information; as
;; it is a synchronous operation it should be wrapped in the
;; with-dispatch macro.
(get-info [content]
(rsem/with-dispatch (:process-image semaphores)
(climit/with-dispatch (:process-image climit)
(media/run {:cmd :info :input content})))
;; Function responsible for calculating the cryptographic hash of
;; the provided data.
(calculate-hash [data]
(rsem/with-dispatch (:process-image semaphores)
(px/with-dispatch executor
(sto/calculate-hash data)))
;; Function responsible for generating the thumbnail. As it is a synchronous
;; operation, it should be wrapped in the with-dispatch macro
(generate-thumbnail [info]
(rsem/with-dispatch (:process-image semaphores)
(climit/with-dispatch (:process-image climit)
(media/run (assoc thumbnail-options
:cmd :generic-thumbnail
:input info))))
@@ -154,14 +156,15 @@
:bucket "file-media-object"})))
(insert-into-database [info image thumb]
(db/exec-one! pool [sql:create-file-media-object
(or id (uuid/next))
file-id is-local name
(:id image)
(:id thumb)
(:width info)
(:height info)
(:mtype info)]))]
(px/with-dispatch executor
(db/exec-one! pool [sql:create-file-media-object
(or id (uuid/next))
file-id is-local name
(:id image)
(:id thumb)
(:width info)
(:height info)
(:mtype info)])))]
(p/let [info (get-info content)
thumb (create-thumbnail info)
@@ -184,7 +187,7 @@
(create-file-media-object-from-url cfg params)))
(defn- create-file-media-object-from-url
[{:keys [http-client] :as cfg} {:keys [url name] :as params}]
[cfg {:keys [url name] :as params}]
(letfn [(parse-and-validate-size [headers]
(let [size (some-> (get headers "content-length") d/parse-integer)
mtype (get headers "content-type")
@@ -213,7 +216,7 @@
:format format}))
(download-media [uri]
(-> (http-client {:method :get :uri uri} {:response-type :input-stream})
(-> (http/req! cfg {:method :get :uri uri} {:response-type :input-stream})
(p/then process-response)))
(process-response [{:keys [body headers] :as response}]

View File

@@ -12,13 +12,16 @@
[app.config :as cf]
[app.db :as db]
[app.emails :as eml]
[app.http.session :as session]
[app.loggers.audit :as audit]
[app.media :as media]
[app.rpc :as-alias rpc]
[app.rpc.climit :as-alias climit]
[app.rpc.commands.auth :as cmd.auth]
[app.rpc.doc :as-alias doc]
[app.rpc.helpers :as rph]
[app.rpc.mutations.teams :as teams]
[app.rpc.queries.profile :as profile]
[app.rpc.semaphore :as rsem]
[app.storage :as sto]
[app.tokens :as tokens]
[app.util.services :as sv]
@@ -46,6 +49,7 @@
:opt-un [::lang ::theme]))
(sv/defmethod ::update-profile
{::doc/added "1.0"}
[{:keys [pool] :as cfg} {:keys [profile-id fullname lang theme] :as params}]
(db/with-atomic [conn pool]
;; NOTE: we need to retrieve the profile independently if we use
@@ -68,8 +72,11 @@
:props (db/tjson (:props profile))}
{:id profile-id})
(with-meta (-> profile profile/strip-private-attrs d/without-nils)
{::audit/props (audit/profile->props profile)}))))
(-> profile
profile/strip-private-attrs
d/without-nils
(rph/with-meta {::audit/props (audit/profile->props profile)})))))
;; --- MUTATION: Update Password
@@ -81,11 +88,11 @@
(s/keys :req-un [::profile-id ::password ::old-password]))
(sv/defmethod ::update-profile-password
{::rsem/queue :auth}
{::climit/queue :auth}
[{:keys [pool] :as cfg} {:keys [password] :as params}]
(db/with-atomic [conn pool]
(let [profile (validate-password! conn params)
session-id (:app.rpc/session-id params)]
session-id (::rpc/session-id params)]
(when (= (str/lower (:email profile))
(str/lower (:password params)))
(ex/raise :type :validation
@@ -131,7 +138,7 @@
(update-profile-photo cfg params)))
(defn update-profile-photo
[{:keys [pool storage executor] :as cfg} {:keys [profile-id] :as params}]
[{:keys [pool storage executor] :as cfg} {:keys [profile-id file] :as params}]
(p/let [profile (px/with-dispatch executor
(db/get-by-id pool :profile profile-id))
photo (teams/upload-photo cfg params)]
@@ -144,7 +151,13 @@
(db/update! pool :profile
{:photo-id (:id photo)}
{:id profile-id})
nil))
(-> (rph/wrap)
(rph/with-meta {::audit/replace-props
{:file-name (:filename file)
:file-size (:size file)
:file-path (str (:path file))
:file-mtype (:mtype file)}}))))
;; --- MUTATION: Request Email Change
@@ -276,8 +289,7 @@
{:deleted-at deleted-at}
{:id profile-id})
(with-meta {}
{:transform-response (:delete session)}))))
(rph/with-transform {} (session/delete-fn session)))))
(def sql:owned-teams
"with owner_teams as (
@@ -296,77 +308,3 @@
(defn- get-owned-teams-with-participants
[conn profile-id]
(db/exec! conn [sql:owned-teams profile-id profile-id]))
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; DEPRECATED METHODS (TO BE REMOVED ON 1.16.x)
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; --- MUTATION: Login
(s/def ::login ::cmd.auth/login-with-password)
(sv/defmethod ::login
{:auth false
::rsem/queue :auth
::doc/added "1.0"
::doc/deprecated "1.15"}
[cfg params]
(cmd.auth/login-with-password cfg params))
;; --- MUTATION: Logout
(s/def ::logout ::cmd.auth/logout)
(sv/defmethod ::logout
{:auth false
::doc/added "1.0"
::doc/deprecated "1.15"}
[{:keys [session] :as cfg} _]
(with-meta {}
{:transform-response (:delete session)}))
;; --- MUTATION: Recover Profile
(s/def ::recover-profile ::cmd.auth/recover-profile)
(sv/defmethod ::recover-profile
{::doc/added "1.0"
::doc/deprecated "1.15"}
[cfg params]
(cmd.auth/recover-profile cfg params))
;; --- MUTATION: Prepare Register
(s/def ::prepare-register-profile ::cmd.auth/prepare-register-profile)
(sv/defmethod ::prepare-register-profile
{:auth false
::doc/added "1.0"
::doc/deprecated "1.15"}
[cfg params]
(cmd.auth/prepare-register cfg params))
;; --- MUTATION: Register Profile
(s/def ::register-profile ::cmd.auth/register-profile)
(sv/defmethod ::register-profile
{:auth false
::rsem/queue :auth
::doc/added "1.0"
::doc/deprecated "1.15"}
[{:keys [pool] :as cfg} params]
(db/with-atomic [conn pool]
(-> (assoc cfg :conn conn)
(cmd.auth/register-profile params))))
;; --- MUTATION: Request Profile Recovery
(s/def ::request-profile-recovery ::cmd.auth/request-profile-recovery)
(sv/defmethod ::request-profile-recovery
{:auth false
::doc/added "1.0"
::doc/deprecated "1.15"}
[cfg params]
(cmd.auth/request-profile-recovery cfg params))

View File

@@ -9,6 +9,10 @@
[app.common.spec :as us]
[app.common.uuid :as uuid]
[app.db :as db]
[app.loggers.audit :as-alias audit]
[app.loggers.webhooks :as-alias webhooks]
[app.rpc.doc :as-alias doc]
[app.rpc.helpers :as rph]
[app.rpc.permissions :as perms]
[app.rpc.queries.projects :as proj]
[app.rpc.queries.teams :as teams]
@@ -22,7 +26,6 @@
(s/def ::name ::us/string)
(s/def ::profile-id ::us/uuid)
;; --- Mutation: Create Project
(declare create-project)
@@ -35,6 +38,8 @@
:opt-un [::id]))
(sv/defmethod ::create-project
{::doc/added "1.0"
::webhooks/event? true}
[{:keys [pool] :as cfg} {:keys [profile-id team-id] :as params}]
(db/with-atomic [conn pool]
(teams/check-edition-permissions! conn profile-id team-id)
@@ -122,10 +127,13 @@
;; this is not allowed.
(sv/defmethod ::delete-project
{::doc/added "1.0"
::webhooks/event? true}
[{:keys [pool] :as cfg} {:keys [id profile-id] :as params}]
(db/with-atomic [conn pool]
(proj/check-edition-permissions! conn profile-id id)
(db/update! conn :project
{:deleted-at (dt/now)}
{:id id :is-default false})
nil))
(let [project (db/update! conn :project
{:deleted-at (dt/now)}
{:id id :is-default false})]
(rph/with-meta (rph/wrap)
{::audit/props {:team-id (:team-id project)}}))))

View File

@@ -10,7 +10,7 @@
[app.common.spec :as us]
[app.common.uuid :as uuid]
[app.db :as db]
[app.rpc.queries.files :as files]
[app.rpc.commands.files :as files]
[app.util.services :as sv]
[clojure.spec.alpha :as s]))

View File

@@ -16,11 +16,12 @@
[app.emails :as eml]
[app.loggers.audit :as audit]
[app.media :as media]
[app.rpc.climit :as climit]
[app.rpc.helpers :as rph]
[app.rpc.mutations.projects :as projects]
[app.rpc.permissions :as perms]
[app.rpc.queries.profile :as profile]
[app.rpc.queries.teams :as teams]
[app.rpc.semaphore :as rsem]
[app.storage :as sto]
[app.tokens :as tokens]
[app.util.services :as sv]
@@ -315,13 +316,13 @@
(assoc team :photo-id (:id photo))))
(defn upload-photo
[{:keys [storage semaphores] :as cfg} {:keys [file]}]
[{:keys [storage executor climit] :as cfg} {:keys [file]}]
(letfn [(get-info [content]
(rsem/with-dispatch (:process-image semaphores)
(climit/with-dispatch (:process-image climit)
(media/run {:cmd :info :input content})))
(generate-thumbnail [info]
(rsem/with-dispatch (:process-image semaphores)
(climit/with-dispatch (:process-image climit)
(media/run {:cmd :profile-thumbnail
:format :jpeg
:quality 85
@@ -332,7 +333,7 @@
;; Function responsible for calculating the cryptographic hash of
;; the provided data.
(calculate-hash [data]
(rsem/with-dispatch (:process-image semaphores)
(px/with-dispatch executor
(sto/calculate-hash data)))]
(p/let [info (get-info file)
@@ -340,11 +341,10 @@
hash (calculate-hash (:data thumb))
content (-> (sto/content (:data thumb) (:size thumb))
(sto/wrap-with-hash hash))]
(rsem/with-dispatch (:process-image semaphores)
(sto/put-object! storage {::sto/content content
::sto/deduplicate? true
:bucket "profile"
:content-type (:mtype thumb)})))))
(sto/put-object! storage {::sto/content content
::sto/deduplicate? true
:bucket "profile"
:content-type (:mtype thumb)}))))
;; --- Mutation: Invite Member
@@ -474,7 +474,6 @@
[{:keys [pool] :as cfg} {:keys [profile-id emails role] :as params}]
(db/with-atomic [conn pool]
(let [team (create-team conn params)
audit-fn (:audit cfg)
profile (db/get-by-id conn :profile profile-id)]
;; Create invitations for all provided emails.
@@ -487,18 +486,18 @@
:email email
:role role)))
(with-meta team
{::audit/props {:invitations (count emails)}
:before-complete
#(audit-fn :cmd :submit
:type "mutation"
:name "invite-team-member"
:profile-id profile-id
:props {:emails emails
:role role
:profile-id profile-id
:invitations (count emails)})}))))
(-> team
(vary-meta assoc ::audit/props {:invitations (count emails)})
(rph/with-defer
#(when-let [collector (::audit/collector cfg)]
(audit/submit! collector
{:type "mutation"
:name "invite-team-member"
:profile-id profile-id
:props {:emails emails
:role role
:profile-id profile-id
:invitations (count emails)}})))))))
;; --- Mutation: Update invitation role

View File

@@ -8,8 +8,8 @@
(:require
[app.db :as db]
[app.rpc.commands.comments :as cmd.comments]
[app.rpc.commands.files :as cmd.files]
[app.rpc.doc :as-alias doc]
[app.rpc.queries.files :as files]
[app.rpc.queries.teams :as teams]
[app.util.services :as sv]
[clojure.spec.alpha :as s]))
@@ -52,7 +52,7 @@
::doc/deprecated "1.15"}
[{:keys [pool] :as cfg} {:keys [profile-id file-id share-id] :as params}]
(with-open [conn (db/open pool)]
(files/check-comment-permissions! conn profile-id file-id share-id)
(cmd.files/check-comment-permissions! conn profile-id file-id share-id)
(cmd.comments/get-comment-thread conn params)))
;; --- QUERY: Comments
@@ -65,7 +65,7 @@
[{:keys [pool] :as cfg} {:keys [profile-id thread-id share-id] :as params}]
(with-open [conn (db/open pool)]
(let [thread (db/get-by-id conn :comment-thread thread-id)]
(files/check-comment-permissions! conn profile-id (:file-id thread) share-id))
(cmd.files/check-comment-permissions! conn profile-id (:file-id thread) share-id))
(cmd.comments/get-comments conn thread-id)))
@@ -78,5 +78,5 @@
::doc/added "1.13"}
[{:keys [pool] :as cfg} {:keys [profile-id file-id share-id]}]
(with-open [conn (db/open pool)]
(files/check-comment-permissions! conn profile-id file-id share-id)
(cmd.files/check-comment-permissions! conn profile-id file-id share-id)
(cmd.comments/get-file-comments-users conn file-id profile-id)))

View File

@@ -6,283 +6,62 @@
(ns app.rpc.queries.files
(:require
[app.common.data :as d]
[app.common.data.macros :as dm]
[app.common.exceptions :as ex]
[app.common.geom.shapes :as gsh]
[app.common.pages.helpers :as cph]
[app.common.pages.migrations :as pmg]
[app.common.spec :as us]
[app.common.types.file :as ctf]
[app.common.types.shape-tree :as ctt]
[app.db :as db]
[app.db.sql :as sql]
[app.rpc.helpers :as rpch]
[app.rpc.permissions :as perms]
[app.rpc.commands.files :as cmd.files]
[app.rpc.commands.search :as cmd.search]
[app.rpc.doc :as-alias doc]
[app.rpc.helpers :as rph]
[app.rpc.queries.projects :as projects]
[app.rpc.queries.share-link :refer [retrieve-share-link]]
[app.rpc.queries.teams :as teams]
[app.util.blob :as blob]
[app.util.services :as sv]
[clojure.spec.alpha :as s]
[cuerdas.core :as str]))
(declare decode-row)
;; --- Helpers & Specs
(s/def ::frame-id ::us/uuid)
(s/def ::id ::us/uuid)
(s/def ::name ::us/string)
(s/def ::project-id ::us/uuid)
(s/def ::file-id ::us/uuid)
(s/def ::profile-id ::us/uuid)
(s/def ::team-id ::us/uuid)
(s/def ::search-term ::us/string)
(s/def ::components-v2 ::us/boolean)
;; --- Query: File Permissions
(def ^:private sql:file-permissions
"select fpr.is_owner,
fpr.is_admin,
fpr.can_edit
from file_profile_rel as fpr
where fpr.file_id = ?
and fpr.profile_id = ?
union all
select tpr.is_owner,
tpr.is_admin,
tpr.can_edit
from team_profile_rel as tpr
inner join project as p on (p.team_id = tpr.team_id)
inner join file as f on (p.id = f.project_id)
where f.id = ?
and tpr.profile_id = ?
union all
select ppr.is_owner,
ppr.is_admin,
ppr.can_edit
from project_profile_rel as ppr
inner join file as f on (f.project_id = ppr.project_id)
where f.id = ?
and ppr.profile_id = ?")
(defn retrieve-file-permissions
[conn profile-id file-id]
(when (and profile-id file-id)
(db/exec! conn [sql:file-permissions
file-id profile-id
file-id profile-id
file-id profile-id])))
(defn get-permissions
([conn profile-id file-id]
(let [rows (retrieve-file-permissions conn profile-id file-id)
is-owner (boolean (some :is-owner rows))
is-admin (boolean (some :is-admin rows))
can-edit (boolean (some :can-edit rows))]
(when (seq rows)
{:type :membership
:is-owner is-owner
:is-admin (or is-owner is-admin)
:can-edit (or is-owner is-admin can-edit)
:can-read true
:is-logged (some? profile-id)})))
([conn profile-id file-id share-id]
(let [perms (get-permissions conn profile-id file-id)
ldata (retrieve-share-link conn file-id share-id)]
;; NOTE: in the future, when share-link becomes more powerful and
;; allows us to specify which parts of the app are available, we
;; will probably need to tweak this function in order to expose
;; these flags to the frontend.
(cond
(some? perms) perms
(some? ldata) {:type :share-link
:can-read true
:is-logged (some? profile-id)
:who-comment (:who-comment ldata)
:who-inspect (:who-inspect ldata)}))))
(def has-edit-permissions?
(perms/make-edition-predicate-fn get-permissions))
(def has-read-permissions?
(perms/make-read-predicate-fn get-permissions))
(def has-comment-permissions?
(perms/make-comment-predicate-fn get-permissions))
(def check-edition-permissions!
(perms/make-check-fn has-edit-permissions?))
(def check-read-permissions!
(perms/make-check-fn has-read-permissions?))
;; A user has comment permissions if she has read permissions, or comment permissions
(defn check-comment-permissions!
[conn profile-id file-id share-id]
(let [can-read (has-read-permissions? conn profile-id file-id)
can-comment (has-comment-permissions? conn profile-id file-id share-id)]
(when-not (or can-read can-comment)
(ex/raise :type :not-found
:code :object-not-found
:hint "not found"))))
;; --- Query: Files search
;; TODO: this query needs a good refactor
(def ^:private sql:search-files
"with projects as (
select p.*
from project as p
inner join team_profile_rel as tpr on (tpr.team_id = p.team_id)
where tpr.profile_id = ?
and p.team_id = ?
and p.deleted_at is null
and (tpr.is_admin = true or
tpr.is_owner = true or
tpr.can_edit = true)
union
select p.*
from project as p
inner join project_profile_rel as ppr on (ppr.project_id = p.id)
where ppr.profile_id = ?
and p.team_id = ?
and p.deleted_at is null
and (ppr.is_admin = true or
ppr.is_owner = true or
ppr.can_edit = true)
)
select distinct
f.id,
f.project_id,
f.created_at,
f.modified_at,
f.name,
f.is_shared
from file as f
inner join projects as pr on (f.project_id = pr.id)
where f.name ilike ('%' || ? || '%')
and f.deleted_at is null
order by f.created_at asc")
(s/def ::search-files
(s/keys :req-un [::profile-id ::team-id]
:opt-un [::search-term]))
(sv/defmethod ::search-files
[{:keys [pool] :as cfg} {:keys [profile-id team-id search-term] :as params}]
(when search-term
(db/exec! pool [sql:search-files
profile-id team-id
profile-id team-id
search-term])))
[clojure.spec.alpha :as s]))
;; --- Query: Project Files
(def ^:private sql:project-files
"select f.id,
f.project_id,
f.created_at,
f.modified_at,
f.name,
f.revn,
f.is_shared
from file as f
where f.project_id = ?
and f.deleted_at is null
order by f.modified_at desc")
(s/def ::project-files
(s/keys :req-un [::profile-id ::project-id]))
(s/def ::project-files ::cmd.files/get-project-files)
(sv/defmethod ::project-files
{::doc/added "1.1"
::doc/deprecated "1.17"}
[{:keys [pool] :as cfg} {:keys [profile-id project-id] :as params}]
(with-open [conn (db/open pool)]
(projects/check-read-permissions! conn profile-id project-id)
(db/exec! conn [sql:project-files project-id])))
(cmd.files/get-project-files conn project-id)))
;; --- Query: File (By ID)
(defn retrieve-object-thumbnails
([{:keys [pool]} file-id]
(let [sql (str/concat
"select object_id, data "
" from file_object_thumbnail"
" where file_id=?")]
(->> (db/exec! pool [sql file-id])
(d/index-by :object-id :data))))
([{:keys [pool]} file-id object-ids]
(with-open [conn (db/open pool)]
(let [sql (str/concat
"select object_id, data "
" from file_object_thumbnail"
" where file_id=? and object_id = ANY(?)")
ids (db/create-array conn "text" (seq object-ids))]
(->> (db/exec! conn [sql file-id ids])
(d/index-by :object-id :data))))))
(defn retrieve-file
[{:keys [pool] :as cfg} id components-v2]
(let [file (->> (db/get-by-id pool :file id)
(decode-row)
(pmg/migrate-file))]
(if components-v2
(update file :data ctf/migrate-to-components-v2)
(if (get-in file [:data :options :components-v2])
(ex/raise :type :restriction
:code :feature-disabled
:hint "tried to open a components-v2 file with feature disabled")
file))))
(s/def ::components-v2 ::us/boolean)
(s/def ::file
(s/keys :req-un [::profile-id ::id]
:opt-un [::components-v2]))
(s/and ::cmd.files/get-file
(s/keys :opt-un [::components-v2])))
(defn get-file
[conn id features]
(let [file (cmd.files/get-file conn id features)
thumbs (cmd.files/get-object-thumbnails conn id)]
(assoc file :thumbnails thumbs)))
(sv/defmethod ::file
"Retrieve a file by its ID. Only authenticated users."
[{:keys [pool] :as cfg} {:keys [profile-id id components-v2] :as params}]
(let [perms (get-permissions pool profile-id id)]
(check-read-permissions! perms)
(let [file (retrieve-file cfg id components-v2)
thumbs (retrieve-object-thumbnails cfg id)]
(-> file
(assoc :thumbnails thumbs)
(assoc :permissions perms)))))
{::doc/added "1.0"
::doc/deprecated "1.17"}
[{:keys [pool] :as cfg} {:keys [profile-id id features components-v2] :as params}]
(with-open [conn (db/open pool)]
(let [perms (cmd.files/get-permissions pool profile-id id)
;; BACKWARD COMPATIBILITY with the components-v2 parameter
features (cond-> (or features #{})
components-v2 (conj "components/v2"))]
(cmd.files/check-read-permissions! perms)
(-> (get-file conn id features)
(assoc :permissions perms)))))
;; --- QUERY: page
(defn- prune-objects
"Given the page data and the object-id returns the page data with all
other unneeded objects removed from the `:objects` data
structure."
[{:keys [objects] :as page} object-id]
(let [objects (cph/get-children-with-self objects object-id)]
(assoc page :objects (d/index-by :id objects))))
(defn- prune-thumbnails
"Given the page data, removes the `:thumbnail` prop from all
shapes."
[page]
(update page :objects d/update-vals #(dissoc % :thumbnail)))
(s/def ::page-id ::us/uuid)
(s/def ::object-id ::us/uuid)
(s/def ::page
(s/and
(s/keys :req-un [::profile-id ::file-id]
:opt-un [::page-id ::object-id ::components-v2])
(fn [obj]
(if (contains? obj :object-id)
(contains? obj :page-id)
true))))
(s/and ::cmd.files/get-page
(s/keys :opt-un [::components-v2])))
(sv/defmethod ::page
"Retrieves the page data from file and returns it. If no page-id is
@@ -294,281 +73,112 @@
mandatory.
Mainly used for rendering purposes."
[{:keys [pool] :as cfg} {:keys [profile-id file-id page-id object-id components-v2] :as props}]
(check-read-permissions! pool profile-id file-id)
(let [file (retrieve-file cfg file-id components-v2)
page-id (or page-id (-> file :data :pages first))
page (get-in file [:data :pages-index page-id])]
{::doc/added "1.5"
::doc/deprecated "1.17"}
[{:keys [pool] :as cfg} {:keys [profile-id file-id features components-v2] :as params}]
(with-open [conn (db/open pool)]
(cmd.files/check-read-permissions! conn profile-id file-id)
(let [;; BACKWARD COMPATIBILITY with the components-v2 parameter
features (cond-> (or features #{})
components-v2 (conj "components/v2"))
params (assoc params :features features)]
(cond-> (prune-thumbnails page)
(uuid? object-id)
(prune-objects object-id))))
(cmd.files/get-page conn params))))
;; --- QUERY: file-data-for-thumbnail
(defn- get-file-thumbnail-data
[cfg {:keys [data id] :as file}]
(letfn [;; function responsible for finding the frame marked to be
;; used as thumbnail; the returned frame always has
;; the :page-id set to the page it belongs to.
(get-thumbnail-frame [data]
(d/seek :use-for-thumbnail?
(for [page (-> data :pages-index vals)
frame (-> page :objects ctt/get-frames)]
(assoc frame :page-id (:id page)))))
;; function responsible for filtering the objects data structure,
;; removing all unneeded shapes if a concrete frame is provided. If no
;; frame is given, the objects are returned untouched.
(filter-objects [objects frame-id]
(d/index-by :id (cph/get-children-with-self objects frame-id)))
;; function responsible for assoc'ing available thumbnails
;; to frames and removing all children shapes from objects if
;; a thumbnail is available
(assoc-thumbnails [objects page-id thumbnails]
(loop [objects objects
frames (filter cph/frame-shape? (vals objects))]
(if-let [frame (-> frames first)]
(let [frame-id (:id frame)
object-id (str page-id frame-id)
frame (if-let [thumb (get thumbnails object-id)]
(assoc frame :thumbnail thumb :shapes [])
(dissoc frame :thumbnail))
children-ids
(cph/get-children-ids objects frame-id)
bounds
(when (:show-content frame)
(gsh/selection-rect (concat [frame] (->> children-ids (map (d/getf objects))))))
frame
(cond-> frame
(some? bounds)
(assoc :children-bounds bounds))]
(if (:thumbnail frame)
(recur (-> objects
(assoc frame-id frame)
(d/without-keys children-ids))
(rest frames))
(recur (assoc objects frame-id frame)
(rest frames))))
objects)))]
(let [frame (get-thumbnail-frame data)
frame-id (:id frame)
page-id (or (:page-id frame)
(-> data :pages first))
page (dm/get-in data [:pages-index page-id])
frame-ids (if (some? frame) (list frame-id) (map :id (ctt/get-frames (:objects page))))
obj-ids (map #(str page-id %) frame-ids)
thumbs (retrieve-object-thumbnails cfg id obj-ids)]
(cond-> page
;; If we have frame, we need to specify it on the page level
;; and remove the all other unrelated objects.
(some? frame-id)
(-> (assoc :thumbnail-frame-id frame-id)
(update :objects filter-objects frame-id))
;; Assoc the available thumbnails and prune non-visible shapes
;; to avoid transferring unnecessary data.
:always
(update :objects assoc-thumbnails page-id thumbs)))))
(s/def ::file-data-for-thumbnail
(s/keys :req-un [::profile-id ::file-id]
:opt-un [::components-v2]))
(s/and ::cmd.files/get-file-data-for-thumbnail
(s/keys :opt-un [::components-v2])))
(sv/defmethod ::file-data-for-thumbnail
"Retrieves the data for generate the thumbnail of the file. Used
mainly for render thumbnails on dashboard."
[{:keys [pool] :as cfg} {:keys [profile-id file-id components-v2] :as props}]
(check-read-permissions! pool profile-id file-id)
(let [file (retrieve-file cfg file-id components-v2)]
{:file-id file-id
:revn (:revn file)
:page (get-file-thumbnail-data cfg file)}))
{::doc/added "1.11"
::doc/deprecated "1.17"}
[{:keys [pool] :as cfg} {:keys [profile-id file-id features components-v2] :as props}]
(with-open [conn (db/open pool)]
(cmd.files/check-read-permissions! conn profile-id file-id)
(let [;; BACKWARD COMPATIBILITY with the components-v2 parameter
features (cond-> (or features #{})
components-v2 (conj "components/v2"))
file (cmd.files/get-file conn file-id features)]
{:file-id file-id
:revn (:revn file)
:page (cmd.files/get-file-data-for-thumbnail conn file)})))
;; --- Query: Shared Library Files
(def ^:private sql:team-shared-files
"select f.id,
f.revn,
f.data,
f.project_id,
f.created_at,
f.modified_at,
f.name,
f.is_shared
from file as f
inner join project as p on (p.id = f.project_id)
where f.is_shared = true
and f.deleted_at is null
and p.deleted_at is null
and p.team_id = ?
order by f.modified_at desc")
(s/def ::team-shared-files
(s/keys :req-un [::profile-id ::team-id]))
(s/def ::team-shared-files ::cmd.files/get-team-shared-files)
(sv/defmethod ::team-shared-files
[{:keys [pool] :as cfg} {:keys [team-id] :as params}]
(let [assets-sample
(fn [assets limit]
(let [sorted-assets (->> (vals assets)
(sort-by #(str/lower (:name %))))]
{:count (count sorted-assets)
:sample (into [] (take limit sorted-assets))}))
library-summary
(fn [data]
{:components (assets-sample (:components data) 4)
:colors (assets-sample (:colors data) 3)
:typographies (assets-sample (:typographies data) 3)})
xform (comp
(map decode-row)
(map #(assoc % :library-summary (library-summary (:data %))))
(map #(dissoc % :data)))]
(into #{} xform (db/exec! pool [sql:team-shared-files team-id]))))
{::doc/added "1.3"
::doc/deprecated "1.17"}
[{:keys [pool] :as cfg} {:keys [profile-id team-id] :as params}]
(with-open [conn (db/open pool)]
(teams/check-read-permissions! conn profile-id team-id)
(cmd.files/get-team-shared-files conn params)))
;; --- Query: File Libraries used by a File
(def ^:private sql:file-libraries
"WITH RECURSIVE libs AS (
SELECT fl.*, flr.synced_at
FROM file AS fl
JOIN file_library_rel AS flr ON (flr.library_file_id = fl.id)
WHERE flr.file_id = ?::uuid
UNION
SELECT fl.*, flr.synced_at
FROM file AS fl
JOIN file_library_rel AS flr ON (flr.library_file_id = fl.id)
JOIN libs AS l ON (flr.file_id = l.id)
)
SELECT l.id,
l.data,
l.project_id,
l.created_at,
l.modified_at,
l.deleted_at,
l.name,
l.revn,
l.synced_at
FROM libs AS l
WHERE l.deleted_at IS NULL OR l.deleted_at > now();")
(defn retrieve-file-libraries
[{:keys [pool] :as cfg} is-indirect file-id]
(let [xform (comp
(map #(assoc % :is-indirect is-indirect))
(map decode-row))]
(into #{} xform (db/exec! pool [sql:file-libraries file-id]))))
(s/def ::file-libraries
(s/keys :req-un [::profile-id ::file-id]))
(s/def ::file-libraries ::cmd.files/get-file-libraries)
(sv/defmethod ::file-libraries
[{:keys [pool] :as cfg} {:keys [profile-id file-id] :as params}]
(check-read-permissions! pool profile-id file-id)
(retrieve-file-libraries cfg false file-id))
{::doc/added "1.3"
::doc/deprecated "1.17"}
[{:keys [pool] :as cfg} {:keys [profile-id file-id features] :as params}]
(with-open [conn (db/open pool)]
(cmd.files/check-read-permissions! conn profile-id file-id)
(cmd.files/get-file-libraries conn file-id features)))
;; --- Query: Files that use this File library
(def ^:private sql:library-using-files
"SELECT f.id,
f.name
FROM file_library_rel AS flr
JOIN file AS f ON (f.id = flr.file_id)
WHERE flr.library_file_id = ?
AND (f.deleted_at IS NULL OR f.deleted_at > now())")
(s/def ::library-using-files
(s/keys :req-un [::profile-id ::file-id]))
(s/def ::library-using-files ::cmd.files/get-library-file-references)
(sv/defmethod ::library-using-files
{::doc/added "1.13"
::doc/deprecated "1.17"}
[{:keys [pool] :as cfg} {:keys [profile-id file-id] :as params}]
(check-read-permissions! pool profile-id file-id)
(db/exec! pool [sql:library-using-files file-id]))
(with-open [conn (db/open pool)]
(cmd.files/check-read-permissions! conn profile-id file-id)
(cmd.files/get-library-file-references conn file-id)))
;; --- QUERY: team-recent-files
(def sql:team-recent-files
"with recent_files as (
select f.id,
f.revn,
f.project_id,
f.created_at,
f.modified_at,
f.name,
f.is_shared,
row_number() over w as row_num
from file as f
join project as p on (p.id = f.project_id)
where p.team_id = ?
and p.deleted_at is null
and f.deleted_at is null
window w as (partition by f.project_id order by f.modified_at desc)
order by f.modified_at desc
)
select * from recent_files where row_num <= 10;")
(s/def ::team-recent-files
(s/keys :req-un [::profile-id ::team-id]))
(s/def ::team-recent-files ::cmd.files/get-team-recent-files)
(sv/defmethod ::team-recent-files
{::doc/added "1.0"
::doc/deprecated "1.17"}
[{:keys [pool] :as cfg} {:keys [profile-id team-id]}]
(teams/check-read-permissions! pool profile-id team-id)
(db/exec! pool [sql:team-recent-files team-id]))
(with-open [conn (db/open pool)]
(teams/check-read-permissions! conn profile-id team-id)
(cmd.files/get-team-recent-files conn team-id)))
;; --- QUERY: get file thumbnail
(s/def ::revn ::us/integer)
(s/def ::file-thumbnail
(s/keys :req-un [::profile-id ::file-id]
:opt-un [::revn]))
(s/def ::file-thumbnail ::cmd.files/get-file-thumbnail)
(sv/defmethod ::file-thumbnail
{::doc/added "1.13"
::doc/deprecated "1.17"}
[{:keys [pool]} {:keys [profile-id file-id revn]}]
(check-read-permissions! pool profile-id file-id)
(let [sql (sql/select :file-thumbnail
(cond-> {:file-id file-id}
revn (assoc :revn revn))
{:limit 1
:order-by [[:revn :desc]]})
(with-open [conn (db/open pool)]
(cmd.files/check-read-permissions! conn profile-id file-id)
(-> (cmd.files/get-file-thumbnail conn file-id revn)
(rph/with-http-cache cmd.files/long-cache-duration))))
row (db/exec-one! pool sql)]
(when-not row
(ex/raise :type :not-found
:code :file-thumbnail-not-found))
;; --- QUERY: search files
(with-meta
{:data (:data row)
:props (some-> (:props row) db/decode-transit-pgobject)
:revn (:revn row)
:file-id (:file-id row)}
{:transform-response (rpch/http-cache {:max-age (* 1000 60 60)})})))
(s/def ::search-files ::cmd.search/search-files)
;; --- Helpers
(defn decode-row
[{:keys [data changes] :as row}]
(when row
(cond-> row
changes (assoc :changes (blob/decode changes))
data (assoc :data (blob/decode data)))))
(sv/defmethod ::search-files
{::doc/added "1.0"
::doc/deprecated "1.17"}
[{:keys [pool]} {:keys [search-term] :as params}]
(when search-term
(cmd.search/search-files pool params)))
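Every deprecated query in this file follows the same shape: keep the old spec name, alias it to the corresponding command spec, and delegate the body to the app.rpc.commands.* namespace over a freshly opened connection. A minimal sketch of that pattern, using only names already present in this diff (the ::my-deprecated-query spec name is hypothetical, for illustration only):

(s/def ::my-deprecated-query ::cmd.files/get-file-libraries)

(sv/defmethod ::my-deprecated-query
  {::doc/added "1.3"
   ::doc/deprecated "1.17"}
  [{:keys [pool] :as cfg} {:keys [profile-id file-id] :as params}]
  ;; open a connection, run the permission check, then hand over to the command
  (with-open [conn (db/open pool)]
    (cmd.files/check-read-permissions! conn profile-id file-id)
    (cmd.files/get-file-libraries conn file-id (:features params))))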

View File

@@ -8,7 +8,7 @@
(:require
[app.common.spec :as us]
[app.db :as db]
[app.rpc.queries.files :as files]
[app.rpc.commands.files :as files]
[app.rpc.queries.projects :as projects]
[app.rpc.queries.teams :as teams]
[app.util.services :as sv]

View File

@@ -6,79 +6,26 @@
(ns app.rpc.queries.viewer
(:require
[app.common.exceptions :as ex]
[app.common.spec :as us]
[app.db :as db]
[app.rpc.commands.comments :as comments]
[app.rpc.queries.files :as files]
[app.rpc.queries.share-link :as slnk]
[app.rpc.commands.viewer :as viewer]
[app.rpc.doc :as-alias doc]
[app.util.services :as sv]
[clojure.spec.alpha :as s]
[promesa.core :as p]))
[clojure.spec.alpha :as s]))
;; --- Query: View Only Bundle
(defn- retrieve-project
[pool id]
(db/get-by-id pool :project id {:columns [:id :name :team-id]}))
(defn- retrieve-bundle
[{:keys [pool] :as cfg} file-id profile-id components-v2]
(p/let [file (files/retrieve-file cfg file-id components-v2)
project (retrieve-project pool (:project-id file))
libs (files/retrieve-file-libraries cfg false file-id)
users (comments/get-file-comments-users pool file-id profile-id)
links (->> (db/query pool :share-link {:file-id file-id})
(mapv slnk/decode-share-link-row))
fonts (db/query pool :team-font-variant
{:team-id (:team-id project)
:deleted-at nil})]
{:file file
:users users
:fonts fonts
:project project
:share-links links
:libraries libs}))
(s/def ::file-id ::us/uuid)
(s/def ::profile-id ::us/uuid)
(s/def ::share-id ::us/uuid)
(s/def ::components-v2 ::us/boolean)
(s/def ::view-only-bundle
(s/keys :req-un [::file-id] :opt-un [::profile-id ::share-id ::components-v2]))
(s/and ::viewer/get-view-only-bundle
(s/keys :opt-un [::components-v2])))
(sv/defmethod ::view-only-bundle {:auth false}
[{:keys [pool] :as cfg} {:keys [profile-id file-id share-id components-v2] :as params}]
(p/let [slink (slnk/retrieve-share-link pool file-id share-id)
perms (files/get-permissions pool profile-id file-id share-id)
thumbs (files/retrieve-object-thumbnails cfg file-id)
bundle (p/-> (retrieve-bundle cfg file-id profile-id components-v2)
(assoc :permissions perms)
(assoc-in [:file :thumbnails] thumbs))]
;; When we have neither profile nor share, we just return a not
;; found response to the user.
(when (and (not profile-id)
(not slink))
(ex/raise :type :not-found
:code :object-not-found))
;; When we have only profile, we need to check read permissions
;; on file.
(when (and profile-id (not slink))
(files/check-read-permissions! pool profile-id file-id))
(cond-> bundle
(some? slink)
(assoc :share slink)
(and (some? slink)
(not (contains? (:flags slink) "view-all-pages")))
(update-in [:file :data] (fn [data]
(let [allowed-pages (:pages slink)]
(-> data
(update :pages (fn [pages] (filterv #(contains? allowed-pages %) pages)))
(update :pages-index (fn [index] (select-keys index allowed-pages))))))))))
(sv/defmethod ::view-only-bundle
{:auth false
::doc/added "1.3"
::doc/deprecated "1.17"}
[{:keys [pool] :as cfg} {:keys [features components-v2] :as params}]
(with-open [conn (db/open pool)]
(let [;; BACKWARD COMPATIBILTY with the components-v2 parameter
features (cond-> (or features #{})
components-v2 (conj "components/v2"))
params (assoc params :features features)]
(viewer/get-view-only-bundle conn params))))
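The only work the deprecated endpoint does now is translating the legacy components-v2 flag into the new features set before delegating; a small standalone sketch of that conversion (pure data, no database access; the helper name is hypothetical):

(defn- add-components-v2-feature
  "Fold the legacy :components-v2 boolean into the :features set expected by
   the new viewer command."
  [{:keys [features components-v2] :as params}]
  (let [features (cond-> (or features #{})
                   components-v2 (conj "components/v2"))]
    (assoc params :features features)))

;; (add-components-v2-feature {:file-id id :components-v2 true})
;; => {:file-id id, :components-v2 true, :features #{"components/v2"}}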

View File

@@ -44,7 +44,6 @@
"
(:require
[app.common.data :as d]
[app.common.data.macros :as dm]
[app.common.exceptions :as ex]
[app.common.logging :as l]
[app.common.spec :as us]
@@ -84,7 +83,7 @@
::rscript/path "app/rpc/rlimit/window.lua"})
(def enabled?
"Allows on runtime completly disable rate limiting."
"Allows on runtime completely disable rate limiting."
(atom true))
(def ^:private window-opts-re
@@ -111,7 +110,7 @@
"m" :minutes
"s" :seconds
"w" :weeks)
::key (dm/str "ratelimit.window." (d/name name))
::key (str "ratelimit.window." (d/name name))
::opts opts})
(ex/raise :type :validation
:code :invalid-window-limit-opts
@@ -132,7 +131,7 @@
::interval interval
::opts opts
::params [(dt/->seconds interval) rate capacity]
::key (dm/str "ratelimit.bucket." (d/name name))})
::key (str "ratelimit.bucket." (d/name name))})
(ex/raise :type :validation
:code :invalid-bucket-limit-opts
:hint (str/ffmt "looks like '%' does not have a valid format" opts)))))
@@ -140,7 +139,7 @@
(defmethod process-limit :bucket
[redis user-id now {:keys [::key ::params ::service ::capacity ::interval ::rate] :as limit}]
(let [script (-> bucket-rate-limit-script
(assoc ::rscript/keys [(dm/str key "." service "." user-id)])
(assoc ::rscript/keys [(str key "." service "." user-id)])
(assoc ::rscript/vals (conj params (dt/->seconds now))))]
(-> (redis/eval! redis script)
(p/then (fn [result]
@@ -165,7 +164,7 @@
(let [ts (dt/truncate now unit)
ttl (dt/diff now (dt/plus ts {unit 1}))
script (-> window-rate-limit-script
(assoc ::rscript/keys [(dm/str key "." service "." user-id "." (dt/format-instant ts))])
(assoc ::rscript/keys [(str key "." service "." user-id "." (dt/format-instant ts))])
(assoc ::rscript/vals [nreq (dt/->seconds ttl)]))]
(-> (redis/eval! redis script)
(p/then (fn [result]
@@ -197,67 +196,65 @@
(filter (complement ::lresult/allowed?))
(first))]
(when (and rejected (contains? cf/flags :warn-rpc-rate-limits))
(when rejected
(l/warn :hint "rejected rate limit"
:user-id (dm/str user-id)
:user-id (str user-id)
:limit-service (-> rejected ::service name)
:limit-name (-> rejected ::name name)
:limit-strategy (-> rejected ::strategy name)))
{:enabled? true
:allowed? (some? rejected)
:allowed? (not (some? rejected))
:headers {"x-rate-limit-remaining" remaining
"x-rate-limit-reset" reset}})))))
(defn- handle-response
[f cfg params rres]
(if (:enabled? rres)
(let [headers {"x-rate-limit-remaining" (:remaining rres)
"x-rate-limit-reset" (:reset rres)}]
(when-not (:allowed? rres)
[f cfg params result]
(if (:enabled? result)
(let [headers (:headers result)]
(when-not (:allowed? result)
(ex/raise :type :rate-limit
:code :request-blocked
:hint "rate limit reached"
::http/headers headers))
(-> (f cfg params)
(p/then (fn [response]
(with-meta response
{::http/headers headers})))))
(vary-meta response update ::http/headers merge headers)))))
(f cfg params)))
(defn wrap
[{:keys [rlimit redis] :as cfg} f mdata]
(let [skey (keyword (::rpc/type cfg) (->> mdata ::sv/spec name))
sname (dm/str (::rpc/type cfg) "." (->> mdata ::sv/spec name))
default-rresp (p/resolved {:enabled? false})]
(if (or (contains? cf/flags :rpc-rate-limit)
(contains? cf/flags :soft-rpc-rate-limit))
(if rlimit
(let [skey (keyword (::rpc/type cfg) (->> mdata ::sv/spec name))
sname (str (::rpc/type cfg) "." (->> mdata ::sv/spec name))]
(fn [cfg {:keys [::http/request] :as params}]
(let [user-id (or (:profile-id params)
(some-> request parse-client-ip)
uuid/zero)
(let [uid (or (:profile-id params)
(some-> request parse-client-ip)
uuid/zero)
rresp (when (and user-id @enabled?)
(when-let [limits (get-in @rlimit [::limits skey])]
(let [redis (redis/get-or-connect redis ::rlimit default-options)
limits (map #(assoc % ::service sname) limits)
rresp (-> (process-limits redis user-id limits (dt/now))
(p/catch (fn [cause]
;; If we have an error on processing the
;; rate-limit we just skip it for do not cause
;; service interruption because of redis downtime
;; or similar situation.
(l/error :hint "error on processing rate-limit" :cause cause)
{:enabled? false})))]
rsp (when (and uid @enabled?)
(when-let [limits (or (get-in @rlimit [::limits skey])
(get-in @rlimit [::limits :default]))]
(let [redis (redis/get-or-connect redis ::rlimit default-options)
limits (map #(assoc % ::service sname) limits)
resp (-> (process-limits redis uid limits (dt/now))
(p/catch (fn [cause]
;; If we have an error on processing the rate-limit we just skip
;; it for do not cause service interruption because of redis
;; downtime or similar situation.
(l/error :hint "error on processing rate-limit" :cause cause)
{:enabled? false})))]
;; If soft rate are enabled, we process the rate-limit but return
;; unprotected response.
(and (contains? cf/flags :soft-rpc-rate-limit) rresp))))]
;; If soft rate are enabled, we process the rate-limit but return unprotected
;; response.
(if (contains? cf/flags :soft-rpc-rlimit)
(p/resolved {:enabled? false})
resp))))
(p/then (or rresp default-rresp)
(partial handle-response f cfg params))))
f)))
rsp (or rsp (p/resolved {:enabled? false}))]
(p/then rsp (partial handle-response f cfg params)))))
f))
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; CONFIG WATCHER
@@ -335,7 +332,7 @@
::limits limits}))))
(defn- refresh-config
[{:keys [state path executor scheduler] :as params}]
[{:keys [state path executor scheduled-executor] :as params}]
(letfn [(update-config [{:keys [::updated-at] :as state}]
(let [updated-at' (fs/last-modified-time path)]
(merge state
@@ -350,7 +347,7 @@
state)))))
(schedule-next [state]
(px/schedule! scheduler
(px/schedule! scheduled-executor
(inst-ms (::refresh state))
(partial refresh-config params))
state)]
@@ -361,7 +358,7 @@
(defn- on-refresh-error
[_ cause]
(when-not (instance? java.util.concurrent.RejectedExecutionException cause)
(if-let [explain (-> cause ex-data us/pretty-explain)]
(if-let [explain (-> cause ex-data ex/explain)]
(l/warn ::l/raw (str "unable to refresh config, invalid format:\n" explain)
::l/async false)
(l/warn :hint "unexpected exception on loading config"
@@ -374,22 +371,22 @@
(and (fs/exists? path) (fs/regular-file? path) path)))
(defmethod ig/pre-init-spec :app.rpc/rlimit [_]
(s/keys :req-un [::wrk/executor ::wrk/scheduler]))
(s/keys :req-un [::wrk/executor ::wrk/scheduled-executor]))
(defmethod ig/init-key :app.rpc/rlimit
(defmethod ig/init-key ::rpc/rlimit
[_ {:keys [executor] :as params}]
(let [state (agent {})]
(when (contains? cf/flags :rpc-rlimit)
(let [state (agent {})]
(set-error-handler! state on-refresh-error)
(set-error-mode! state :continue)
(set-error-handler! state on-refresh-error)
(set-error-mode! state :continue)
(when-let [path (get-config-path)]
(l/info :hint "initializing rlimit config reader" :path (str path))
(when-let [path (get-config-path)]
(l/info :hint "initializing rlimit config reader" :path (str path))
;; Initialize the state with initial refresh value
(send-via executor state (constantly {::refresh (dt/duration "5s")}))
;; Initialize the state with initial refresh value
(send-via executor state (constantly {::refresh (dt/duration "5s")}))
;; Force a refresh
(refresh-config (assoc params :path path :state state)))
;; Force a refresh
(refresh-config (assoc params :path path :state state)))
state))
state)))
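With the reworked handle-response the rate-limit result map drives everything: when limiting is disabled the wrapped handler runs untouched, and when a limit was exceeded the request is rejected before the handler is ever called. A rough sketch of both paths (the stand-in handler and result maps are illustrative only):

(let [f      (fn [_cfg _params] {:ok true}) ;; stand-in RPC handler
      cfg    {}
      params {}]
  ;; rate limiting disabled or not evaluated: the handler result passes through
  (handle-response f cfg params {:enabled? false})
  ;; => {:ok true}

  ;; limit evaluated and exceeded: raises :rate-limit / :request-blocked,
  ;; propagating the headers, without calling f
  (handle-response f cfg params
                   {:enabled? true
                    :allowed? false
                    :headers {"x-rate-limit-remaining" "0"
                              "x-rate-limit-reset" "1670000000"}}))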

View File

@@ -1,149 +0,0 @@
;; This Source Code Form is subject to the terms of the Mozilla Public
;; License, v. 2.0. If a copy of the MPL was not distributed with this
;; file, You can obtain one at http://mozilla.org/MPL/2.0/.
;;
;; Copyright (c) KALEIDOS INC
(ns app.rpc.semaphore
"Resource usage limits (in other words: semaphores)."
(:require
[app.common.data :as d]
[app.common.logging :as l]
[app.common.spec :as us]
[app.config :as cf]
[app.metrics :as mtx]
[app.rpc :as-alias rpc]
[app.util.locks :as locks]
[app.util.time :as ts]
[app.worker :as-alias wrk]
[clojure.spec.alpha :as s]
[integrant.core :as ig]
[promesa.core :as p]))
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; ASYNC SEMAPHORE IMPL
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(defprotocol IAsyncSemaphore
(acquire! [_])
(release! [_ tp]))
(defn create
[& {:keys [permits metrics name executor]}]
(let [used (volatile! 0)
queue (volatile! (d/queue))
labels (into-array String [(d/name name)])
lock (locks/create)
permits (or permits Long/MAX_VALUE)]
(when (>= permits Long/MAX_VALUE)
(l/warn :hint "permits value too hight" :permits permits :semaphore name))
^{::wrk/executor executor
::name name}
(reify IAsyncSemaphore
(acquire! [_]
(let [d (p/deferred)]
(locks/locking lock
(if (< @used permits)
(do
(vswap! used inc)
(p/resolve! d))
(vswap! queue conj d)))
(mtx/run! metrics
:id :semaphore-used-permits
:val @used
:labels labels)
(mtx/run! metrics
:id :semaphore-queued-submissions
:val (count @queue)
:labels labels)
d))
(release! [_ tp]
(locks/locking lock
(if-let [item (peek @queue)]
(do
(vswap! queue pop)
(p/resolve! item))
(when (pos? @used)
(vswap! used dec))))
(mtx/run! metrics
:id :semaphore-timing
:val (inst-ms (tp))
:labels labels)
(mtx/run! metrics
:id :semaphore-used-permits
:val @used
:labels labels)
(mtx/run! metrics
:id :semaphore-queued-submissions
:val (count @queue)
:labels labels)))))
(defn semaphore?
[v]
(satisfies? IAsyncSemaphore v))
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; PREDEFINED SEMAPHORES
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(s/def ::semaphore semaphore?)
(s/def ::semaphores
(s/map-of ::us/keyword ::semaphore))
(defmethod ig/pre-init-spec ::rpc/semaphores [_]
(s/keys :req-un [::mtx/metrics]))
(defn- create-default-semaphores
[metrics executor]
[(create :permits (cf/get :semaphore-process-font)
:metrics metrics
:name :process-font
:executor executor)
(create :permits (cf/get :semaphore-update-file)
:metrics metrics
:name :update-file
:executor executor)
(create :permits (cf/get :semaphore-process-image)
:metrics metrics
:name :process-image
:executor executor)
(create :permits (cf/get :semaphore-auth)
:metrics metrics
:name :auth
:executor executor)])
(defmethod ig/init-key ::rpc/semaphores
[_ {:keys [metrics executor]}]
(->> (create-default-semaphores metrics executor)
(d/index-by (comp ::name meta))))
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; PUBLIC API
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(defmacro with-dispatch
[queue & body]
`(let [tpoint# (ts/tpoint)
queue# ~queue
executor# (-> queue# meta ::wrk/executor)]
(-> (acquire! queue#)
(p/then (fn [_#] ~@body) executor#)
(p/finally (fn [_# _#]
(release! queue# tpoint#))))))
(defn wrap
[{:keys [semaphores]} f {:keys [::queue]}]
(let [queue' (get semaphores queue)]
(if (semaphore? queue')
(fn [cfg params]
(with-dispatch queue'
(f cfg params)))
(do
(when (some? queue)
(l/warn :hint "undefined semaphore" :name queue))
f))))
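For reference, the now-removed semaphore was consumed through the with-dispatch macro; a hypothetical sketch of a caller, with metrics, executor and the render function passed in from outside (none of these caller-side names come from the changeset):

(defn render-with-limit
  "Illustrative only: cap concurrent renders at 4 permits using the removed
   semaphore API; the permit is released when the body finishes."
  [metrics executor render-fn]
  (let [sem (create :permits 4
                    :metrics metrics
                    :name :render
                    :executor executor)]
    (with-dispatch sem
      (render-fn))))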

View File

@@ -8,8 +8,10 @@
"Initial data setup of instance."
(:require
[app.common.logging :as l]
[app.common.spec :as us]
[app.common.uuid :as uuid]
[app.db :as db]
[app.main :as-alias main]
[app.setup.builtin-templates]
[app.setup.keys :as keys]
[buddy.core.codecs :as bc]
@@ -48,6 +50,9 @@
:cause cause))))
instance-id)))
(s/def ::main/props
(s/map-of ::us/keyword some?))
(defmethod ig/pre-init-spec ::props [_]
(s/keys :req-un [::db/pool]))
@@ -57,7 +62,7 @@
(db/xact-lock! conn 0)
(when-not key
(l/warn :hint (str "using autogenerated secret-key, it will change on each restart and will invalidate "
"all sessions on each restart, it is hightly recommeded setting up the "
"all sessions on each restart, it is hightly recommended setting up the "
"PENPOT_SECRET_KEY environment variable")))
(let [secret (or key (generate-random-key))]

View File

@@ -5,7 +5,7 @@
;; Copyright (c) KALEIDOS INC
(ns app.setup.builtin-templates
"A service/module that is reponsible for download, load & internally
"A service/module that is responsible for download, load & internally
expose a set of builtin penpot file templates."
(:require
[app.common.logging :as l]
@@ -29,10 +29,8 @@
(s/keys :req-un [::id ::name ::thumbnail-uri ::file-uri]
:opt-un [::path]))
(s/def ::http-client ::http/client)
(defmethod ig/pre-init-spec :app.setup/builtin-templates [_]
(s/keys :req-un [::http-client]))
(s/keys :req [::http/client]))
(defmethod ig/init-key :app.setup/builtin-templates
[_ cfg]
@@ -43,7 +41,7 @@
(defn- download-preset!
[cfg {:keys [path file-uri] :as preset}]
(let [response (http/req! (:http-client cfg)
(let [response (http/req! cfg
{:method :get
:uri file-uri}
{:response-type :input-stream

View File

@@ -30,6 +30,8 @@
[cuerdas.core :as str]
[expound.alpha :as expound]))
(def ^:dynamic *conn*)
(defn reset-password!
"Reset a password to a specific one for a concrete user or all users
if email is `:all` keyword."
@@ -66,17 +68,21 @@
(db/with-atomic [conn (:app.db/pool system)]
(let [file (db/get-by-id conn :file id {:for-update true})
file (-> file
(update :features db/decode-pgarray #{})
(update :data blob/decode)
(cond-> migrate? (update :data pmg/migrate-data))
(update :data update-fn)
(update :data blob/encode)
(cond-> inc-revn? (update :revn inc)))]
(cond-> migrate? (update :data pmg/migrate-data)))
file (binding [*conn* conn]
(-> (update-fn file)
(cond-> inc-revn? (update :revn inc))))]
(when save?
(db/update! conn :file
{:data (:data file)
:revn (:revn file)}
{:id (:id file)}))
(update file :data blob/decode))))
(let [features (db/create-array conn "text" (:features file))
data (blob/encode (:data file))]
(db/update! conn :file
{:data data
:revn (:revn file)
:features features}
{:id id})))
file)))
(def ^:private sql:retrieve-files-chunk
"SELECT id, name, created_at, data FROM file
@@ -122,27 +128,12 @@
(on-end state)
state))))))
(defn analyze-file-data
[system & {:keys [id on-form on-data]}]
(let [file (get-file system id)]
(cond
(fn? on-data)
(on-data (:data file))
(fn? on-form)
(walk/postwalk (fn [form]
(on-form form)
form)
(:data file)))
nil))
(defn update-pages
"Apply a function to all pages of one file. The function receives a page and returns an updated page."
[data f]
(update data :pages-index d/update-vals f))
(update data :pages-index update-vals f))
(defn update-shapes
"Apply a function to all shapes of one page The function receives a shape and returns an updated shape"
[page f]
(update page :objects d/update-vals f))
(update page :objects update-vals f))
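The two helpers above compose naturally with update-file!; a minimal sketch that removes a hypothetical :tmp-flag attribute from every shape of every page (the attribute name is made up for illustration):

(defn strip-tmp-flag
  "Illustrative only: walk every page and every shape of a file's data and
   drop the hypothetical :tmp-flag attribute."
  [data]
  (update-pages data
                (fn [page]
                  (update-shapes page #(dissoc % :tmp-flag)))))

;; applied through the reworked update-file!, which now passes the whole file
;; to :update-fn:
;; (update-file! system :id file-id :update-fn #(update % :data strip-tmp-flag) :save? false)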

View File

@@ -16,7 +16,11 @@
[app.rpc.queries.profile :as profile]
[app.srepl.fixes :as f]
[app.srepl.helpers :as h]
[app.util.blob :as blob]
[app.util.objects-map :as omap]
[app.util.pointer-map :as pmap]
[app.util.time :as dt]
[app.worker :as wrk]
[clojure.pprint :refer [pprint]]
[cuerdas.core :as str]))
@@ -34,6 +38,16 @@
(task-fn params)
(println (format "no task '%s' found" name))))))
(defn schedule-task!
([system name]
(schedule-task! system name {}))
([system name props]
(let [pool (:app.db/pool system)]
(wrk/submit!
::wrk/conn pool
::wrk/task name
::wrk/props props))))
(defn send-test-email!
[system destination]
(us/verify!
@@ -62,31 +76,18 @@
(cmd.auth/send-email-verification! pool sprops profile)
:email-sent))
(defn update-profile!
"Update a limited set of profile attrs."
[system & {:keys [email id active? deleted? blocked?]}]
(us/verify!
:expr (some? system)
:hint "system should be provided")
(us/verify!
:expr (or (string? email) (uuid? id))
:hint "email or id should be provided")
(let [params (cond-> {}
(true? active?) (assoc :is-active true)
(false? active?) (assoc :is-active false)
(true? deleted?) (assoc :deleted-at (dt/now))
(true? blocked?) (assoc :is-blocked true)
(false? blocked?) (assoc :is-blocked false))
opts (cond-> {}
(some? email) (assoc :email (str/lower email))
(some? id) (assoc :id id))]
(db/with-atomic [conn (:app.db/pool system)]
(some-> (db/update! conn :profile params opts)
(profile/decode-profile-row)))))
(defn mark-profile-as-active!
"Mark the profile blocked and removes all the http sessiones
associated with the profile-id."
[system email]
(db/with-atomic [conn (:app.db/pool system)]
(when-let [profile (db/get-by-params conn :profile
{:email (str/lower email)}
{:columns [:id :email]
:check-not-found false})]
(when-not (:is-blocked profile)
(db/update! conn :profile {:is-active true} {:id (:id profile)})
:activated))))
(defn mark-profile-as-blocked!
"Mark the profile blocked and removes all the http sessiones
@@ -101,3 +102,54 @@
(db/update! conn :profile {:is-blocked true} {:id (:id profile)})
(db/delete! conn :http-session {:profile-id (:id profile)})
:blocked))))
(defn enable-objects-map-feature-on-file!
[system & {:keys [save? id]}]
(letfn [(update-file [{:keys [features] :as file}]
(if (contains? features "storage/objects-map")
file
(-> file
(update :data migrate-to-omap)
(update :features conj "storage/objects-map"))))
(migrate-to-omap [data]
(-> data
(update :pages-index update-vals #(update % :objects omap/wrap))
(update :components update-vals #(update % :objects omap/wrap))))]
(h/update-file! system
:id id
:update-fn update-file
:save? save?)))
(defn enable-pointer-map-feature-on-file!
[system & {:keys [save? id]}]
(letfn [(update-file [{:keys [features id] :as file}]
(if (contains? features "storage/pointer-map")
file
(-> file
(update :data migrate-to-omap id)
(update :features conj "storage/pointer-map"))))
(migrate-to-omap [data file-id]
(binding [pmap/*tracked* (atom {})]
(let [data (-> data
(update :pages-index update-vals pmap/wrap)
(update :components pmap/wrap))]
(doseq [[id item] @pmap/*tracked*]
(db/insert! h/*conn* :file-data-fragment
{:id id
:file-id file-id
:content (-> item deref blob/encode)}))
data)))]
(h/update-file! system
:id id
:update-fn update-file
:save? save?)))
(defn enable-storage-features-on-file!
[system & {:as params}]
(enable-objects-map-feature-on-file! system params)
(enable-pointer-map-feature-on-file! system params))
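Taken together, these srepl helpers are meant to be run against the live system map; some illustrative (commented) invocations, where system is the running integrant system and file-id a file uuid — the :file-gc task keyword is an assumption, not something this diff adds:

;; enqueue a background task through the worker
;; (schedule-task! system :file-gc)

;; dry-run both storage features on a single file, without persisting
;; (enable-storage-features-on-file! system :id file-id :save? false)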

View File

@@ -371,7 +371,7 @@
(db/create-array conn "uuid" ids)]))
;; NOTE: A getter that retrieves the key which will be used
;; for group ids; previoulsy we have no value, then we
;; for group ids; previously we have no value, then we
;; introduced the `:reference` prop, and then it is renamed
;; to `:bucket` and now is string instead. This is
;; implemented in this way for backward compatibility.

View File

@@ -75,8 +75,10 @@
(defmethod impl/get-object-bytes :fs
[backend object]
(p/let [input (impl/get-object-data backend object)]
(ex/with-always (io/close! input)
(io/read-as-bytes input))))
(try
(io/read-as-bytes input)
(finally
(io/close! input)))))
(defmethod impl/get-object-url :fs
[{:keys [uri executor] :as backend} {:keys [id] :as object} _]

View File

@@ -6,12 +6,13 @@
(ns app.storage.tmp
"Temporal files service all created files will be tried to clean after
1 hour afrer creation. This is a best effort, if this process fails,
1 hour after creation. This is a best effort, if this process fails,
the operating system cleaning task should be responsible of
permanently delete these files (look at systemd-tempfiles)."
(:require
[app.common.data :as d]
[app.common.logging :as l]
[app.storage :as-alias sto]
[app.util.time :as dt]
[app.worker :as wrk]
[clojure.core.async :as a]
@@ -23,43 +24,43 @@
(declare remove-temp-file)
(defonce queue (a/chan 128))
(s/def ::min-age ::dt/duration)
(defmethod ig/pre-init-spec ::cleaner [_]
(s/keys :req-un [::min-age ::wrk/scheduler ::wrk/executor]))
(s/keys :req [::sto/min-age ::wrk/scheduled-executor]))
(defmethod ig/prep-key ::cleaner
[_ cfg]
(merge {:min-age (dt/duration {:minutes 30})}
(merge {::sto/min-age (dt/duration "30m")}
(d/without-nils cfg)))
(defmethod ig/init-key ::cleaner
[_ {:keys [scheduler executor min-age] :as cfg}]
(l/info :hint "starting tempfile cleaner service")
(let [cch (a/chan)]
(a/go-loop []
(let [[path port] (a/alts! [queue cch])]
(when (not= port cch)
[_ {:keys [::sto/min-age ::wrk/scheduled-executor] :as cfg}]
(px/thread
{:name "penpot/storage-tmp-cleaner"}
(try
(l/info :hint "started tmp file cleaner")
(loop []
(when-let [path (a/<!! queue)]
(l/trace :hint "schedule tempfile deletion" :path path
:expires-at (dt/plus (dt/now) min-age))
(px/schedule! scheduler
(px/schedule! scheduled-executor
(inst-ms min-age)
(partial remove-temp-file executor path))
(recur))))
cch))
(partial remove-temp-file path))
(recur)))
(catch InterruptedException _
(l/debug :hint "interrupted"))
(finally
(l/info :hint "terminated tmp file cleaner")))))
(defmethod ig/halt-key! ::cleaner
[_ close-ch]
(l/info :hint "stoping tempfile cleaner service")
(some-> close-ch a/close!))
[_ thread]
(px/interrupt! thread))
(defn- remove-temp-file
"Permanently delete tempfile"
[executor path]
(px/with-dispatch executor
(l/trace :hint "permanently delete tempfile" :path path)
(when (fs/exists? path)
(fs/delete path))))
[path]
(l/trace :hint "permanently delete tempfile" :path path)
(when (fs/exists? path)
(fs/delete path)))
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; API

View File

@@ -13,14 +13,16 @@
[app.common.data :as d]
[app.common.logging :as l]
[app.common.pages.migrations :as pmg]
[app.common.types.file :as ctf]
[app.common.types.shape-tree :as ctt]
[app.config :as cf]
[app.db :as db]
[app.rpc.commands.files :as files]
[app.util.blob :as blob]
[app.util.pointer-map :as pmap]
[app.util.time :as dt]
[clojure.set :as set]
[clojure.spec.alpha :as s]
[cuerdas.core :as str]
[integrant.core :as ig]))
(declare ^:private retrieve-candidates)
@@ -42,10 +44,10 @@
(defmethod ig/init-key ::handler
[_ {:keys [pool] :as cfg}]
(fn [params]
(fn [{:keys [file-id] :as params}]
(db/with-atomic [conn pool]
(let [min-age (or (:min-age params) (:min-age cfg))
cfg (assoc cfg :min-age min-age :conn conn)]
cfg (assoc cfg :min-age min-age :conn conn :file-id file-id)]
(loop [total 0
files (retrieve-candidates cfg)]
(if-let [file (first files)]
@@ -54,7 +56,7 @@
(recur (inc total)
(rest files)))
(do
(l/info :hint "task finished" :min-age (dt/format-duration min-age) :total total)
(l/info :hint "task finished" :min-age (dt/format-duration min-age) :processed total)
;; Allow optional rollback passed by params
(when (:rollback? params)
@@ -71,6 +73,7 @@
"select f.id,
f.data,
f.revn,
f.features,
f.modified_at
from file as f
where f.has_media_trimmed is false
@@ -81,21 +84,26 @@
for update skip locked")
(defn- retrieve-candidates
[{:keys [conn min-age id] :as cfg}]
(if id
[{:keys [conn min-age file-id] :as cfg}]
(if (uuid? file-id)
(do
(l/warn :hint "explicit file id passed on params" :id id)
(db/query conn :file {:id id}))
(l/warn :hint "explicit file id passed on params" :file-id file-id)
(->> (db/query conn :file {:id file-id})
(map #(update % :features db/decode-pgarray #{}))))
(let [interval (db/interval min-age)
get-chunk (fn [cursor]
(let [rows (db/exec! conn [sql:retrieve-candidates-chunk interval cursor])]
[(some->> rows peek :modified-at) (seq rows)]))]
[(some->> rows peek :modified-at)
(map #(update % :features db/decode-pgarray #{}) rows)]))]
(d/iteration get-chunk
:vf second
:kf first
:initk (dt/now)))))
(defn collect-used-media
"Analyzes the file data and collects all references to external
assets. Returns a set of ids."
[data]
(let [xform (comp
(map :objects)
@@ -151,9 +159,8 @@
unused (set/difference stored using)]
(when (seq unused)
(let [sql (str/concat
"delete from file_object_thumbnail "
" where file_id=? and object_id=ANY(?)")
(let [sql (str "delete from file_object_thumbnail "
" where file_id=? and object_id=ANY(?)")
res (db/exec-one! conn [sql file-id (db/create-array conn "text" unused)])]
(l/debug :hint "delete file object thumbnails" :file-id file-id :total (:next.jdbc/update-count res))))))
@@ -162,22 +169,111 @@
(let [sql (str "delete from file_thumbnail "
" where file_id=? and revn < ?")
res (db/exec-one! conn [sql file-id revn])]
(l/debug :hint "delete file thumbnails" :file-id file-id :total (:next.jdbc/update-count res))))
(when-not (zero? (:next.jdbc/update-count res))
(l/debug :hint "delete file thumbnails" :file-id file-id :total (:next.jdbc/update-count res)))))
(def ^:private
sql:retrieve-client-files
"select f.data, f.modified_at
from file as f
left join file_library_rel as fl on (fl.file_id = f.id)
where fl.library_file_id = ?
and f.modified_at < ?
and f.deleted_at is null
order by f.modified_at desc
limit 1")
(defn- retrieve-client-files
"search al files that use the given library.
Returns a sequence of file-data (only reads database rows one by one)."
[conn library-id]
(let [get-chunk (fn [cursor]
(let [rows (db/exec! conn [sql:retrieve-client-files library-id cursor])]
[(some-> rows peek :modified-at)
(map (comp blob/decode :data) rows)]))]
(d/iteration get-chunk
:vf second
:kf first
:initk (dt/now))))
(defn- clean-deleted-components!
"Performs the garbage collection of unreferenced deleted components."
[conn library-id library-data]
(let [find-used-components-file
(fn [components file-data]
; Find which of the components are used in the file.
(into #{}
(filter #(ctf/used-in? file-data library-id % :component))
components))
find-used-components
(fn [components files-data]
; Find what components are used in any of the files.
(loop [files-data files-data
components components
used-components #{}]
(let [file-data (first files-data)]
(if (or (nil? file-data) (empty? components))
used-components
(let [used-components-file (find-used-components-file components file-data)]
(recur (rest files-data)
(into #{} (remove used-components-file) components)
(into used-components used-components-file)))))))
deleted-components (set (vals (:deleted-components library-data)))
saved-components (find-used-components deleted-components
(cons library-data
(retrieve-client-files conn library-id)))
new-deleted-components (d/index-by :id (vec saved-components))
total (- (count deleted-components)
(count saved-components))]
(when-not (zero? total)
(l/debug :hint "clean deleted components" :total total)
(let [new-data (-> library-data
(assoc :deleted-components new-deleted-components)
(blob/encode))]
(db/update! conn :file
{:data new-data}
{:id library-id})))))
(def ^:private sql:get-unused-fragments
"SELECT id FROM file_data_fragment
WHERE file_id = ? AND id != ALL(?::uuid[])")
(defn- clean-data-fragments!
[conn file-id data]
(let [used (->> (concat (vals data)
(vals (:pages-index data)))
(into #{} (comp (filter pmap/pointer-map?)
(map pmap/get-id)))
(db/create-array conn "uuid"))
rows (db/exec! conn [sql:get-unused-fragments file-id used])]
(doseq [fragment-id (map :id rows)]
(l/trace :hint "remove unused file data fragment" :id (str fragment-id))
(db/delete! conn :file-data-fragment {:id fragment-id :file-id file-id}))))
(defn- process-file
[{:keys [conn] :as cfg} {:keys [id data revn modified-at] :as file}]
[{:keys [conn] :as cfg} {:keys [id data revn modified-at features] :as file}]
(l/debug :hint "processing file" :id id :modified-at modified-at)
(let [data (-> (blob/decode data)
(assoc :id id)
(pmg/migrate-data))]
(binding [pmap/*load-fn* (partial files/load-pointer conn id)]
(let [data (-> (blob/decode data)
(assoc :id id)
(pmg/migrate-data))]
(clean-file-media! conn id data)
(clean-file-frame-thumbnails! conn id data)
(clean-file-thumbnails! conn id revn)
(clean-file-media! conn id data)
(clean-file-frame-thumbnails! conn id data)
(clean-file-thumbnails! conn id revn)
(clean-deleted-components! conn id data)
;; Mark file as trimmed
(db/update! conn :file
(when (contains? features "storage/pointer-map")
(clean-data-fragments! conn id data))
;; Mark file as trimmed
(db/update! conn :file
{:has-media-trimmed true}
{:id id})
nil))
nil)))
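Both retrieve-candidates and the new retrieve-client-files lean on the same chunked-cursor idiom built around d/iteration: fetch a page of rows, yield them, and use the last row's modified-at as the cursor for the next page. Stripped of the SQL, the shape looks roughly like this (fetch-chunk is a hypothetical query function, not part of the changeset):

(defn- lazy-rows
  "Illustrative sketch: lazily walk rows ordered by modified_at descending,
   one chunk per step, threading the last timestamp as the next cursor."
  [conn fetch-chunk]
  (let [get-chunk (fn [cursor]
                    (let [rows (fetch-chunk conn cursor)]
                      [(some-> rows peek :modified-at) (seq rows)]))]
    (d/iteration get-chunk
                 :vf second    ;; the value of each step: the chunk of rows
                 :kf first     ;; the key for the next step: the cursor
                 :initk (dt/now))))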

View File

@@ -158,7 +158,7 @@
(recur (rest tables)
(+ total (process-table (assoc cfg :table table))))
(do
(l/info :hint "objects gc finished succesfully"
(l/info :hint "objects gc finished successfully"
:min-age (dt/format-duration min-age)
:total total)

View File

@@ -11,13 +11,14 @@
(:require
[app.common.data :as d]
[app.common.exceptions :as ex]
[app.common.spec :as us]
[app.config :as cf]
[app.db :as db]
[app.util.async :refer [thread-sleep]]
[app.http.client :as http]
[app.main :as-alias main]
[app.util.json :as json]
[clojure.spec.alpha :as s]
[integrant.core :as ig]))
[integrant.core :as ig]
[promesa.exec :as px]))
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; TASK ENTRY POINT
@@ -28,18 +29,13 @@
(declare get-subscriptions-newsletter-updates)
(declare get-subscriptions-newsletter-news)
(s/def ::http-client fn?)
(s/def ::version ::us/string)
(s/def ::uri ::us/string)
(s/def ::instance-id ::us/uuid)
(s/def ::sprops
(s/keys :req-un [::instance-id]))
(defmethod ig/pre-init-spec ::handler [_]
(s/keys :req-un [::db/pool ::http-client ::version ::uri ::sprops]))
(s/keys :req [::http/client
::db/pool
::main/props]))
(defmethod ig/init-key ::handler
[_ {:keys [pool sprops version] :as cfg}]
[_ {:keys [::db/pool ::main/props] :as cfg}]
(fn [{:keys [send? enabled?] :or {send? true enabled? false}}]
(let [subs {:newsletter-updates (get-subscriptions-newsletter-updates pool)
:newsletter-news (get-subscriptions-newsletter-news pool)}
@@ -48,15 +44,15 @@
(cf/get :telemetry-enabled))
data {:subscriptions subs
:version version
:instance-id (:instance-id sprops)}]
:version (:full cf/version)
:instance-id (:instance-id props)}]
(cond
;; If we have telemetry enabled, then proceed the normal
;; operation.
enabled?
(let [data (merge data (get-stats pool))]
(when send?
(thread-sleep (rand-int 10000))
(px/sleep (rand-int 10000))
(send! cfg data))
data)
@@ -68,7 +64,7 @@
(seq subs)
(do
(when send?
(thread-sleep (rand-int 10000))
(px/sleep (rand-int 10000))
(send! cfg data))
data)
@@ -80,12 +76,13 @@
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(defn- send!
[{:keys [http-client uri] :as cfg} data]
(let [response (http-client {:method :post
:uri uri
:headers {"content-type" "application/json"}
:body (json/write-str data)}
{:sync? true})]
[cfg data]
(let [response (http/req! cfg
{:method :post
:uri (cf/get :telemetry-uri)
:headers {"content-type" "application/json"}
:body (json/encode-str data)}
{:sync? true})]
(when (> (:status response) 206)
(ex/raise :type :internal
:code :invalid-response

View File

@@ -41,7 +41,7 @@
(defn encode
([data] (encode data nil))
([data {:keys [version]}]
(let [version (or version (cf/get :default-blob-version 4))]
(let [version (or version (cf/get :default-blob-version 5))]
(case (long version)
1 (encode-v1 data)
3 (encode-v3 data)
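From a caller's perspective the bump only changes the default; older formats can still be produced by passing the option explicitly. An illustrative usage, assuming the blob alias used elsewhere in this diff:

(blob/encode {:hello "world"})               ;; :default-blob-version from config, falling back to 5
(blob/encode {:hello "world"} {:version 3})  ;; pin an older blob format explicitly
;; (blob/decode bytes) returns the original data regardless of the version used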

View File

@@ -1,31 +0,0 @@
;; This Source Code Form is subject to the terms of the Mozilla Public
;; License, v. 2.0. If a copy of the MPL was not distributed with this
;; file, You can obtain one at http://mozilla.org/MPL/2.0/.
;;
;; Copyright (c) KALEIDOS INC
(ns app.util.closeable
"A closeable abstraction. A drop in replacement for
clojure builtin `with-open` syntax abstraction."
(:refer-clojure :exclude [with-open]))
(defprotocol ICloseable
(-close [_] "Close the resource."))
(defmacro with-open
[bindings & body]
{:pre [(vector? bindings)
(even? (count bindings))
(pos? (count bindings))]}
(reduce (fn [acc bindings]
`(let ~(vec bindings)
(try
~acc
(finally
(-close ~(first bindings))))))
`(do ~@body)
(reverse (partition 2 bindings))))
(extend-protocol ICloseable
java.lang.AutoCloseable
(-close [this] (.close this)))

View File

@@ -6,6 +6,7 @@
(ns app.util.fressian
(:require
[app.common.data :as d]
[app.common.geom.matrix :as gmt]
[app.common.geom.point :as gpt]
[clojure.data.fressian :as fres])
@@ -17,14 +18,13 @@
java.io.ByteArrayOutputStream
java.time.Instant
java.time.OffsetDateTime
java.util.List
org.fressian.Reader
org.fressian.StreamingWriter
org.fressian.Writer
org.fressian.handlers.ReadHandler
org.fressian.handlers.WriteHandler))
;; --- MISC
(set! *warn-on-reflection* true)
(defn str->bytes
@@ -33,231 +33,250 @@
([^String s, ^String encoding]
(.getBytes s encoding)))
;; --- LOW LEVEL FRESSIAN API
(defn write-object!
([^Writer w ^Object o]
(.writeObject w o))
([^Writer w ^Object o ^Boolean cache?]
(.writeObject w o cache?)))
(defn read-object!
[^Reader r]
(.readObject r))
(defn write-tag!
([^Writer w ^String n]
(.writeTag w n 1))
([^Writer w ^String n ^long ni]
(.writeTag w n ni)))
(defn write-bytes!
[^Writer w ^bytes data]
(.writeBytes w data))
(defn write-int!
[^Writer w ^long val]
(.writeInt w val))
(defn write-list!
[^Writer w ^List val]
(.writeList w val))
;; --- READ AND WRITE HANDLERS
(defn read-symbol
[r]
(symbol (read-object! r)
(read-object! r)))
(defn read-keyword
[r]
(keyword (read-object! r)
(read-object! r)))
(defn write-named
[tag ^Writer w s]
(.writeTag w tag 2)
(.writeObject w (namespace s) true)
(.writeObject w (name s) true))
(write-tag! w tag 2)
(write-object! w (namespace s) true)
(write-object! w (name s) true))
(defn write-list-like
([^Writer w tag o]
(.writeTag w tag 1)
(.writeList w o)))
[tag ^Writer w o]
(write-tag! w tag 1)
(write-list! w o))
(defn read-list-like
[^Reader rdr build-fn]
(build-fn (.readObject rdr)))
(defn begin-closed-list!
[^StreamingWriter w]
(.beginClosedList w))
(defn end-list!
[^StreamingWriter w]
(.endList w))
(defn write-map-like
"Writes a map as Fressian with the tag 'map' and all keys cached."
[^Writer w tag m]
(.writeTag w tag 1)
(.beginClosedList ^StreamingWriter w)
[tag ^Writer w m]
(write-tag! w tag 1)
(begin-closed-list! w)
(loop [items (seq m)]
(when-let [^clojure.lang.MapEntry item (first items)]
(.writeObject w (.key item) true)
(.writeObject w (.val item))
(write-object! w (.key item) true)
(write-object! w (.val item))
(recur (rest items))))
(.endList ^StreamingWriter w))
(end-list! w))
(defn read-map-like
[^Reader rdr]
(let [kvs ^java.util.List (.readObject rdr)]
(let [kvs ^java.util.List (read-object! rdr)]
(if (< (.size kvs) 16)
(clojure.lang.PersistentArrayMap. (.toArray kvs))
(clojure.lang.PersistentHashMap/create (seq kvs)))))
(def write-handlers
{ Character
{"char"
(reify WriteHandler
(write [_ w ch]
(.writeTag w "char" 1)
(.writeInt w (int ch))))}
(def ^:dynamic *write-handler-lookup* nil)
(def ^:dynamic *read-handler-lookup* nil)
app.common.geom.point.Point
{"penpot/point"
(reify WriteHandler
(write [_ w o]
(.writeTag ^Writer w "penpot/point" 1)
(.writeList ^Writer w (java.util.List/of (.-x ^Point o) (.-y ^Point o)))))}
(def write-handlers (atom {}))
(def read-handlers (atom {}))
app.common.geom.matrix.Matrix
{"penpot/matrix"
(reify WriteHandler
(write [_ w o]
(.writeTag ^Writer w "penpot/matrix" 1)
(.writeList ^Writer w (java.util.List/of (.-a ^Matrix o)
(.-b ^Matrix o)
(.-c ^Matrix o)
(.-d ^Matrix o)
(.-e ^Matrix o)
(.-f ^Matrix o)))))}
(defn add-handlers!
[& handlers]
(letfn [(adapt-write-handler [{:keys [name class wfn]}]
[class {name (reify WriteHandler
(write [_ w o]
(wfn name w o)))}])
Instant
{"java/instant"
(reify WriteHandler
(write [_ w ch]
(.writeTag w "java/instant" 1)
(.writeInt w (.toEpochMilli ^Instant ch))))}
(adapt-read-handler [{:keys [name rfn]}]
[name (reify ReadHandler
(read [_ rdr _ _]
(rfn rdr)))])
OffsetDateTime
{"java/instant"
(reify WriteHandler
(write [_ w ch]
(.writeTag w "java/instant" 1)
(.writeInt w (.toEpochMilli ^Instant (.toInstant ^OffsetDateTime ch)))))}
(merge-and-clean [m1 m2]
(-> (merge m1 m2)
(d/without-nils)))]
Ratio
{"ratio"
(reify WriteHandler
(write [_ w n]
(.writeTag w "ratio" 2)
(.writeObject w (.numerator ^Ratio n))
(.writeObject w (.denominator ^Ratio n))))}
(let [whs (into {}
(comp
(filter :wfn)
(map adapt-write-handler))
handlers)
rhs (into {}
(comp
(filter :rfn)
(map adapt-read-handler))
handlers)
cwh (swap! write-handlers merge-and-clean whs)
crh (swap! read-handlers merge-and-clean rhs)]
clojure.lang.IPersistentMap
{"clj/map"
(reify WriteHandler
(write [_ w d]
(write-map-like w "clj/map" d)))}
(alter-var-root #'*write-handler-lookup* (constantly (-> cwh fres/associative-lookup fres/inheritance-lookup)))
(alter-var-root #'*read-handler-lookup* (constantly (-> crh fres/associative-lookup)))
nil)))
clojure.lang.Keyword
{"clj/keyword"
(reify WriteHandler
(write [_ w s]
(write-named "clj/keyword" w s)))}
(defn write-char
[n w o]
(write-tag! w n 1)
(write-int! w o))
clojure.lang.BigInt
{"bigint"
(reify WriteHandler
(write [_ w d]
(let [^BigInteger bi (if (instance? clojure.lang.BigInt d)
(.toBigInteger ^clojure.lang.BigInt d)
d)]
(.writeTag w "bigint" 1)
(.writeBytes w (.toByteArray bi)))))}
(defn read-char
[rdr]
(char (read-object! rdr)))
;; Persistent set
clojure.lang.IPersistentSet
{"clj/set"
(reify WriteHandler
(write [_ w o]
(write-list-like w "clj/set" o)))}
(defn write-instant
[n w o]
(write-tag! w n 1)
(write-int! w (.toEpochMilli ^Instant o)))
;; Persistent vector
clojure.lang.IPersistentVector
{"clj/vector"
(reify WriteHandler
(write [_ w o]
(write-list-like w "clj/vector" o)))}
(defn write-offset-date-time
[n w o]
(write-tag! w n 1)
(write-int! w (.toEpochMilli ^Instant (.toInstant ^OffsetDateTime o))))
;; Persistent list
clojure.lang.IPersistentList
{"clj/list"
(reify WriteHandler
(write [_ w o]
(write-list-like w "clj/list" o)))}
(defn read-instant
[rdr]
(Instant/ofEpochMilli (.readInt ^Reader rdr)))
;; Persistent seq & lazy seqs
clojure.lang.ISeq
{"clj/seq"
(reify WriteHandler
(write [_ w o]
(write-list-like w "clj/seq" o)))}
})
(defn write-ratio
[n w o]
(write-tag! w n 2)
(write-object! w (.numerator ^Ratio o))
(write-object! w (.denominator ^Ratio o)))
(defn read-ratio
[rdr]
(Ratio. (biginteger (read-object! rdr))
(biginteger (read-object! rdr))))
(def read-handlers
{"bigint"
(reify ReadHandler
(read [_ rdr _ _]
(let [^bytes bibytes (.readObject rdr)]
(bigint (BigInteger. bibytes)))))
(defn write-bigint
[n w o]
(let [^BigInteger bi (if (instance? clojure.lang.BigInt o)
(.toBigInteger ^clojure.lang.BigInt o)
o)]
(write-tag! w n 1)
(write-bytes! w (.toByteArray bi))))
"byte"
(reify ReadHandler
(read [_ rdr _ _]
(byte (.readObject rdr))))
(defn read-bigint
[rdr]
(let [^bytes bibytes (read-object! rdr)]
(bigint (BigInteger. bibytes))))
"penpot/matrix"
(reify ReadHandler
(read [_ rdr _ _]
(let [^java.util.List x (.readObject rdr)]
(Matrix. (.get x 0) (.get x 1) (.get x 2) (.get x 3) (.get x 4) (.get x 5)))))
(add-handlers!
{:name "char"
:class Character
:wfn write-char
:rfn read-char}
"penpot/point"
(reify ReadHandler
(read [_ rdr _ _]
(let [^java.util.List x (.readObject rdr)]
(Point. (.get x 0) (.get x 1)))))
{:name "java/instant"
:class Instant
:wfn write-instant
:rfn read-instant}
"char"
(reify ReadHandler
(read [_ rdr _ _]
(char (.readObject rdr))))
{:name "java/instant"
:class OffsetDateTime
:wfn write-offset-date-time
:rfn read-instant}
"java/instant"
(reify ReadHandler
(read [_ rdr _ _]
(Instant/ofEpochMilli (.readInt rdr))))
;; LEGACY
{:name "ratio"
:rfn read-ratio}
{:name "clj/ratio"
:class Ratio
:wfn write-ratio
:rfn read-ratio}
"ratio"
(reify ReadHandler
(read [_ rdr _ _]
(Ratio. (biginteger (.readObject rdr))
(biginteger (.readObject rdr)))))
{:name "clj/map"
:class clojure.lang.IPersistentMap
:wfn write-map-like
:rfn read-map-like}
"clj/keyword"
(reify ReadHandler
(read [_ rdr _ _]
(keyword (.readObject rdr) (.readObject rdr))))
{:name "clj/keyword"
:class clojure.lang.Keyword
:wfn write-named
:rfn read-keyword}
"clj/map"
(reify ReadHandler
(read [_ rdr _ _]
(read-map-like rdr)))
{:name "clj/symbol"
:class clojure.lang.Symbol
:wfn write-named
:rfn read-symbol}
"clj/set"
(reify ReadHandler
(read [_ rdr _ _]
(read-list-like rdr set)))
;; LEGACY
{:name "bigint"
:rfn read-bigint}
"clj/vector"
(reify ReadHandler
(read [_ rdr _ _]
(read-list-like rdr vec)))
{:name "clj/bigint"
:class clojure.lang.BigInt
:wfn write-bigint
:rfn read-bigint}
"clj/list"
(reify ReadHandler
(read [_ rdr _ _]
(read-list-like rdr #(apply list %))))
{:name "clj/set"
:class clojure.lang.IPersistentSet
:wfn write-list-like
:rfn (comp set read-object!)}
"clj/seq"
(reify ReadHandler
(read [_ rdr _ _]
(read-list-like rdr sequence)))
})
{:name "clj/vector"
:class clojure.lang.IPersistentVector
:wfn write-list-like
:rfn (comp vec read-object!)}
(def write-handler-lookup
(-> write-handlers
fres/associative-lookup
fres/inheritance-lookup))
{:name "clj/list"
:class clojure.lang.IPersistentList
:wfn write-list-like
:rfn #(apply list (read-object! %))}
(def read-handler-lookup
(-> read-handlers
(fres/associative-lookup)))
{:name "clj/seq"
:class clojure.lang.ISeq
:wfn write-list-like
:rfn (comp sequence read-object!)})
;; --- Low-Level Api
;; --- PUBLIC API
(defn reader
[istream]
(fres/create-reader istream :handlers read-handler-lookup))
(fres/create-reader istream :handlers *read-handler-lookup*))
(defn writer
[ostream]
(fres/create-writer ostream :handlers write-handler-lookup))
(fres/create-writer ostream :handlers *write-handler-lookup*))
(defn read!
[reader]
@@ -267,15 +286,40 @@
[writer data]
(fres/write-object writer data))
;; --- High-Level Api
(defn encode
[data]
(with-open [out (ByteArrayOutputStream.)]
(write! (writer out) data)
(.toByteArray out)))
(with-open [^ByteArrayOutputStream output (ByteArrayOutputStream.)]
(-> (writer output)
(write! data))
(.toByteArray output)))
(defn decode
[data]
(with-open [input (ByteArrayInputStream. ^bytes data)]
(read! (reader input))))
(with-open [^ByteArrayInputStream input (ByteArrayInputStream. ^bytes data)]
(-> input reader read!)))
;; --- ADDITIONAL
(add-handlers!
{:name "penpot/point"
:class app.common.geom.point.Point
:wfn (fn [n w ^Point o]
(write-tag! w n 1)
(write-list! w (List/of (.-x o) (.-y o))))
:rfn (fn [^Reader rdr]
(let [^List x (read-object! rdr)]
(Point. (.get x 0) (.get x 1))))}
{:name "penpot/matrix"
:class app.common.geom.matrix.Matrix
:wfn (fn [^String n ^Writer w o]
(write-tag! w n 1)
(write-list! w (List/of (.-a ^Matrix o)
(.-b ^Matrix o)
(.-c ^Matrix o)
(.-d ^Matrix o)
(.-e ^Matrix o)
(.-f ^Matrix o))))
:rfn (fn [^Reader rdr]
(let [^List x (read-object! rdr)]
(Matrix. (.get x 0) (.get x 1) (.get x 2) (.get x 3) (.get x 4) (.get x 5))))})
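With the registry in place, supporting an additional type is just one more add-handlers! call; a hypothetical example for a simple 2-D size record (not part of this changeset):

(defrecord Size [width height])

(add-handlers!
 {:name "penpot/size"
  :class Size
  :wfn (fn [n w ^Size o]
         ;; one logical component: a list holding both fields
         (write-tag! w n 1)
         (write-list! w (java.util.List/of (.-width o) (.-height o))))
  :rfn (fn [^Reader rdr]
         (let [^java.util.List x (read-object! rdr)]
           (->Size (.get x 0) (.get x 1))))})

;; round trip: (-> (->Size 3 4) encode decode) yields an equal Size record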

Some files were not shown because too many files have changed in this diff.