Compare commits

...

174 Commits

Author SHA1 Message Date
Maverick Studer
a441909408 Merge branch 'RE-10691-fp' into 'master'
RED-10691: 500 when approving file in a dossier where dossier owner lost manager role

Closes RE-10691

See merge request redactmanager/persistence-service!930
2025-02-04 09:37:04 +01:00
maverickstuder
a7bdbb5495 RED-10691: 500 when approving file in a dossier where dossier owner lost manager role 2025-01-31 13:35:31 +01:00
Dominique Eifländer
3c0b9d36d4 Merge branch 'OPS-284' into 'master'
OPS-284: add prometheus endpoint

Closes OPS-284

See merge request redactmanager/persistence-service!928
2025-01-24 11:01:07 +01:00
Christoph Schabert
9ff3035730 OPS-284: add prometheus endpoint 2025-01-23 13:34:26 +01:00
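A note on OPS-284: the diff for this commit is not shown in this excerpt, but in a Spring Boot service (which the diffs further below suggest this is) a Prometheus endpoint is typically exposed via Actuator plus the Micrometer Prometheus registry. The following application.yml fragment is a hedged sketch of that convention, not the actual change:

```yaml
# Assumes spring-boot-starter-actuator and micrometer-registry-prometheus
# are on the classpath; property names follow standard Spring Boot conventions.
management:
  endpoints:
    web:
      exposure:
        include: "health,info,prometheus"   # scrape target at /actuator/prometheus
  endpoint:
    prometheus:
      enabled: true
```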
Maverick Studer
c933793721 Merge branch 'RED-10731' into 'master'
RED-10731: Zip Bombing (~10.000 files in one zip within file limit) should be detected and aborted

Closes RED-10731

See merge request redactmanager/persistence-service!926
2025-01-21 12:21:11 +01:00
maverickstuder
4746d50627 RED-10731: Zip Bombing (~10.000 files in one zip within file limit) should be detected and aborted 2025-01-21 11:53:57 +01:00
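The RED-10731 fix aborts extraction of archives with an excessive number of entries. A minimal, hedged sketch of that idea (class and method names are invented, and the 10,000 limit is assumed from the ticket title; this is not the service's actual code): count entries while streaming and bail out as soon as the limit is exceeded, instead of extracting everything first.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;
import java.util.zip.ZipOutputStream;

public class ZipEntryLimitCheck {

    static final int MAX_ENTRIES = 10_000; // assumed limit, mirroring the ticket

    /** Returns true if the archive stays under the entry limit. */
    public static boolean withinEntryLimit(byte[] zipBytes) {
        int entries = 0;
        try (ZipInputStream zis = new ZipInputStream(new ByteArrayInputStream(zipBytes))) {
            while (zis.getNextEntry() != null) {
                if (++entries > MAX_ENTRIES) {
                    return false; // abort early: likely a zip bomb
                }
            }
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        return true;
    }

    /** Builds a small in-memory archive with the given number of tiny entries. */
    public static byte[] buildZip(int entryCount) {
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        try (ZipOutputStream zos = new ZipOutputStream(baos)) {
            for (int i = 0; i < entryCount; i++) {
                zos.putNextEntry(new ZipEntry("file-" + i + ".txt"));
                zos.write('x');
                zos.closeEntry();
            }
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        return baos.toByteArray();
    }
}
```

Counting entries on the stream keeps memory bounded even when the archive lies about its sizes, since no entry payloads are buffered.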
Dominique Eifländer
fdfdba550e Merge branch 'RED-10715-masterA' into 'master'
RED-10715: Remove Saas migration fix

Closes RED-10715

See merge request redactmanager/persistence-service!925
2025-01-17 10:01:29 +01:00
Dominique Eifländer
09d53b7743 RED-10715: Remove Saas migration fix 2025-01-17 09:35:07 +01:00
Maverick Studer
d691164af6 Merge branch 'RED-10728' into 'master'
RED-10728: Endpoint to execute full OCR on specific file

Closes RED-10728

See merge request redactmanager/persistence-service!922
2025-01-16 15:54:23 +01:00
maverickstuder
43f9db59d4 RED-10728: Endpoint to execute full OCR on specific file 2025-01-16 14:38:17 +01:00
Dominique Eifländer
7669d27e7b Merge branch 'RED-10715-master' into 'master'
RED-10715: Remove Saas migration

Closes RED-10715

See merge request redactmanager/persistence-service!921
2025-01-15 13:46:38 +01:00
Dominique Eifländer
d7d46d5429 RED-10715: Remove Saas migration 2025-01-15 13:29:29 +01:00
Maverick Studer
205f9c678e Merge branch 'cleanup-truncated-indices' into 'master'
Migration fixes for 4.3 -> 4.4

See merge request redactmanager/persistence-service!918
2025-01-14 10:42:32 +01:00
Maverick Studer
cf5e235fc9 Migration fixes for 4.3 -> 4.4 2025-01-14 10:42:32 +01:00
Dominique Eifländer
25eb72ee05 Merge branch 'RED-10660' into 'master'
Migration fixes & RED-10660

Closes RED-10660

See merge request redactmanager/persistence-service!916
2025-01-13 09:07:52 +01:00
Maverick Studer
770a489f7f Migration fixes & RED-10660 2025-01-13 09:07:52 +01:00
Dominique Eifländer
18c7875844 Merge branch 'RED-10709-master' into 'master'
RED-10709: MARK_RAN for failed preConditions in liquibase

Closes RED-10709

See merge request redactmanager/persistence-service!915
2025-01-09 14:08:05 +01:00
Dominique Eifländer
0838f30a24 RED-10709: MARK_RAN for failed preConditions in liquibase 2025-01-09 13:53:41 +01:00
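For context on RED-10709: MARK_RAN is Liquibase's onFail/onError option that records a changeset as already run, without executing it, when its precondition fails. A hedged illustration of a changelog fragment using it (changeset id, author, table, and index names are invented, not taken from this repository):

```yaml
databaseChangeLog:
  - changeSet:
      id: create-file-status-index   # hypothetical changeset id
      author: persistence-service
      preConditions:
        - onFail: MARK_RAN           # precondition fails -> mark as ran, skip changes
        - not:
            - indexExists:
                tableName: file
                indexName: idx_file_status
      changes:
        - createIndex:
            tableName: file
            indexName: idx_file_status
            columns:
              - column:
                  name: status
```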
Kilian Schüttler
15e7417766 Merge branch 'hotfix-index-fp' into 'master'
hotfix: fix index preconditions

See merge request redactmanager/persistence-service!910
2025-01-09 11:31:28 +01:00
Kilian Schuettler
ab112e85c1 hotfix: fix index preconditions 2025-01-07 14:43:21 +01:00
Maverick Studer
2e3833f55d Merge branch 'RED-10669' into 'master'
RED-10669: Unassigned user can alter component values inside file viewer

Closes RED-10669

See merge request redactmanager/persistence-service!909
2025-01-07 14:23:04 +01:00
Maverick Studer
16364671d0 RED-10669: Unassigned user can alter component values inside file viewer 2025-01-07 14:23:03 +01:00
Kevin Tumma
8054806cd3 Merge branch 'hotfix-index-fp' into 'master'
hotfix: precondition check for index creation

See merge request redactmanager/persistence-service!908
2024-12-19 11:52:17 +01:00
Kilian Schuettler
d4396fe6d5 hotfix: precondition check for index creation 2024-12-19 11:31:10 +01:00
Maverick Studer
6d0354946a Merge branch 'RED-10681' into 'master'
RED-10681: Improve tracing to include metadata ids from RequestBody

Closes RED-10681

See merge request redactmanager/persistence-service!905
2024-12-13 14:50:09 +01:00
Maverick Studer
00b4ad800d RED-10681: Improve tracing to include metadata ids from RequestBody 2024-12-13 14:50:09 +01:00
Maverick Studer
34eb724170 Merge branch 'feature/RED-9998' into 'master'
RED-9998: App version history (for conditional re-analyzing the layout of a file)

Closes RED-9998

See merge request redactmanager/persistence-service!901
2024-12-12 09:58:42 +01:00
Maverick Studer
1919b1b306 RED-9998: App version history (for conditional re-analyzing the layout of a file) 2024-12-12 09:58:42 +01:00
Corina Olariu
7dae6d81c5 Merge branch 'RED-10628-fp' into 'master'
RED-10628 - Cloning dossier template after removing and editing component...

Closes RED-10628

See merge request redactmanager/persistence-service!902
2024-12-11 09:58:40 +01:00
corinaolariu
2b27e39234 RED-10628 - Cloning dossier template after removing and editing component definitions causes chain of issues
- when cloning a dossier template clone only the component definitions which are not soft deleted
- unit test added
2024-12-10 20:06:04 +02:00
Yannik Hampe
9718f8d3fd Merge branch 'feature/red-9393' into 'master'
RED-9393 user stats controller

See merge request redactmanager/persistence-service!880
2024-12-06 16:45:16 +01:00
yhampe
1a5ae41001 Merge remote-tracking branch 'origin/feature/red-9393' into feature/red-9393 2024-12-06 10:04:35 +01:00
yhampe
8ed5f3388b RED-9393 user stats controller 2024-12-06 10:04:20 +01:00
yhampe
77483b6bd0 RED-9393 user stats controller 2024-12-06 10:04:19 +01:00
yhampe
c32c2cdab0 RED-9393 user stats controller
removed filter for soft deleted files
2024-12-06 10:04:19 +01:00
yhampe
fbb8a7b519 RED-9393 user stats controller
added filter for soft deleted dossiers
2024-12-06 10:04:02 +01:00
yhampe
61e557712b RED-9393 user stats controller 2024-12-06 10:04:00 +01:00
yhampe
c28076df68 RED-9393 user stats controller 2024-12-06 10:03:59 +01:00
yhampe
032f6f87c9 RED-9393 user stats controller
removed filter for soft deleted files
2024-12-06 10:03:59 +01:00
yhampe
cfac2bcc6b RED-9393 user stats controller
added filter for soft deleted dossiers
2024-12-06 10:03:59 +01:00
yhampe
c420bda820 RED-9393 user stats controller
added filter for hard deleted dossiers
2024-12-06 10:03:59 +01:00
yhampe
9fc3aef669 RED-9393 user stats controller
added filter for hard deleted dossiers
2024-12-06 10:03:59 +01:00
yhampe
f55ebd9ecc RED-9393 user stats controller
added filter for soft deleted dossiers
2024-12-06 10:03:57 +01:00
yhampe
a3a1ee67fc RED-9393 user stats controller 2024-12-06 10:03:48 +01:00
yhampe
945e402639 RED-9393 user stats controller 2024-12-06 10:03:48 +01:00
yhampe
3624a6e49a RED-9393 user stats controller
removed filter for soft deleted files
2024-12-06 10:03:48 +01:00
yhampe
850e85ffdb RED-9393 user stats controller
added filter for soft deleted dossiers
2024-12-06 10:03:48 +01:00
yhampe
6b885f3212 RED-9393 user stats controller 2024-12-06 10:03:48 +01:00
yhampe
87ba79905c RED-9393 user stats controller 2024-12-06 10:03:48 +01:00
yhampe
ae8aecc005 RED-9393 user stats controller
removed filter for soft deleted files
2024-12-06 10:03:47 +01:00
yhampe
45c0c3d902 RED-9393 user stats controller
added filter for soft deleted dossiers
2024-12-06 10:03:47 +01:00
yhampe
3610e6c76f RED-9393 user stats controller
added filter for hard deleted dossiers
2024-12-06 10:03:47 +01:00
yhampe
7d07b1c882 RED-9393 user stats controller
added filter for hard deleted dossiers
2024-12-06 10:03:47 +01:00
yhampe
abfc9eed95 RED-9393 user stats controller
added filter for soft deleted dossiers
2024-12-06 10:03:40 +01:00
yhampe
d3e0d8f52c Merge remote-tracking branch 'origin/feature/red-9393' into feature/red-9393
# Conflicts:
#	persistence-service-v1/persistence-service-processor-v1/src/main/java/com/iqser/red/service/persistence/management/v1/processor/roles/ApplicationRoles.java
2024-12-06 09:34:27 +01:00
yhampe
1967468fec RED-9393 user stats controller 2024-12-06 09:33:39 +01:00
yhampe
569c24924a RED-9393 user stats controller 2024-12-06 09:33:39 +01:00
yhampe
091a648a82 RED-9393 user stats controller
removed filter for soft deleted files
2024-12-06 09:33:39 +01:00
yhampe
7a087764c1 RED-9393 user stats controller
added action roles
2024-12-06 09:33:01 +01:00
yhampe
5e60074d2f RED-9393 user stats controller
added filter for soft deleted dossiers
2024-12-06 09:32:50 +01:00
yhampe
635925b3fc RED-9393 user stats controller 2024-12-06 09:32:48 +01:00
yhampe
fad8fb3af2 RED-9393 user stats controller 2024-12-06 09:32:48 +01:00
yhampe
b3b547914b RED-9393 user stats controller
removed filter for soft deleted files
2024-12-06 09:32:48 +01:00
yhampe
556e6a4f6b RED-9393 user stats controller
added filter for soft deleted dossiers
2024-12-06 09:32:48 +01:00
yhampe
14143c9356 RED-9393 user stats controller
added filter for hard deleted dossiers
2024-12-06 09:32:48 +01:00
yhampe
34680a3972 RED-9393 user stats controller
added filter for hard deleted dossiers
2024-12-06 09:32:48 +01:00
yhampe
6068c39c33 RED-9393 user stats controller
added filter for soft deleted dossiers
2024-12-06 09:32:46 +01:00
yhampe
e0717e3466 RED-9393 user stats controller 2024-12-06 09:32:39 +01:00
yhampe
bde5c88471 RED-9393 user stats controller 2024-12-06 09:32:38 +01:00
yhampe
84c1d037c6 RED-9393 user stats controller 2024-12-06 09:32:38 +01:00
yhampe
dba47d0f0f RED-9393 user stats controller 2024-12-06 09:32:38 +01:00
yhampe
298ccff842 RED-9393 user stats controller
removed filter for soft deleted files
2024-12-06 09:32:38 +01:00
yhampe
ab7f6a1470 RED-9393 user stats controller
added filter for soft deleted dossiers
2024-12-06 09:32:38 +01:00
yhampe
bc1b6b9e6d RED-9393 user stats controller 2024-12-06 09:32:38 +01:00
yhampe
5425e06399 RED-9393 user stats controller 2024-12-06 09:32:38 +01:00
yhampe
5d0b26aca6 RED-9393 user stats controller
added filter for hard deleted files
2024-12-06 09:32:38 +01:00
yhampe
1d595eb1f0 RED-9393 user stats controller
removed filter for soft deleted files
2024-12-06 09:32:38 +01:00
yhampe
a7db55cb13 RED-9393 user stats controller
added filter for soft deleted dossiers
2024-12-06 09:32:38 +01:00
yhampe
d2fdba5658 RED-9393 user stats controller
added filter for hard deleted dossiers
2024-12-06 09:32:38 +01:00
yhampe
d1883fc5b6 RED-9393 user stats controller
added filter for hard deleted dossiers
2024-12-06 09:32:38 +01:00
yhampe
a875f94ca4 RED-9393 user stats controller
added authority check
2024-12-06 09:32:38 +01:00
yhampe
df65dac4cb RED-9393 user stats controller
added action roles
2024-12-06 09:32:38 +01:00
yhampe
9667144c9b RED-9393 user stats controller
added filter for soft deleted dossiers
2024-12-06 09:32:38 +01:00
yhampe
8548dcaf66 RED-9393 user stats controller
added filter for hard deleted dossiers
2024-12-06 09:32:38 +01:00
yhampe
52c5dbea1f RED-9393 user stats controller
added filter for hard deleted dossiers
2024-12-06 09:32:38 +01:00
Corina Olariu
65ccf32346 Merge branch 'feature/RED-10342' into 'master'
RED-10342 - File attributes in CSV export

Closes RED-10342

See merge request redactmanager/persistence-service!893
2024-12-06 09:29:56 +01:00
corinaolariu
fac80bbc5c Merge branch 'master' into feature/RED-10342
# Conflicts:
#	persistence-service-v1/persistence-service-processor-v1/src/main/java/com/iqser/red/service/persistence/management/v1/processor/service/persistence/FileAttributeConfigPersistenceService.java
#	persistence-service-v1/persistence-service-server-v1/src/test/java/com/iqser/red/service/peristence/v1/server/integration/tests/FileAttributeTest.java
2024-12-06 10:04:35 +02:00
Maverick Studer
ad20597434 Merge branch 'RED-5997' into 'master'
RED-5997: dossierTemplateId property null in fileAttributes- & dossierAttributes-config

Closes RED-5997

See merge request redactmanager/persistence-service!900
2024-12-06 08:10:42 +01:00
maverickstuder
66809bb136 RED-5997: dossierTemplateId property null in fileAttributes- & dossierAttributes-config 2024-12-05 15:40:50 +01:00
Maverick Studer
c7a74fed78 Merge branch 'RED-9056' into 'master'
RED-9056: Change flag name and decline requests except dossierDictionaryOnly is true

Closes RED-9056

See merge request redactmanager/persistence-service!899
2024-12-05 15:03:45 +01:00
Maverick Studer
8a5e97b9ce RED-9056: Change flag name and decline requests except dossierDictionaryOnly is true 2024-12-05 15:03:45 +01:00
Maverick Studer
08671583fe Merge branch 'RED-10615' into 'master'
RED-10615: Full Analysis loop in DM

Closes RED-10615

See merge request redactmanager/persistence-service!898
2024-12-05 11:54:56 +01:00
Corina Olariu
e7f3b47bb6 Merge branch 'feature/RED-10543' into 'master'
RED-10343 - User should be able to set own placeholder value

Closes RED-10543

See merge request redactmanager/persistence-service!897
2024-12-05 11:51:12 +01:00
maverickstuder
00b47d21dc RED-10615: Full Analysis loop in DM 2024-12-05 11:35:15 +01:00
corinaolariu
ef86d1909d RED-10342 - File attributes in CSV export
- update the dv changelog
2024-12-05 12:15:20 +02:00
corinaolariu
793444b96e RED-10342 - File attributes in CSV export
- renumber the yaml script
2024-12-05 11:30:52 +02:00
corinaolariu
11d97683d0 RED-10342 - File attributes in CSV export
- merge master
2024-12-05 10:59:58 +02:00
corinaolariu
e3eff19de4 Merge branch 'master' into feature/RED-10342
# Conflicts:
#	persistence-service-v1/persistence-service-processor-v1/src/main/resources/db/changelog/db.changelog-tenant.yaml
2024-12-05 10:58:10 +02:00
corinaolariu
c65a62e9ac RED-10343 - User should be able to set own placeholder value
- remove old function
2024-12-05 10:31:37 +02:00
yhampe
867f0a9c02 Merge remote-tracking branch 'origin/feature/red-9393' into feature/red-9393 2024-12-05 09:17:49 +01:00
yhampe
d55a72e57a RED-9393 user stats controller 2024-12-05 09:17:39 +01:00
yhampe
f3fc2e2ce2 RED-9393 user stats controller 2024-12-05 09:17:39 +01:00
yhampe
3507130e64 RED-9393 user stats controller 2024-12-05 09:17:39 +01:00
yhampe
6a5792adf6 RED-9393 user stats controller 2024-12-05 09:17:39 +01:00
yhampe
d0c79c87cf RED-9393 user stats controller
removed filter for soft deleted files
2024-12-05 09:17:39 +01:00
yhampe
c95a5f027c RED-9393 user stats controller
added filter for soft deleted dossiers
2024-12-05 09:17:05 +01:00
yhampe
4f289c359f RED-9393 user stats controller 2024-12-05 09:16:48 +01:00
yhampe
4efc1b897a RED-9393 user stats controller 2024-12-05 09:16:48 +01:00
yhampe
0584172bbd RED-9393 user stats controller
added filter for hard deleted files
2024-12-05 09:16:48 +01:00
yhampe
295839c048 RED-9393 user stats controller
removed filter for soft deleted files
2024-12-05 09:16:48 +01:00
yhampe
5767e3465e RED-9393 user stats controller
added filter for soft deleted dossiers
2024-12-05 09:16:48 +01:00
yhampe
6f6095990f RED-9393 user stats controller
added filter for hard deleted dossiers
2024-12-05 09:16:48 +01:00
yhampe
f3032becf4 RED-9393 user stats controller
added filter for hard deleted dossiers
2024-12-05 09:16:48 +01:00
yhampe
96a8575cb6 RED-9393 user stats controller
added authority check
2024-12-05 09:16:48 +01:00
yhampe
86ff048c6b RED-9393 user stats controller
added action roles
2024-12-05 09:16:48 +01:00
yhampe
0d3e3051ab RED-9393 user stats controller
added filter for soft deleted dossiers
2024-12-05 09:16:48 +01:00
yhampe
0d7b57dd6a RED-9393 user stats controller
added filter for hard deleted dossiers
2024-12-05 09:16:48 +01:00
yhampe
15f05624ca RED-9393 user stats controller
added filter for hard deleted dossiers
2024-12-05 09:16:48 +01:00
corinaolariu
47052745c7 Merge branch 'master' into feature/RED-10543
# Conflicts:
#	persistence-service-v1/persistence-service-processor-v1/src/main/java/com/iqser/red/service/persistence/management/v1/processor/service/persistence/DossierAttributeConfigPersistenceService.java
2024-12-05 09:39:29 +02:00
corinaolariu
83668117da RED-10342 - File attributes in CSV export
- add includeToCsvExport field to FileAttributeDefinition
2024-12-05 09:27:49 +02:00
Dominique Eifländer
63c72bc613 Merge branch 'RED-9844' into 'master'
RED-9844: Added endpoint for get components with fileIds

Closes RED-9844

See merge request redactmanager/persistence-service!896
2024-12-04 12:06:12 +01:00
Dominique Eifländer
4453eab3bf RED-9844: Added endpoint for get components with fileIds 2024-12-04 11:45:20 +01:00
Maverick Studer
e700f8b785 Merge branch 'feature/RED-10135' into 'master'
RED-10135: Dossier attributes have fields displayed_in_file_list and filterable

Closes RED-10135

See merge request redactmanager/persistence-service!785
2024-12-03 09:30:25 +01:00
Maverick Studer
4adebda2ab RED-10135: Dossier attributes have fields displayed_in_file_list and filterable 2024-12-03 09:30:25 +01:00
Dominique Eifländer
bdaac65afe Merge branch 'RED-10526' into 'master'
RED-10526: Set liquibase to 4.29.2 as 4.30.0 is 3 times slower

Closes RED-10526

See merge request redactmanager/persistence-service!895
2024-12-02 10:07:34 +01:00
Dominique Eifländer
9956348a06 RED-10526: Set liquibase to 4.29.2 as 4.30.0 is 3 times slower 2024-12-02 09:48:57 +01:00
corinaolariu
0e08794271 RED-10343 - User should be able to set own placeholder value
- when placeholder is set, the placeholder is checked to be unique (conflict otherwise) and then used
- when placeholder is not set, the current behaviour is in place(with generated placeholder)
- update unit tests
2024-11-28 21:30:15 +02:00
yhampe
10f69631b0 RED-9393 user stats controller 2024-11-28 20:16:28 +01:00
Maverick Studer
d3f0d1bc87 Merge branch 'feature/RED-10347' into 'master'
RED-10347: Last download time field for approved files

Closes RED-10347

See merge request redactmanager/persistence-service!823
2024-11-28 10:02:59 +01:00
Maverick Studer
f2c41d5191 RED-10347: Last download time field for approved files 2024-11-28 10:02:58 +01:00
yhampe
28c97b446c RED-9393 user stats controller 2024-11-28 08:48:07 +01:00
corinaolariu
b461f95638 RED-10342 - File attributes in CSV export
- add includeToCsvExport field to FileAttributeConfig
- update unit test
2024-11-27 15:19:26 +02:00
yhampe
969004b542 Merge remote-tracking branch 'origin/feature/red-9393' into feature/red-9393 2024-11-27 12:12:45 +01:00
yhampe
ce27ac8d17 RED-9393 user stats controller 2024-11-27 12:12:21 +01:00
yhampe
b04bad6057 RED-9393 user stats controller 2024-11-27 12:12:21 +01:00
yhampe
5f98b16bc1 RED-9393 user stats controller
added filter for hard deleted files
2024-11-27 12:12:21 +01:00
yhampe
71f4a78a16 RED-9393 user stats controller
removed filter for soft deleted files
2024-11-27 12:12:21 +01:00
yhampe
f9a5b5aa01 RED-9393 user stats controller
added filter for soft deleted dossiers
2024-11-27 12:12:20 +01:00
yhampe
efbfd26363 RED-9393 user stats controller
added filter for hard deleted dossiers
2024-11-27 12:11:56 +01:00
yhampe
46cab2786a RED-9393 user stats controller
added filter for hard deleted dossiers
2024-11-27 12:11:51 +01:00
yhampe
17b90b1b67 RED-9393 user stats controller
added authority check
2024-11-27 12:11:16 +01:00
yhampe
59933f4a88 RED-9393 user stats controller
added action roles
2024-11-27 12:11:16 +01:00
yhampe
d36cf3c7f2 RED-9393 user stats controller
added filter for soft deleted dossiers
2024-11-27 12:11:15 +01:00
yhampe
861f1e559f RED-9393 user stats controller
added filter for hard deleted dossiers
2024-11-27 12:11:15 +01:00
yhampe
9b74db96ba RED-9393 user stats controller
added filter for hard deleted dossiers
2024-11-27 12:11:15 +01:00
yhampe
684dc3418d RED-9393 user stats controller 2024-11-27 12:10:43 +01:00
yhampe
b3bc7bb0ac RED-9393 user stats controller 2024-11-27 11:40:10 +01:00
yhampe
03d4f04b15 RED-9393 user stats controller
added filter for hard deleted files
2024-11-26 15:22:50 +01:00
Dominique Eifländer
95f1ea4a00 Merge branch 'RED-10526' into 'master'
Resolve RED-10526

Closes RED-10526

See merge request redactmanager/persistence-service!892
2024-11-26 15:21:10 +01:00
yhampe
5bbcfdffc0 RED-9393 user stats controller
removed filter for soft deleted files
2024-11-26 15:19:37 +01:00
Dominique Eifländer
5409b432d6 RED-10526: Upgrade liquibase to 4.30.0 2024-11-26 14:32:06 +01:00
Dominique Eifländer
292c92c827 RED-10526: Upgrade liquibase to 4.30.0 2024-11-26 10:51:01 +01:00
Maverick Studer
9ce067bd80 Merge branch 'RED-10529' into 'master'
RED-10529: Primary attribute removed after changing encoding type

Closes RED-10529

See merge request redactmanager/persistence-service!890
2024-11-26 09:41:43 +01:00
maverickstuder
5210b8ec40 RED-10529: Primary attribute removed after changing encoding type 2024-11-25 17:23:35 +01:00
Maverick Studer
15ca9ade53 Merge branch 'feature/RED-10514' into 'master'
RED-10514: Different issues with specific download endpoints (incl. customer api endpoint)

Closes RED-10514

See merge request redactmanager/persistence-service!889
2024-11-25 10:21:15 +01:00
Maverick Studer
8b1f63bf45 RED-10514: Different issues with specific download endpoints (incl. customer api endpoint) 2024-11-25 10:21:15 +01:00
Corina Olariu
c9b8be9405 Merge branch 'RED-10443' into 'master'
RED-10443 - 500 Error occurs when selecting ISO-8859-1 as Encoding Type for any CSV File Format

Closes RED-10443

See merge request redactmanager/persistence-service!887
2024-11-22 11:49:31 +01:00
Maverick Studer
af234311e6 Merge branch 'RED-10518' into 'master'
RED-10518: System-managed entity has not defined rank after import

Closes RED-10518

See merge request redactmanager/persistence-service!886
2024-11-22 10:52:24 +01:00
corinaolariu
e5ea667ea1 RED-10443 - 500 Error occurs when selecting ISO-8859-1 as Encoding Type for any CSV File Format
- accept only ISO-8859-1 as encoding. A meaningful message (400) is returned in case of bad encoding
- update unit tests and add unit test
2024-11-22 11:32:18 +02:00
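The RED-10443 fix validates the requested CSV encoding up front and turns a bad value into a client error instead of a 500 deep in the export path. A hedged sketch of that idea (class and method names are invented, not the service's actual code):

```java
import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;

public class CsvEncodingValidator {

    /** Resolves the encoding, accepting only ISO-8859-1 (case-insensitive). */
    public static Charset resolve(String encoding) {
        if (encoding == null || !StandardCharsets.ISO_8859_1.name().equalsIgnoreCase(encoding)) {
            // In a Spring controller this exception would surface as HTTP 400.
            throw new IllegalArgumentException("Unsupported CSV encoding: " + encoding);
        }
        return StandardCharsets.ISO_8859_1;
    }

    /** Convenience check that swallows the validation exception. */
    public static boolean isSupported(String encoding) {
        try {
            resolve(encoding);
            return true;
        } catch (IllegalArgumentException e) {
            return false;
        }
    }
}
```

Validating at the controller boundary keeps charset errors out of the export pipeline, where they previously surfaced as unhandled 500s.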
maverickstuder
7ff7222072 RED-10518: System-managed entity has not defined rank after import 2024-11-22 10:22:30 +01:00
yhampe
d8b1a32783 Merge remote-tracking branch 'origin/feature/red-9393' into feature/red-9393 2024-11-22 08:44:38 +01:00
yhampe
3315a679a8 RED-9393 user stats controller
added authority check
2024-11-22 08:44:24 +01:00
yhampe
013d61b0d0 RED-9393 user stats controller
added action roles
2024-11-22 08:44:24 +01:00
yhampe
afe793a523 RED-9393 user stats controller
added filter for soft deleted dossiers
2024-11-22 08:44:24 +01:00
yhampe
23078c0b66 RED-9393 user stats controller
added filter for hard deleted dossiers
2024-11-22 08:44:00 +01:00
yhampe
c1fafaee6e RED-9393 user stats controller
added filter for hard deleted dossiers
2024-11-22 08:44:00 +01:00
yhampe
fa0e29095f RED-9393 user stats controller
added authority check
2024-11-22 08:43:29 +01:00
yhampe
c7a9c2ff11 RED-9393 user stats controller
added action roles
2024-11-22 08:38:51 +01:00
yhampe
072c965593 RED-9393 user stats controller
added filter for soft deleted dossiers
2024-11-21 09:00:30 +01:00
yhampe
8e40b77c70 Merge remote-tracking branch 'origin/feature/red-9393' into feature/red-9393 2024-11-20 11:31:06 +01:00
yhampe
4a32f55b61 RED-9393 user stats controller
added filter for hard deleted dossiers
2024-11-20 11:31:00 +01:00
yhampe
12136e0fdc RED-9393 user stats controller
added filter for hard deleted dossiers
2024-11-20 11:31:00 +01:00
yhampe
8c7e64ffad RED-9393 user stats controller
added filter for hard deleted dossiers
2024-11-20 08:57:49 +01:00
yhampe
edd6b87566 RED-9393 user stats controller
added filter for hard deleted dossiers
2024-11-20 08:57:30 +01:00
110 changed files with 1912 additions and 1428 deletions

View File

@@ -10,6 +10,7 @@ val redactionServiceVersion by rootProject.extra { "4.290.0" }
 val pdftronRedactionServiceVersion by rootProject.extra { "4.90.0-RED10115.0" }
 val redactionReportServiceVersion by rootProject.extra { "4.81.0" }
 val searchServiceVersion by rootProject.extra { "2.90.0" }
+val documentVersion by rootProject.extra { "4.433.0" }
 repositories {
 mavenLocal()

View File

@@ -357,9 +357,9 @@ public class DictionaryController implements DictionaryResource {
 public void changeFlags(@PathVariable(TYPE_PARAMETER_NAME) String type,
 @PathVariable(DOSSIER_TEMPLATE_PARAMETER_NAME) String dossierTemplateId,
 @PathVariable(value = DOSSIER_ID_PARAMETER_NAME) String dossierId,
-@RequestParam(value = "addToDictionary") boolean addToDictionary) {
+@RequestParam(value = "addToDictionaryAction") boolean addToDictionaryAction) {
-dictionaryService.changeAddToDictionary(type, dossierTemplateId, dossierId, addToDictionary);
+dictionaryService.changeAddToDictionary(type, dossierTemplateId, dossierId, addToDictionaryAction);
 }

View File

@@ -17,6 +17,7 @@ import org.springframework.web.bind.annotation.RequestParam;
 import org.springframework.web.bind.annotation.RestController;
 import com.iqser.red.service.persistence.management.v1.processor.service.persistence.DossierPersistenceService;
+import com.iqser.red.service.persistence.management.v1.processor.utils.DossierAttributeConfigMapper;
 import com.knecon.fforesight.keycloakcommons.security.KeycloakSecurity;
 import com.iqser.red.service.persistence.management.v1.processor.entity.dossier.DossierAttributeConfigEntity;
 import com.iqser.red.service.persistence.management.v1.processor.service.AccessControlService;
@@ -52,7 +53,7 @@ public class DossierAttributesController implements DossierAttributesResource {
 var result = MagicConverter.convert(dossierAttributeConfigPersistenceService.setDossierAttributesConfig(dossierTemplateId,
 MagicConverter.convert(dossierAttributesConfig.getDossierAttributeConfigs(),
 DossierAttributeConfigEntity.class)),
-DossierAttributeConfig.class);
+DossierAttributeConfig.class, new DossierAttributeConfigMapper());
 auditPersistenceService.insertRecord(AuditRequest.builder()
 .userId(KeycloakSecurity.getUserId())
 .objectId(dossierTemplateId)
@@ -72,7 +73,7 @@ public class DossierAttributesController implements DossierAttributesResource {
 var result = MagicConverter.convert(dossierAttributeConfigPersistenceService.addOrUpdateDossierAttribute(dossierTemplateId,
 MagicConverter.convert(dossierAttribute,
 DossierAttributeConfigEntity.class)),
-DossierAttributeConfig.class);
+DossierAttributeConfig.class, new DossierAttributeConfigMapper());
 auditPersistenceService.insertRecord(AuditRequest.builder()
 .userId(KeycloakSecurity.getUserId())
 .objectId(dossierTemplateId)
@@ -118,7 +119,7 @@ public class DossierAttributesController implements DossierAttributesResource {
 public DossierAttributesConfig getDossierAttributesConfig(@PathVariable(DOSSIER_TEMPLATE_ID) String dossierTemplateId) {
 var result = dossierAttributeConfigPersistenceService.getDossierAttributes(dossierTemplateId);
-return new DossierAttributesConfig(MagicConverter.convert(result, DossierAttributeConfig.class));
+return new DossierAttributesConfig(MagicConverter.convert(result, DossierAttributeConfig.class, new DossierAttributeConfigMapper()));
 }

View File

@@ -28,6 +28,7 @@ import com.iqser.red.service.persistence.management.v1.processor.service.AccessC
 import com.iqser.red.service.persistence.management.v1.processor.service.DossierManagementService;
 import com.iqser.red.service.persistence.management.v1.processor.service.DownloadService;
 import com.iqser.red.service.persistence.management.v1.processor.service.FileManagementStorageService;
+import com.iqser.red.service.persistence.management.v1.processor.service.FileStatusManagementService;
 import com.iqser.red.service.persistence.management.v1.processor.service.FileStatusService;
 import com.iqser.red.service.persistence.management.v1.processor.service.persistence.AuditPersistenceService;
 import com.iqser.red.service.persistence.management.v1.processor.utils.StringEncodingUtils;
@@ -70,6 +71,7 @@ public class DownloadController implements DownloadResource {
 private final OneTimeTokenService oneTimeTokenDownloadService;
 private final AccessControlService accessControlService;
 private final FileManagementStorageService fileManagementStorageService;
+private final FileStatusManagementService fileStatusManagementService;
 private final String REPORT_INFO = "/REPORT_INFO.json";
@@ -136,7 +138,7 @@ public class DownloadController implements DownloadResource {
 if (StringUtils.isBlank(dossierId)) {
 throw new BadRequestException("Empty dossier id");
 }
-dossierService.getDossierById(dossierId, true, true);
+dossierService.getDossierById(dossierId, true, false);
 accessControlService.verifyUserIsDossierOwnerOrApprover(dossierId);
 }
@@ -146,11 +148,14 @@
 List<FileModel> validFiles = fileStatusService.getDossierStatus(request.getDossierId());
 var fileIds = request.getFileIds();
 if (fileIds != null && !fileIds.isEmpty()) { // validate the ids provided
+if (fileIds.size() == 1) {
+fileStatusManagementService.getFileStatus(fileIds.get(0), false);
+}
 validFiles = validFiles.stream()
 .filter(f -> fileIds.contains(f.getId()))
 .collect(Collectors.toList());
 if (validFiles.isEmpty()) {
-throw new NotFoundException("No file id provided is found");
+throw new NotFoundException("No provided file id was found");
 }
 } // otherwise consider the files from dossier

View File

@@ -15,6 +15,7 @@ import org.springframework.web.bind.annotation.RestController;
 import com.iqser.red.service.persistence.management.v1.processor.model.websocket.FileEventType;
 import com.iqser.red.service.persistence.management.v1.processor.service.websocket.WebsocketService;
+import com.iqser.red.service.persistence.management.v1.processor.utils.FileAttributeConfigMapper;
 import com.knecon.fforesight.keycloakcommons.security.KeycloakSecurity;
 import com.iqser.red.service.persistence.management.v1.processor.entity.configuration.FileAttributesGeneralConfigurationEntity;
 import com.iqser.red.service.persistence.management.v1.processor.entity.dossier.FileAttributeConfigEntity;
@@ -61,9 +62,9 @@ public class FileAttributesController implements FileAttributesResource {
 }
 fileAttributeConfigPersistenceService.setFileAttributesGeneralConfig(dossierTemplateId,
 MagicConverter.convert(fileAttributesConfig, FileAttributesGeneralConfigurationEntity.class));
-var result = fileAttributeConfigPersistenceService.setFileAttributesConfig(dossierTemplateId,
-MagicConverter.convert(fileAttributesConfig.getFileAttributeConfigs(),
-FileAttributeConfigEntity.class));
+fileAttributeConfigPersistenceService.setFileAttributesConfig(dossierTemplateId,
+MagicConverter.convert(fileAttributesConfig.getFileAttributeConfigs(), FileAttributeConfigEntity.class));
 auditPersistenceService.audit(AuditRequest.builder()
 .userId(KeycloakSecurity.getUserId())
 .objectId(dossierTemplateId)
@@ -74,7 +75,9 @@
 .filenameMappingColumnHeaderName(fileAttributesConfig.getFilenameMappingColumnHeaderName())
 .delimiter(fileAttributesConfig.getDelimiter())
 .encoding(fileAttributesConfig.getEncoding())
-.fileAttributeConfigs(MagicConverter.convert(result, FileAttributeConfig.class))
+.fileAttributeConfigs(MagicConverter.convert(fileAttributeConfigPersistenceService.getFileAttributes(dossierTemplateId),
+FileAttributeConfig.class,
+new FileAttributeConfigMapper()))
 .build();
 }
@@ -96,7 +99,7 @@
 dossierTemplateId))
 .build());
-return MagicConverter.convert(result, FileAttributeConfig.class);
+return MagicConverter.convert(result, FileAttributeConfig.class, new FileAttributeConfigMapper());
 }
@@ -145,7 +148,7 @@
 .filenameMappingColumnHeaderName(generalConfig.getFilenameMappingColumnHeaderName())
 .delimiter(generalConfig.getDelimiter())
 .encoding(generalConfig.getEncoding())
-.fileAttributeConfigs(MagicConverter.convert(fileAttributeConfigs, FileAttributeConfig.class))
+.fileAttributeConfigs(MagicConverter.convert(fileAttributeConfigs, FileAttributeConfig.class, new FileAttributeConfigMapper()))
 .build();
 }

View File

@@ -232,7 +232,7 @@ public class FileManagementController implements FileManagementResource {
 public void restoreFiles(@PathVariable(DOSSIER_ID) String dossierId, @RequestBody Set<String> fileIds) {
 accessControlService.checkAccessPermissionsToDossier(dossierId);
-verifyUserIsDossierOwnerOrApproverOrAssignedReviewer(dossierId, fileIds);
+accessControlService.verifyUserIsDossierOwnerOrApproverOrAssignedReviewer(dossierId, fileIds);
 fileService.undeleteFiles(dossierId, fileIds);
 auditPersistenceService.audit(AuditRequest.builder()
 .userId(KeycloakSecurity.getUserId())
@@ -289,20 +289,4 @@ public class FileManagementController implements FileManagementResource {
}
}
private void verifyUserIsDossierOwnerOrApproverOrAssignedReviewer(String dossierId, Set<String> fileIds) {
try {
accessControlService.verifyUserIsDossierOwnerOrApprover(dossierId);
} catch (AccessDeniedException e1) {
try {
for (String fileId : fileIds) {
accessControlService.verifyUserIsReviewer(dossierId, fileId);
}
} catch (NotAllowedException e2) {
throw new NotAllowedException("User must be dossier owner, approver or assigned reviewer of the file.");
}
}
}
}
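The helper removed above (now moved into `AccessControlService`) captures a common fallback-authorization pattern: try the broad owner-or-approver check first, and only on denial fall back to a per-file reviewer check. A minimal, self-contained sketch of that control flow, assuming stand-in exception types and an illustrative `AccessChecks` interface (none of these names are the project's actual API):

```java
import java.util.Set;

class FallbackAuthSketch {

    // Illustrative stand-ins for the project's exception types.
    static class AccessDeniedException extends RuntimeException {}

    static class NotAllowedException extends RuntimeException {
        NotAllowedException(String msg) { super(msg); }
    }

    // Hypothetical slice of the access-control service used by the controller.
    interface AccessChecks {
        void verifyOwnerOrApprover(String dossierId);          // throws AccessDeniedException
        void verifyReviewer(String dossierId, String fileId);  // throws NotAllowedException
    }

    // Broad check first; if denied, every file must pass the reviewer check.
    static void verifyOwnerOrApproverOrAssignedReviewer(AccessChecks acl, String dossierId, Set<String> fileIds) {
        try {
            acl.verifyOwnerOrApprover(dossierId);
        } catch (AccessDeniedException denied) {
            try {
                for (String fileId : fileIds) {
                    acl.verifyReviewer(dossierId, fileId);
                }
            } catch (NotAllowedException notReviewer) {
                throw new NotAllowedException("User must be dossier owner, approver or assigned reviewer of the file.");
            }
        }
    }
}
```

The swallowed `AccessDeniedException` is intentional: denial of the broad role is not an error as long as the narrower per-file role holds.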

View File

@@ -1,111 +0,0 @@
package com.iqser.red.persistence.service.v1.external.api.impl.controller;
import com.iqser.red.service.persistence.management.v1.processor.entity.migration.SaasMigrationStatusEntity;
import com.iqser.red.service.persistence.management.v1.processor.exception.BadRequestException;
import com.iqser.red.service.persistence.management.v1.processor.exception.NotFoundException;
import com.iqser.red.service.persistence.management.v1.processor.migration.SaasMigrationService;
import com.iqser.red.service.persistence.management.v1.processor.service.FileStatusService;
import com.iqser.red.service.persistence.management.v1.processor.service.persistence.SaasMigrationStatusPersistenceService;
import com.iqser.red.service.persistence.service.v1.api.external.resource.MigrationStatusResource;
import com.iqser.red.service.persistence.service.v1.api.shared.model.dossiertemplate.dossier.file.SaasMigrationStatus;
import com.iqser.red.service.persistence.service.v1.api.shared.model.saas.migration.MigrationStatusResponse;
import lombok.AccessLevel;
import lombok.RequiredArgsConstructor;
import lombok.experimental.FieldDefaults;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.RestController;
import java.util.HashMap;
import java.util.Map;
import java.util.stream.Collectors;
import static com.iqser.red.service.persistence.service.v1.api.shared.model.dossiertemplate.dossier.file.SaasMigrationStatus.*;
@RestController
@FieldDefaults(makeFinal = true, level = AccessLevel.PRIVATE)
@RequiredArgsConstructor
public class MigrationStatusController implements MigrationStatusResource {
SaasMigrationService saasMigrationService;
SaasMigrationStatusPersistenceService saasMigrationStatusPersistenceService;
FileStatusService fileStatusService;
public MigrationStatusResponse migrationStatus() {
int numberOfFilesToMigrate = saasMigrationStatusPersistenceService.countAll();
Map<SaasMigrationStatus, Integer> filesInStatus = new HashMap<>();
filesInStatus.put(MIGRATION_REQUIRED, saasMigrationStatusPersistenceService.countByStatus(MIGRATION_REQUIRED));
filesInStatus.put(DOCUMENT_FILES_MIGRATED, saasMigrationStatusPersistenceService.countByStatus(DOCUMENT_FILES_MIGRATED));
filesInStatus.put(REDACTION_LOGS_MIGRATED, saasMigrationStatusPersistenceService.countByStatus(REDACTION_LOGS_MIGRATED));
filesInStatus.put(ANNOTATION_IDS_MIGRATED, saasMigrationStatusPersistenceService.countByStatus(ANNOTATION_IDS_MIGRATED));
filesInStatus.put(FINISHED, saasMigrationStatusPersistenceService.countByStatus(FINISHED));
filesInStatus.put(ERROR, saasMigrationStatusPersistenceService.countByStatus(ERROR));
var filesInErrorState = saasMigrationStatusPersistenceService.findAllByStatus(ERROR);
var errorCauses = filesInErrorState.stream()
.collect(Collectors.toMap(errorFile -> errorFile.getDossierId() + "/" + errorFile.getFileId(), SaasMigrationStatusEntity::getErrorCause));
return MigrationStatusResponse.builder().numberOfFilesToMigrate(numberOfFilesToMigrate).filesInStatus(filesInStatus).errorCauses(errorCauses).build();
}
@Override
public ResponseEntity<?> startMigrationForFile(String dossierId, String fileId) {
if (!fileStatusService.fileExists(fileId)) {
throw new NotFoundException(String.format("File with id %s does not exist", fileId));
}
saasMigrationService.startMigrationForFile(dossierId, fileId);
return ResponseEntity.ok().build();
}
@Override
public ResponseEntity<?> revertMigrationForFile(String dossierId, String fileId) {
if (!fileStatusService.fileExists(fileId)) {
throw new NotFoundException(String.format("File with id %s does not exist", fileId));
}
if (!saasMigrationStatusPersistenceService.findById(fileId).getStatus().equals(FINISHED)) {
throw new BadRequestException(String.format("File with id %s is not migrated yet, can't revert.", fileId));
}
saasMigrationService.revertMigrationForFile(dossierId, fileId);
return ResponseEntity.ok().build();
}
@Override
public ResponseEntity<?> requeueErrorFiles() {
MigrationStatusResponse migrationStatus = migrationStatus();
if (!migrationIsFinished(migrationStatus)) {
throw new BadRequestException("There are still files processing, please wait until migration has finished to retry!");
}
saasMigrationService.requeueErrorFiles();
return ResponseEntity.ok().build();
}
private static boolean migrationIsFinished(MigrationStatusResponse migrationStatus) {
return migrationStatus.getFilesInStatus().entrySet()
.stream()
.filter(e -> e.getValue() > 0)
.allMatch(e -> e.getKey().equals(FINISHED) || e.getKey().equals(ERROR));
}
}

View File

@@ -118,11 +118,12 @@ public class ReanalysisController implements ReanalysisResource {
@PreAuthorize("hasAuthority('" + REANALYZE_FILE + "')")
public void ocrFile(@PathVariable(DOSSIER_ID) String dossierId,
@PathVariable(FILE_ID) String fileId,
@RequestParam(value = FORCE_PARAM, required = false, defaultValue = FALSE) boolean force) {
@RequestParam(value = FORCE_PARAM, required = false, defaultValue = FALSE) boolean force,
@RequestParam(value = ALL_PAGES, required = false, defaultValue = FALSE) boolean allPages) {
accessControlService.checkDossierExistenceAndAccessPermissionsToDossier(dossierId);
validateOCR(dossierId, fileId);
reanalysisService.ocrFile(dossierId, fileId, force);
reanalysisService.ocrFile(dossierId, fileId, force, allPages);
auditPersistenceService.audit(AuditRequest.builder()
.userId(KeycloakSecurity.getUserId())
.objectId(dossierId)
@@ -140,7 +141,7 @@ public class ReanalysisController implements ReanalysisResource {
accessControlService.checkDossierExistenceAndAccessPermissionsToDossier(dossierId);
fileIds.forEach(fileId -> validateOCR(dossierId, fileId));
reanalysisService.ocrFiles(dossierId, fileIds);
reanalysisService.ocrFiles(dossierId, fileIds, false);
auditPersistenceService.audit(AuditRequest.builder()
.userId(KeycloakSecurity.getUserId())
.objectId(dossierId)

View File

@@ -28,6 +28,7 @@ import org.springframework.web.bind.annotation.RestController;
import com.iqser.red.service.persistence.management.v1.processor.acl.custom.dossier.DossierACLService;
import com.iqser.red.service.persistence.management.v1.processor.exception.BadRequestException;
import com.iqser.red.service.persistence.management.v1.processor.exception.ConflictException;
import com.iqser.red.service.persistence.management.v1.processor.exception.NotAllowedException;
import com.iqser.red.service.persistence.management.v1.processor.roles.ApplicationRoles;
import com.iqser.red.service.persistence.management.v1.processor.service.AccessControlService;
@@ -101,8 +102,13 @@ public class StatusController implements StatusResource {
var accessibleDossierIds = filterByPermissionsService.onlyViewableDossierIds(new ArrayList<>(filesByDossier.getValue().keySet()));
var response = new HashMap<String, List<FileStatus>>();
for (var dossierId : accessibleDossierIds) {
var allFoundFiles = fileStatusManagementService.findAllDossierIdAndIds(dossierId, filesByDossier.getValue().get(dossierId));
response.put(dossierId, allFoundFiles.stream().map(FileStatusMapper::toFileStatus).collect(Collectors.toList()));
var allFoundFiles = fileStatusManagementService.findAllDossierIdAndIds(dossierId,
filesByDossier.getValue()
.get(dossierId));
response.put(dossierId,
allFoundFiles.stream()
.map(FileStatusMapper::toFileStatus)
.collect(Collectors.toList()));
}
return new JSONPrimitive<>(response);
@@ -351,6 +357,10 @@ public class StatusController implements StatusResource {
.build());
var dossier = dossierACLService.enhanceDossierWithACLData(dossierManagementService.getDossierById(dossierId, false, false));
if (dossier.getOwnerId() == null) {
throw new ConflictException("Dossier has no owner!");
}
if (!dossier.getOwnerId().equals(KeycloakSecurity.getUserId())) {
var fileStatus = fileStatusManagementService.getFileStatus(fileId);

View File

@@ -23,12 +23,11 @@ import org.springframework.web.bind.annotation.RequestPart;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.multipart.MultipartFile;
import com.iqser.red.service.persistence.management.v1.processor.service.FileFormatValidationService;
import com.iqser.red.service.persistence.management.v1.processor.exception.NotAllowedException;
import com.knecon.fforesight.keycloakcommons.security.KeycloakSecurity;
import com.iqser.red.service.pdftron.redaction.v1.api.model.ByteContentDocument;
import com.iqser.red.service.persistence.management.v1.processor.exception.BadRequestException;
import com.iqser.red.service.persistence.management.v1.processor.exception.NotAllowedException;
import com.iqser.red.service.persistence.management.v1.processor.service.AccessControlService;
import com.iqser.red.service.persistence.management.v1.processor.service.FileFormatValidationService;
import com.iqser.red.service.persistence.management.v1.processor.service.ReanalysisService;
import com.iqser.red.service.persistence.management.v1.processor.service.UploadService;
import com.iqser.red.service.persistence.management.v1.processor.service.persistence.AuditPersistenceService;
@@ -37,6 +36,7 @@ import com.iqser.red.service.persistence.service.v1.api.external.resource.Upload
import com.iqser.red.service.persistence.service.v1.api.shared.model.AuditCategory;
import com.iqser.red.service.persistence.service.v1.api.shared.model.FileUploadResult;
import com.iqser.red.service.persistence.service.v1.api.shared.model.audit.AuditRequest;
import com.knecon.fforesight.keycloakcommons.security.KeycloakSecurity;
import com.knecon.fforesight.tenantcommons.TenantContext;
import feign.FeignException;
@@ -53,9 +53,9 @@ import lombok.extern.slf4j.Slf4j;
@SuppressWarnings("PMD")
public class UploadController implements UploadResource {
private static final int THRESHOLD_ENTRIES = 10000;
private static final int THRESHOLD_SIZE = 1000000000; // 1 GB
private static final double THRESHOLD_RATIO = 10;
private static final int THRESHOLD_ENTRIES = 10000; // Maximum number of files allowed
private static final int THRESHOLD_SIZE = 1000000000; // 1 GB total unzipped data
private static final double THRESHOLD_RATIO = 10; // Max allowed compression ratio
private final UploadService uploadService;
private final ReanalysisService reanalysisService;
@@ -72,31 +72,25 @@ public class UploadController implements UploadResource {
@Parameter(name = DISABLE_AUTOMATIC_ANALYSIS_PARAM, description = "Disables automatic redaction for the uploaded file, imports only imported redactions") @RequestParam(value = DISABLE_AUTOMATIC_ANALYSIS_PARAM, required = false, defaultValue = "false") boolean disableAutomaticAnalysis) {
accessControlService.checkAccessPermissionsToDossier(dossierId);
if (file.getOriginalFilename() == null) {
String originalFilename = file.getOriginalFilename();
if (originalFilename == null) {
throw new BadRequestException("Could not upload file, no filename provided.");
}
var extension = getExtension(file.getOriginalFilename());
String extension = getExtension(originalFilename);
try {
switch (extension) {
case "zip":
return handleZip(dossierId, file.getBytes(), keepManualRedactions, disableAutomaticAnalysis);
case "csv":
return uploadService.importCsv(dossierId, file.getBytes());
default:
if (!fileFormatValidationService.getAllFileFormats().contains(extension)) {
throw new BadRequestException("Invalid file uploaded");
}
if (!fileFormatValidationService.getValidFileFormatsForTenant(TenantContext.getTenantId()).contains(extension)) {
throw new NotAllowedException("Insufficient permissions");
}
return uploadService.processSingleFile(dossierId, file.getOriginalFilename(), file.getBytes(), keepManualRedactions, disableAutomaticAnalysis);
}
return switch (extension) {
case "zip" -> handleZip(dossierId, file.getBytes(), keepManualRedactions, disableAutomaticAnalysis);
case "csv" -> uploadService.importCsv(dossierId, file.getBytes());
default -> {
validateExtensionOrThrow(extension);
yield uploadService.processSingleFile(dossierId, originalFilename, file.getBytes(), keepManualRedactions, disableAutomaticAnalysis);
}
};
} catch (IOException e) {
throw new BadRequestException(e.getMessage(), e);
throw new BadRequestException("Failed to process file: " + e.getMessage(), e);
}
}
@@ -111,7 +105,6 @@ public class UploadController implements UploadResource {
accessControlService.verifyUserIsReviewerOrApprover(dossierId, fileId);
try {
reanalysisService.importRedactions(ByteContentDocument.builder().dossierId(dossierId).fileId(fileId).document(file.getBytes()).pages(pageInclusionRequest).build());
auditPersistenceService.audit(AuditRequest.builder()
@@ -122,84 +115,116 @@ public class UploadController implements UploadResource {
.details(Map.of("dossierId", dossierId))
.build());
} catch (IOException e) {
throw new BadRequestException(e.getMessage(), e);
throw new BadRequestException("Failed to import redactions: " + e.getMessage(), e);
} catch (FeignException e) {
throw processFeignException(e);
}
}
private String getExtension(String fileName) {
private void validateExtensionOrThrow(String extension) {
return fileName.substring(fileName.lastIndexOf(".") + 1).toLowerCase(Locale.ROOT);
if (!fileFormatValidationService.getAllFileFormats().contains(extension)) {
throw new BadRequestException("Invalid file uploaded (unrecognized extension).");
}
if (!fileFormatValidationService.getValidFileFormatsForTenant(TenantContext.getTenantId()).contains(extension)) {
throw new NotAllowedException("Insufficient permissions for this file type.");
}
}
/**
* 1. Write the uploaded content to a temp ZIP file
* 2. Check the number of entries and reject if too big or if symlinks found
* 3. Unzip and process each file, while checking size and ratio.
*/
private FileUploadResult handleZip(String dossierId, byte[] fileContent, boolean keepManualRedactions, boolean disableAutomaticAnalysis) throws IOException {
File tempFile = FileUtils.createTempFile(UUID.randomUUID().toString(), ".zip");
try (var fileOutputStream = new FileOutputStream(tempFile)) {
IOUtils.write(fileContent, fileOutputStream);
File tempZip = FileUtils.createTempFile(UUID.randomUUID().toString(), ".zip");
try (FileOutputStream fos = new FileOutputStream(tempZip)) {
IOUtils.write(fileContent, fos);
}
try {
checkForSymlinks(tempFile);
validateZipEntries(tempZip);
var zipData = unzip(tempFile, dossierId, keepManualRedactions, disableAutomaticAnalysis);
try {
ZipData zipData = processZipContents(tempZip, dossierId, keepManualRedactions, disableAutomaticAnalysis);
if (zipData.csvBytes != null) {
try {
var importResult = uploadService.importCsv(dossierId, zipData.csvBytes);
zipData.fileUploadResult.getProcessedAttributes().addAll(importResult.getProcessedAttributes());
zipData.fileUploadResult.getProcessedFileIds().addAll(importResult.getProcessedFileIds());
FileUploadResult csvResult = uploadService.importCsv(dossierId, zipData.csvBytes);
zipData.fileUploadResult.getProcessedAttributes().addAll(csvResult.getProcessedAttributes());
zipData.fileUploadResult.getProcessedFileIds().addAll(csvResult.getProcessedFileIds());
} catch (Exception e) {
log.debug("CSV file inside ZIP failed", e);
// TODO return un-processed files to client
log.debug("CSV file inside ZIP failed to import", e);
}
} else if (zipData.fileUploadResult.getFileIds().isEmpty()) {
if (zipData.containedUnpermittedFiles) {
throw new NotAllowedException("Zip file contains unpermitted files");
throw new NotAllowedException("Zip file contains unpermitted files.");
} else {
throw new BadRequestException("Only unsupported files in zip file");
throw new BadRequestException("The ZIP contains only unsupported files.");
}
}
return zipData.fileUploadResult;
} finally {
boolean isDeleted = tempFile.delete();
if (!isDeleted) {
log.warn("tempFile could not be deleted");
if (!tempZip.delete()) {
log.warn("Could not delete temporary ZIP file: {}", tempZip);
}
}
}
private void checkForSymlinks(File tempFile) throws IOException {
private void validateZipEntries(File tempZip) throws IOException {
try (FileInputStream fis = new FileInputStream(tempZip); ZipFile zipFile = new ZipFile(fis.getChannel())) {
int count = 0;
var entries = zipFile.getEntries();
while (entries.hasMoreElements()) {
ZipArchiveEntry ze = entries.nextElement();
try (var fis = new FileInputStream(tempFile); var zipFile = new ZipFile(fis.getChannel())) {
for (var entryEnum = zipFile.getEntries(); entryEnum.hasMoreElements(); ) {
var ze = entryEnum.nextElement();
if (ze.isUnixSymlink()) {
throw new BadRequestException("ZIP-files with symlinks are not allowed");
throw new BadRequestException("ZIP-files with symlinks are not allowed.");
}
if (!ze.isDirectory() && !ze.getName().startsWith(".")) {
count++;
if (count > THRESHOLD_ENTRIES) {
throw new BadRequestException("ZIP-Bomb detected: too many entries.");
}
}
}
}
}
private ZipData unzip(File tempFile, String dossierId, boolean keepManualRedactions, boolean disableAutomaticAnalysis) throws IOException {
private ZipData processZipContents(File tempZip, String dossierId, boolean keepManualRedactions, boolean disableAutomaticAnalysis) throws IOException {
var zipData = new ZipData();
ZipData zipData = new ZipData();
try (var fis = new FileInputStream(tempFile); var zipFile = new ZipFile(fis.getChannel())) {
try (FileInputStream fis = new FileInputStream(tempZip); ZipFile zipFile = new ZipFile(fis.getChannel())) {
for (var entryEnum = zipFile.getEntries(); entryEnum.hasMoreElements(); ) {
var ze = entryEnum.nextElement();
zipData.totalEntryArchive++;
var entries = zipFile.getEntries();
while (entries.hasMoreElements()) {
ZipArchiveEntry entry = entries.nextElement();
if (!ze.isDirectory()) {
processFileZipEntry(ze, zipFile, dossierId, keepManualRedactions, zipData, disableAutomaticAnalysis);
if (entry.isDirectory() || entry.getName().startsWith(".")) {
continue;
}
byte[] entryBytes = readEntryWithRatioCheck(entry, zipFile);
zipData.totalSizeArchive += entryBytes.length;
if (zipData.totalSizeArchive > THRESHOLD_SIZE) {
throw new BadRequestException("ZIP-Bomb detected (exceeds total size limit).");
}
String extension = getExtension(entry.getName());
if ("csv".equalsIgnoreCase(extension)) {
zipData.csvBytes = entryBytes;
} else {
handleRegularFile(dossierId, entryBytes, extension, extractFileName(entry.getName()), zipData, keepManualRedactions, disableAutomaticAnalysis);
}
}
}
@@ -207,73 +232,70 @@ public class UploadController implements UploadResource {
}
private void processFileZipEntry(ZipArchiveEntry ze, ZipFile zipFile, String dossierId, boolean keepManualRedactions, ZipData zipData, boolean disableAutomaticAnalysis) throws IOException {
private byte[] readEntryWithRatioCheck(ZipArchiveEntry entry, ZipFile zipFile) throws IOException {
var extension = getExtension(ze.getName());
long compressedSize = entry.getCompressedSize() > 0 ? entry.getCompressedSize() : 1;
try (var is = zipFile.getInputStream(entry); var bos = new ByteArrayOutputStream()) {
final String fileName;
if (ze.getName().lastIndexOf("/") >= 0) {
fileName = ze.getName().substring(ze.getName().lastIndexOf("/") + 1);
} else {
fileName = ze.getName();
}
byte[] buffer = new byte[4096];
int bytesRead;
int totalUncompressed = 0;
if (fileName.startsWith(".")) {
return;
}
while ((bytesRead = is.read(buffer)) != -1) {
bos.write(buffer, 0, bytesRead);
totalUncompressed += bytesRead;
var entryAsBytes = readCurrentZipEntry(ze, zipFile);
zipData.totalSizeArchive += entryAsBytes.length;
// 1. the uncompressed data size is too much for the application resource capacity
// 2. too many entries in the archive can lead to inode exhaustion of the file-system
if (zipData.totalSizeArchive > THRESHOLD_SIZE || zipData.totalEntryArchive > THRESHOLD_ENTRIES) {
throw new BadRequestException("ZIP-Bomb detected.");
}
if ("csv".equals(extension)) {
zipData.csvBytes = entryAsBytes;
} else if (fileFormatValidationService.getAllFileFormats().contains(extension)) {
if (!fileFormatValidationService.getValidFileFormatsForTenant(TenantContext.getTenantId()).contains(extension)) {
zipData.containedUnpermittedFiles = true;
return;
}
zipData.containedUnpermittedFiles = false;
try {
var result = uploadService.processSingleFile(dossierId, fileName, entryAsBytes, keepManualRedactions, disableAutomaticAnalysis);
zipData.fileUploadResult.getFileIds().addAll(result.getFileIds());
} catch (Exception e) {
log.debug("PDF File inside ZIP failed", e);
// TODO return un-processed files to client
double ratio = (double) totalUncompressed / compressedSize;
if (ratio > THRESHOLD_RATIO) {
throw new BadRequestException("ZIP-Bomb detected (compression ratio too high).");
}
}
return bos.toByteArray();
}
}
private byte[] readCurrentZipEntry(ZipArchiveEntry ze, ZipFile zipFile) throws IOException {
private void handleRegularFile(String dossierId,
byte[] fileBytes,
String extension,
String fileName,
ZipData zipData,
boolean keepManualRedactions,
boolean disableAutomaticAnalysis) {
var bos = new ByteArrayOutputStream();
try (var entryStream = zipFile.getInputStream(ze)) {
var buffer = new byte[2048];
var nBytes = 0;
int totalSizeEntry = 0;
while ((nBytes = entryStream.read(buffer)) > 0) {
bos.write(buffer, 0, nBytes);
totalSizeEntry += nBytes;
double compressionRatio = (float) totalSizeEntry / ze.getCompressedSize();
if (compressionRatio > THRESHOLD_RATIO) {
// ratio between compressed and uncompressed data is highly suspicious, looks like a Zip Bomb Attack
throw new BadRequestException("ZIP-Bomb detected.");
}
}
if (!fileFormatValidationService.getAllFileFormats().contains(extension)) {
zipData.containedUnpermittedFiles = false;
return;
}
return bos.toByteArray();
if (!fileFormatValidationService.getValidFileFormatsForTenant(TenantContext.getTenantId()).contains(extension)) {
zipData.containedUnpermittedFiles = true;
return;
}
try {
FileUploadResult result = uploadService.processSingleFile(dossierId, fileName, fileBytes, keepManualRedactions, disableAutomaticAnalysis);
zipData.fileUploadResult.getFileIds().addAll(result.getFileIds());
} catch (Exception e) {
log.debug("Failed to process file '{}' in ZIP: {}", fileName, e.getMessage(), e);
}
}
private String extractFileName(String path) {
int idx = path.lastIndexOf('/');
return (idx >= 0) ? path.substring(idx + 1) : path;
}
private String getExtension(String fileName) {
int idx = fileName.lastIndexOf('.');
if (idx < 0) {
return "";
}
return fileName.substring(idx + 1).toLowerCase(Locale.ROOT);
}
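The two small helpers introduced here are easy to get wrong at the edges (no dot at all, a multi-part extension, a path with no separator). A standalone sketch of the same logic with those edge cases in mind; the method bodies mirror the diff, but the wrapping class name is illustrative:

```java
import java.util.Locale;

class NameUtils {

    // Everything after the last '/' — ZIP entry names use '/' as separator.
    static String extractFileName(String path) {
        int idx = path.lastIndexOf('/');
        return (idx >= 0) ? path.substring(idx + 1) : path;
    }

    // Lower-cased extension after the last '.', or "" when there is none.
    // Note: the guard on idx < 0 matters — without it, lastIndexOf('.') + 1
    // would silently return the whole file name for extension-less files.
    static String getExtension(String fileName) {
        int idx = fileName.lastIndexOf('.');
        return (idx < 0) ? "" : fileName.substring(idx + 1).toLowerCase(Locale.ROOT);
    }
}
```

For a name like `archive.tar.gz` this deliberately yields only the final component, `gz`, which is what the format-validation lookups expect.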
@@ -282,7 +304,6 @@ public class UploadController implements UploadResource {
byte[] csvBytes;
int totalSizeArchive;
int totalEntryArchive;
FileUploadResult fileUploadResult = new FileUploadResult();
boolean containedUnpermittedFiles;
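Taken together, the three thresholds in this controller form a streaming zip-bomb defense: cap the number of entries, cap the cumulative unzipped size, and cap the per-entry compression ratio while the data is still being inflated. A self-contained sketch of the same checks using only `java.util.zip` (the controller itself uses commons-compress' `ZipFile`; its `BadRequestException` is replaced here by `IllegalArgumentException`):

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;

class ZipBombGuard {

    // Threshold values copied from the diff above.
    static final int THRESHOLD_ENTRIES = 10_000;           // max number of files
    static final long THRESHOLD_SIZE = 1_000_000_000L;     // 1 GB total unzipped data
    static final double THRESHOLD_RATIO = 10;              // max compression ratio

    // Streams every entry, enforcing entry-count, total-size and per-entry
    // compression-ratio limits. Throws IllegalArgumentException on a suspected bomb.
    static void scan(byte[] zipBytes) throws IOException {
        int entries = 0;
        long totalUncompressed = 0;
        try (ZipInputStream zis = new ZipInputStream(new ByteArrayInputStream(zipBytes))) {
            ZipEntry entry;
            byte[] buf = new byte[4096];
            while ((entry = zis.getNextEntry()) != null) {
                if (entry.isDirectory()) {
                    continue;
                }
                if (++entries > THRESHOLD_ENTRIES) {
                    throw new IllegalArgumentException("ZIP-Bomb detected: too many entries.");
                }
                long entryUncompressed = 0;
                int n;
                while ((n = zis.read(buf)) > 0) {
                    entryUncompressed += n;
                    if (totalUncompressed + entryUncompressed > THRESHOLD_SIZE) {
                        throw new IllegalArgumentException("ZIP-Bomb detected (exceeds total size limit).");
                    }
                }
                totalUncompressed += entryUncompressed;
                // Compressed size is known once the entry has been fully consumed.
                long compressed = Math.max(entry.getCompressedSize(), 1);
                if ((double) entryUncompressed / compressed > THRESHOLD_RATIO) {
                    throw new IllegalArgumentException("ZIP-Bomb detected (compression ratio too high).");
                }
            }
        }
    }
}
```

Checking the running total inside the read loop, rather than trusting the sizes declared in the entry headers, is the important design choice: a crafted archive can lie about its declared sizes, but it cannot lie about the bytes the inflater actually produces.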

View File

@@ -1,13 +1,17 @@
package com.iqser.red.persistence.service.v1.external.api.impl.controller;
import static com.iqser.red.service.persistence.management.v1.processor.roles.ActionRoles.READ_USER_STATS;
import java.util.ArrayList;
import java.util.List;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.security.access.prepost.PreAuthorize;
import org.springframework.web.bind.annotation.RestController;
import com.iqser.red.service.persistence.management.v1.processor.acl.custom.dossier.DossierACLService;
import com.iqser.red.service.persistence.management.v1.processor.exception.NotFoundException;
import com.iqser.red.service.persistence.management.v1.processor.service.DossierService;
import com.iqser.red.service.persistence.management.v1.processor.service.persistence.FileStatusPersistenceService;
import com.iqser.red.service.persistence.management.v1.processor.service.users.UserService;
@@ -29,15 +33,17 @@ public class UserStatsController implements UserStatsResource {
@Override
@PreAuthorize("hasAuthority('" + READ_USER_STATS + "')")
public ResponseEntity<UserStats> getUserStats(String userId) {
if (userService.getUserById(userId).isEmpty()) {
return new ResponseEntity(null, HttpStatus.NOT_FOUND);
throw new NotFoundException(String.format("The user with id %s was not found.", userId));
}
List<String> dossierMemberships = new ArrayList<>();
List<String> dossierOwnerships = new ArrayList<>();
dossierService.getAllDossiers()
.stream()
.filter(dossierEntity -> dossierEntity.getHardDeletedTime() == null)
.forEach(d -> {
if (dossierACLService.getMembers(d.getId()).contains(userId)) {
dossierMemberships.add(d.getId());

View File

@@ -7,19 +7,24 @@ import static com.iqser.red.service.persistence.service.v2.api.external.resource
import java.util.ArrayList;
import java.util.Optional;
import java.util.Set;
import java.util.stream.Collectors;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.security.access.AccessDeniedException;
import org.springframework.security.access.prepost.PreAuthorize;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import com.iqser.red.persistence.service.v1.external.api.impl.controller.DossierController;
import com.iqser.red.persistence.service.v1.external.api.impl.controller.StatusController;
import com.iqser.red.persistence.service.v2.external.api.impl.mapper.ComponentMapper;
import com.iqser.red.service.persistence.management.v1.processor.exception.BadRequestException;
import com.iqser.red.service.persistence.management.v1.processor.exception.NotAllowedException;
import com.iqser.red.service.persistence.management.v1.processor.roles.ApplicationRoles;
import com.iqser.red.service.persistence.management.v1.processor.service.AccessControlService;
import com.iqser.red.service.persistence.management.v1.processor.service.ComponentLogService;
import com.iqser.red.service.persistence.management.v1.processor.service.CurrentApplicationTypeProvider;
import com.iqser.red.service.persistence.management.v1.processor.service.FileStatusService;
@@ -29,6 +34,7 @@ import com.iqser.red.service.persistence.management.v1.processor.service.users.m
import com.iqser.red.service.persistence.service.v1.api.shared.model.analysislog.componentlog.ComponentLogEntry;
import com.iqser.red.service.persistence.service.v1.api.shared.model.component.RevertOverrideRequest;
import com.iqser.red.service.persistence.service.v1.api.shared.model.dossiertemplate.dossier.file.FileModel;
import com.iqser.red.service.persistence.service.v2.api.external.model.BulkComponentsRequest;
import com.iqser.red.service.persistence.service.v2.api.external.model.Component;
import com.iqser.red.service.persistence.service.v2.api.external.model.ComponentOverrideList;
import com.iqser.red.service.persistence.service.v2.api.external.model.FileComponents;
@@ -37,6 +43,7 @@ import com.iqser.red.service.persistence.service.v2.api.external.resource.Compon
import com.knecon.fforesight.keycloakcommons.security.KeycloakSecurity;
import com.knecon.fforesight.tenantcommons.TenantProvider;
import io.swagger.v3.oas.annotations.Parameter;
import io.swagger.v3.oas.annotations.tags.Tag;
import lombok.AccessLevel;
import lombok.RequiredArgsConstructor;
@@ -47,15 +54,21 @@ import lombok.experimental.FieldDefaults;
@Tag(name = "4. Component endpoints", description = "Provides operations related to components")
public class ComponentControllerV2 implements ComponentResource {
private final AccessControlService accessControlService;
private final ComponentLogService componentLogService;
private final UserService userService;
private final StatusController statusController;
private final FileStatusService fileStatusService;
private final DossierController dossierController;
private final DossierTemplatePersistenceService dossierTemplatePersistenceService;
private final CurrentApplicationTypeProvider currentApplicationTypeProvider;
private final ComponentMapper componentMapper = ComponentMapper.INSTANCE;
@Value("${documine.components.filesLimit:100}")
private int documineComponentsFilesLimit = 100;
@Override
public FileComponents getComponents(@PathVariable(DOSSIER_TEMPLATE_ID_PARAM) String dossierTemplateId,
@PathVariable(DOSSIER_ID_PARAM) String dossierId,
@@ -92,12 +105,37 @@ public class ComponentControllerV2 implements ComponentResource {
checkApplicationType();
dossierTemplatePersistenceService.checkDossierTemplateExistsOrElseThrow404(dossierTemplateId);
var dossierFiles = statusController.getDossierStatus(dossierId);
if (dossierFiles.size() > documineComponentsFilesLimit) {
throw new BadRequestException(String.format("The dossier you requested components for contains %s files, which is above the limit of %s files for this endpoint; please use POST %s instead.", dossierFiles.size(), documineComponentsFilesLimit, FILE_PATH + BULK_COMPONENTS_PATH));
}
return new FileComponentsList(dossierFiles.stream()
.map(file -> getComponents(dossierTemplateId, dossierId, file.getFileId(), includeDetails))
.toList());
}
@Override
public FileComponentsList getComponentsForFiles(@PathVariable(DOSSIER_TEMPLATE_ID_PARAM) String dossierTemplateId,
@PathVariable(DOSSIER_ID_PARAM) String dossierId,
@RequestParam(name = INCLUDE_DETAILS_PARAM, defaultValue = "false", required = false) boolean includeDetails,
@RequestBody BulkComponentsRequest bulkComponentsRequest){
if (bulkComponentsRequest.getFileIds().size() > documineComponentsFilesLimit) {
throw new BadRequestException(String.format("You requested components for %s files, which is above the limit of %s files for this endpoint; reduce the number of fileIds in the request.", bulkComponentsRequest.getFileIds().size(), documineComponentsFilesLimit));
}
checkApplicationType();
dossierTemplatePersistenceService.checkDossierTemplateExistsOrElseThrow404(dossierTemplateId);
dossierController.getDossier(dossierId, false, false);
return new FileComponentsList(bulkComponentsRequest.getFileIds().stream()
.map(fileId -> getComponents(dossierTemplateId, dossierId, fileId, includeDetails))
.toList());
}
@Override
@PreAuthorize("hasAuthority('" + GET_RSS + "')")
public void addOverride(@PathVariable(DOSSIER_TEMPLATE_ID_PARAM) String dossierTemplateId,
@@ -106,6 +144,7 @@ public class ComponentControllerV2 implements ComponentResource {
@RequestBody Component override) {
checkApplicationType();
accessControlService.verifyUserIsReviewer(dossierId, fileId);
dossierTemplatePersistenceService.checkDossierTemplateExistsOrElseThrow404(dossierTemplateId);
componentLogService.overrideComponent(dossierId, fileId, componentMapper.toComponentLogEntry(override));
@@ -146,6 +185,7 @@ public class ComponentControllerV2 implements ComponentResource {
@RequestBody RevertOverrideRequest revertOverrideRequest) {
checkApplicationType();
accessControlService.verifyUserIsReviewer(dossierId, fileId);
dossierTemplatePersistenceService.checkDossierTemplateExistsOrElseThrow404(dossierTemplateId);
componentLogService.revertOverrides(dossierId, fileId, revertOverrideRequest);

View File

@@ -247,6 +247,7 @@ public class DossierTemplateControllerV2 implements DossierTemplateResource {
.filterable(fileAttributeConfig.isFilterable())
.displayedInFileList(fileAttributeConfig.isDisplayedInFileList())
.build())
.includeInCsvExport(fileAttributeConfig.isIncludeInCsvExport())
.build())
.toList();
@ -449,11 +450,15 @@ public class DossierTemplateControllerV2 implements DossierTemplateResource {
return new DossierAttributeDefinitionList(dossierAttributeConfigPersistenceService.getDossierAttributes(dossierTemplateId)
.stream()
.map(config -> DossierAttributeDefinition.builder()
.id(config.getId())
.name(config.getLabel())
.type(config.getType())
.reportingPlaceholder(config.getPlaceholder())
.displaySettings(DossierAttributeDefinition.DossierDisplaySettings.builder()
.editable(config.isEditable())
.filterable(config.isFilterable())
.displayedInDossierList(config.isDisplayedInDossierList())
.build())
.build())
.toList());
}

View File

@ -156,8 +156,6 @@ public class FileControllerV2 implements FileResource {
@PathVariable(FILE_ID_PARAM) String fileId,
@RequestBody DownloadRequest downloadRequest) {
fileStatusManagementService.getFileStatus(fileId, false);
return prepareBulkDownload(dossierTemplateId,
dossierId,
BulkDownloadRequest.builder()

View File

@ -223,7 +223,7 @@ public interface DictionaryResource {
void changeFlags(@PathVariable(TYPE_PARAMETER_NAME) String type,
@PathVariable(DOSSIER_TEMPLATE_PARAMETER_NAME) String dossierTemplateId,
@PathVariable(DOSSIER_ID_PARAMETER_NAME) String dossierId,
@RequestParam(value = "addToDictionary") boolean addToDictionary);
@RequestParam(value = "addToDictionaryAction") boolean addToDictionaryAction);
@PostMapping(value = DICTIONARY_REST_PATH + DIFFERENCE + DOSSIER_TEMPLATE_PATH_VARIABLE, consumes = MediaType.APPLICATION_JSON_VALUE)

View File

@ -44,7 +44,11 @@ public interface FileAttributesResource {
String FILE_ID = "fileId";
String FILE_ID_PATH_VARIABLE = "/{" + FILE_ID + "}";
Set<String> encodingList = Sets.newHashSet("ISO", "ASCII", "UTF-8");
String UTF_ENCODING = "UTF-8";
String ASCII_ENCODING = "ASCII";
String ISO_ENCODING = "ISO-8859-1";
Set<String> encodingList = Sets.newHashSet(ISO_ENCODING, ASCII_ENCODING, UTF_ENCODING);
@ResponseBody

View File
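The diff above replaces the bare encoding literals with named constants, and in doing so corrects the bare "ISO" literal to the full charset name "ISO-8859-1". A dependency-free sketch of the resulting set and a hypothetical validation helper (the interface itself only declares the set, via Guava's `Sets.newHashSet`):

```java
import java.util.Set;

// Sketch of the refactored encoding constants. The isSupported helper
// is an assumption for illustration; FileAttributesResource only
// declares the constants and the set.
public class Encodings {

    static final String UTF_ENCODING = "UTF-8";
    static final String ASCII_ENCODING = "ASCII";
    static final String ISO_ENCODING = "ISO-8859-1";

    // Set.of keeps the sketch free of the Guava dependency.
    static final Set<String> ENCODING_LIST = Set.of(ISO_ENCODING, ASCII_ENCODING, UTF_ENCODING);

    static boolean isSupported(String encoding) {
        return ENCODING_LIST.contains(encoding);
    }

    public static void main(String[] args) {
        System.out.println(isSupported("UTF-8"));  // true
        System.out.println(isSupported("ISO"));    // false: the old bare "ISO" literal no longer matches
    }
}
```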

@ -1,56 +0,0 @@
package com.iqser.red.service.persistence.service.v1.api.external.resource;
import com.iqser.red.service.persistence.service.v1.api.shared.model.saas.migration.MigrationStatusResponse;
import io.swagger.v3.oas.annotations.Operation;
import io.swagger.v3.oas.annotations.responses.ApiResponse;
import io.swagger.v3.oas.annotations.responses.ApiResponses;
import org.springframework.http.MediaType;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.ResponseBody;
public interface MigrationStatusResource {
String MIGRATION_STATUS_REST_PATH = ExternalApi.BASE_PATH + "/migration-status";
String START_MIGRATION_REST_PATH = ExternalApi.BASE_PATH + "/start_migration";
String REVERT_MIGRATION_REST_PATH = ExternalApi.BASE_PATH + "/revert_migration";
String RETRY_MIGRATION_REST_PATH = ExternalApi.BASE_PATH + "/retry_migration";
String FILE_ID = "fileId";
String FILE_ID_PATH_VARIABLE = "/{" + FILE_ID + "}";
String DOSSIER_ID = "dossierId";
String DOSSIER_ID_PATH_VARIABLE = "/{" + DOSSIER_ID + "}";
@ResponseBody
@PostMapping(value = MIGRATION_STATUS_REST_PATH, produces = MediaType.APPLICATION_JSON_VALUE)
@Operation(summary = "Show the status of the migration", description = "None")
@ApiResponses(value = {@ApiResponse(responseCode = "200", description = "Success.")})
MigrationStatusResponse migrationStatus();
@ResponseBody
@PostMapping(value = START_MIGRATION_REST_PATH + FILE_ID_PATH_VARIABLE + DOSSIER_ID_PATH_VARIABLE)
@Operation(summary = "Start SAAS migration for specific file", description = "None")
@ApiResponses(value = {@ApiResponse(responseCode = "200", description = "Success.")})
ResponseEntity<?> startMigrationForFile(@RequestParam(value = DOSSIER_ID) String dossierId, @RequestParam(value = FILE_ID) String fileId);
@ResponseBody
@PostMapping(value = REVERT_MIGRATION_REST_PATH + FILE_ID_PATH_VARIABLE + DOSSIER_ID_PATH_VARIABLE)
@Operation(summary = "Start SAAS migration for specific file", description = "None")
@ApiResponses(value = {@ApiResponse(responseCode = "200", description = "Success.")})
ResponseEntity<?> revertMigrationForFile(@RequestParam(value = DOSSIER_ID) String dossierId, @RequestParam(value = FILE_ID) String fileId);
@ResponseBody
@PostMapping(value = RETRY_MIGRATION_REST_PATH)
@Operation(summary = "Restart SAAS migration for all files in error state", description = "None")
@ApiResponses(value = {@ApiResponse(responseCode = "200", description = "Success.")})
ResponseEntity<?> requeueErrorFiles();
}

View File

@ -38,6 +38,7 @@ public interface ReanalysisResource {
String EXCLUDED_STATUS_PARAM = "excluded";
String FORCE_PARAM = "force";
String ALL_PAGES = "allPages";
@PostMapping(value = REANALYSIS_REST_PATH + DOSSIER_ID_PATH_VARIABLE)
@ -73,7 +74,8 @@ public interface ReanalysisResource {
@ApiResponses(value = {@ApiResponse(responseCode = "204", description = "OK"), @ApiResponse(responseCode = "409", description = "Conflict"), @ApiResponse(responseCode = "404", description = "Not found"), @ApiResponse(responseCode = "403", description = "Forbidden"), @ApiResponse(responseCode = "400", description = "Cannot OCR approved file")})
void ocrFile(@PathVariable(DOSSIER_ID) String dossierId,
@PathVariable(FILE_ID) String fileId,
@RequestParam(value = FORCE_PARAM, required = false, defaultValue = FALSE) boolean force);
@RequestParam(value = FORCE_PARAM, required = false, defaultValue = FALSE) boolean force,
@RequestParam(value = ALL_PAGES, required = false, defaultValue = FALSE) boolean allPages);
@Operation(summary = "Ocr and reanalyze multiple files for a dossier", description = "None")

View File

@ -0,0 +1,18 @@
package com.iqser.red.service.persistence.service.v2.api.external.model;
import java.util.ArrayList;
import java.util.List;
import lombok.AllArgsConstructor;
import lombok.Builder;
import lombok.Data;
import lombok.NoArgsConstructor;
@Data
@Builder
@NoArgsConstructor
@AllArgsConstructor
public class BulkComponentsRequest {
private List<String> fileIds = new ArrayList<>();
}

View File
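A request body matching the `BulkComponentsRequest` schema above; the file ids are illustrative placeholders:

```json
{
  "fileIds": [
    "123e4567-e89b-12d3-a456-426614174000",
    "23e45678-e90b-12d3-a456-765114174321"
  ]
}
```

The endpoint responds with a `FileComponentsList` containing one `FileComponents` entry per requested id; pass `includeDetails=true` as a query parameter to include detailed component information.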

@ -17,5 +17,18 @@ public class DossierAttributeDefinition {
private String name;
private DossierAttributeType type;
private String reportingPlaceholder;
private DossierDisplaySettings displaySettings;
@Data
@NoArgsConstructor
@AllArgsConstructor
@Builder
public static class DossierDisplaySettings {
private boolean editable;
private boolean filterable;
private boolean displayedInDossierList;
}
}

View File

@ -20,6 +20,7 @@ public class FileAttributeDefinition {
private String mappedCsvColumnHeader;
private String reportingPlaceholder;
private DisplaySettings displaySettings;
private boolean includeInCsvExport;
@Data
@NoArgsConstructor

View File

@ -20,6 +20,7 @@ import org.springframework.web.bind.annotation.ResponseBody;
import org.springframework.web.bind.annotation.ResponseStatus;
import com.iqser.red.service.persistence.service.v1.api.shared.model.component.RevertOverrideRequest;
import com.iqser.red.service.persistence.service.v2.api.external.model.BulkComponentsRequest;
import com.iqser.red.service.persistence.service.v2.api.external.model.Component;
import com.iqser.red.service.persistence.service.v2.api.external.model.ComponentOverrideList;
import com.iqser.red.service.persistence.service.v2.api.external.model.FileComponents;
@ -74,6 +75,16 @@ public interface ComponentResource {
@Parameter(name = INCLUDE_DETAILS_PARAM, description = INCLUDE_DETAILS_DESCRIPTION) @RequestParam(name = INCLUDE_DETAILS_PARAM, defaultValue = "false", required = false) boolean includeDetails);
@PostMapping(value = FILE_PATH
+ BULK_COMPONENTS_PATH, produces = {MediaType.APPLICATION_JSON_VALUE, MediaType.APPLICATION_XML_VALUE}, consumes = MediaType.APPLICATION_JSON_VALUE)
@Operation(summary = "Returns the components for the requested files of a dossier", description = "None")
@ApiResponses(value = {@ApiResponse(responseCode = "200", description = "OK")})
FileComponentsList getComponentsForFiles(@Parameter(name = DOSSIER_TEMPLATE_ID_PARAM, description = "The identifier of the dossier template that is used for the dossier.", required = true) @PathVariable(DOSSIER_TEMPLATE_ID_PARAM) String dossierTemplateId,
@Parameter(name = DOSSIER_ID_PARAM, description = "The identifier of the dossier that contains the file.", required = true) @PathVariable(DOSSIER_ID_PARAM) String dossierId,
@Parameter(name = INCLUDE_DETAILS_PARAM, description = INCLUDE_DETAILS_DESCRIPTION) @RequestParam(name = INCLUDE_DETAILS_PARAM, defaultValue = "false", required = false) boolean includeDetails,
@RequestBody BulkComponentsRequest bulkComponentsRequest);
@ResponseBody
@ResponseStatus(value = HttpStatus.NO_CONTENT)
@PostMapping(value = FILE_PATH + FILE_ID_PATH_VARIABLE + OVERRIDES_PATH, consumes = MediaType.APPLICATION_JSON_VALUE)

View File

@ -1626,6 +1626,50 @@ paths:
$ref: '#/components/responses/429'
"500":
$ref: '#/components/responses/500'
post:
operationId: getComponentsForFiles
tags:
- 4. Components
summary: Returns the FileComponents for the requested files
description: |
This endpoint fetches the components for the requested files by their ids. Like individual file components,
these represent various aspects, metadata, or content of the files. Entity and component rules define these components based on the file's
content. They can give a *structured view* on a document's text.
To include detailed component information, set the `includeDetails` query parameter to `true`.
parameters:
- $ref: '#/components/parameters/dossierTemplateId'
- $ref: '#/components/parameters/dossierId'
- $ref: '#/components/parameters/includeComponentDetails'
requestBody:
content:
application/json:
schema:
$ref: '#/components/schemas/BulkComponentsRequest'
required: true
responses:
"200":
content:
application/json:
schema:
$ref: '#/components/schemas/FileComponentsList'
application/xml:
schema:
$ref: '#/components/schemas/FileComponentsList'
description: |
Successfully fetched the components for the requested files.
"400":
$ref: '#/components/responses/400'
"401":
$ref: '#/components/responses/401'
"403":
$ref: '#/components/responses/403'
"404":
$ref: '#/components/responses/404-dossier'
"429":
$ref: '#/components/responses/429'
"500":
$ref: '#/components/responses/500'
/api/downloads:
get:
operationId: getDownloadStatusList
@ -2773,6 +2817,16 @@ components:
entityRuleId: DEF.13.37
type: another_entity_type
page: 456
BulkComponentsRequest:
type: object
description: Request payload to get components for multiple files.
properties:
fileIds:
type: array
description: A list with unique identifiers of the files for which components should be retrieved.
items:
type: string
description: The unique identifier of a file.
DossierStatusDefinition:
type: object
description: |
@ -2864,6 +2918,8 @@ components:
placeholder follows a specific format convention:
`{{dossier.attribute.<name>}}` while the name is transformed into 'PascalCase' and does not contain
whitespaces. The placeholder is unique in a dossier template.
displaySettings:
$ref: '#/components/schemas/DossierAttributeDisplaySettings'
required:
- name
- type
@ -2872,6 +2928,35 @@ components:
name: "Document Summary"
type: "TEXT"
reportingPlaceholder: "{{dossier.attribute.DocumentSummary}}"
displaySettings:
editable: true
filterable: false
displayedInDossierList: false
DossierAttributeDisplaySettings:
type: object
description: |
Display settings for the user interface. These settings control how the UI handles and presents the dossier attributes.
properties:
editable:
type: boolean
description: |
If `true`, the user interface allows manual editing of the value. Otherwise the value can only be set by import or by rules.
filterable:
type: boolean
description: |
If `true`, the user interface adds a filter option to the dossier list.
displayedInDossierList:
type: boolean
description: |
If `true`, the user interface shows the value in the dossier list.
required:
- editable
- filterable
- displayedInDossierList
example:
editable: true
filterable: true
displayedInDossierList: false
FileAttributeDefinition:
type: object
description: |
@ -2994,10 +3079,18 @@ components:
name: "Dossier Summary"
type: "TEXT"
reportingPlaceholder: "{{dossier.attribute.DossierSummary}}"
displaySettings:
editable: true
filterable: false
displayedInFileList: false
- id: "23e45678-e90b-12d3-a456-765114174321"
name: "Comment"
type: "TEXT"
reportingPlaceholder: "{{dossier.attribute.Comment}}"
displaySettings:
editable: true
filterable: false
displayedInFileList: false
FileAttributeDefinitionList:
type: object
description: A list of file attribute definitions.

View File

@ -1492,6 +1492,8 @@ components:
placeholder follows a specific format convention:
`{{dossier.attribute.<name>}}` while the name is transformed into 'PascalCase' and does not contain
whitespaces. The placeholder is unique in a dossier template.
displaySettings:
$ref: '#/components/schemas/DossierAttributeDisplaySettings'
required:
- name
- type
@ -1500,6 +1502,35 @@ components:
name: "Document Summary"
type: "TEXT"
reportingPlaceholder: "{{dossier.attribute.DocumentSummary}}"
displaySettings:
editable: true
filterable: false
displayedInDossierList: false
DossierAttributeDisplaySettings:
type: object
description: |
Display settings for the user interface. These settings control how the UI handles and presents the dossier attributes.
properties:
editable:
type: boolean
description: |
If `true`, the user interface allows manual editing of the value. Otherwise the value can only be set by import or by rules.
filterable:
type: boolean
description: |
If `true`, the user interface adds a filter option to the dossier list.
displayedInDossierList:
type: boolean
description: |
If `true`, the user interface shows the value in the dossier list.
required:
- editable
- filterable
- displayedInDossierList
example:
editable: true
filterable: true
displayedInDossierList: false
FileAttributeDefinition:
type: object
description: |
@ -1622,10 +1653,18 @@ components:
name: "Dossier Summary"
type: "TEXT"
reportingPlaceholder: "{{dossier.attribute.DossierSummary}}"
displaySettings:
editable: true
filterable: false
displayedInFileList: false
- id: "23e45678-e90b-12d3-a456-765114174321"
name: "Comment"
type: "TEXT"
reportingPlaceholder: "{{dossier.attribute.Comment}}"
displaySettings:
editable: true
filterable: false
displayedInFileList: false
FileAttributeDefinitionList:
type: object
description: A list of file attribute definitions.

View File

@ -10,6 +10,7 @@ import org.springframework.web.bind.annotation.RestController;
import com.iqser.red.service.persistence.management.v1.processor.entity.dossier.DossierAttributeConfigEntity;
import com.iqser.red.service.persistence.management.v1.processor.service.persistence.DossierAttributeConfigPersistenceService;
import com.iqser.red.service.persistence.management.v1.processor.utils.DossierAttributeConfigMapper;
import com.iqser.red.service.persistence.service.v1.api.internal.resources.DossierAttributesConfigResource;
import com.iqser.red.service.persistence.service.v1.api.shared.model.dossiertemplate.DossierAttributeConfig;
@ -27,7 +28,7 @@ public class DossierAttributesConfigInternalController implements DossierAttribu
@Override
public List<DossierAttributeConfig> getDossierAttributes(@PathVariable(DOSSIER_TEMPLATE_ID) String dossierTemplateId) {
return convert(dossierAttributeConfigPersistenceService.getDossierAttributes(dossierTemplateId), DossierAttributeConfig.class);
return convert(dossierAttributeConfigPersistenceService.getDossierAttributes(dossierTemplateId), DossierAttributeConfig.class, new DossierAttributeConfigMapper());
}
}

View File

@ -8,6 +8,7 @@ import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;
import com.iqser.red.service.persistence.management.v1.processor.service.persistence.FileAttributeConfigPersistenceService;
import com.iqser.red.service.persistence.management.v1.processor.utils.FileAttributeConfigMapper;
import com.iqser.red.service.persistence.service.v1.api.internal.resources.FileAttributesConfigResource;
import com.iqser.red.service.persistence.service.v1.api.shared.model.dossiertemplate.dossier.file.FileAttributeConfig;
@ -25,7 +26,7 @@ public class FileAttributesConfigInternalController implements FileAttributesCon
@Override
public List<FileAttributeConfig> getFileAttributeConfigs(@PathVariable(DOSSIER_TEMPLATE_ID) String dossierTemplateId) {
return convert(fileAttributeConfigPersistenceService.getFileAttributes(dossierTemplateId), FileAttributeConfig.class);
return convert(fileAttributeConfigPersistenceService.getFileAttributes(dossierTemplateId), FileAttributeConfig.class, new FileAttributeConfigMapper());
}
}

View File

@ -57,7 +57,7 @@ public class AdminInterfaceController {
fileStatusService.validateFileIsNotDeletedAndNotApproved(fileId);
fileStatusService.setStatusOcrQueued(dossierId, fileId);
fileStatusService.setStatusOcrQueued(dossierId, fileId, false);
}
@ -91,7 +91,7 @@ public class AdminInterfaceController {
if (!dryRun) {
fileStatusService.validateFileIsNotDeletedAndNotApproved(file.getId());
fileStatusService.setStatusOcrQueued(file.getDossierId(), file.getId());
fileStatusService.setStatusOcrQueued(file.getDossierId(), file.getId(), false);
}
}

View File

@ -14,6 +14,7 @@ configurations {
}
}
dependencies {
api(project(":persistence-service-shared-api-v1"))
api(project(":persistence-service-shared-mongo-v1"))
api(project(":persistence-service-external-api-v1"))
@ -36,12 +37,12 @@ dependencies {
}
api("com.knecon.fforesight:azure-ocr-service-api:0.13.0")
implementation("com.knecon.fforesight:llm-service-api:1.20.0-RED10072.2")
api("com.knecon.fforesight:jobs-commons:0.10.0")
api("com.knecon.fforesight:jobs-commons:0.13.0")
api("com.iqser.red.commons:storage-commons:2.50.0")
api("com.knecon.fforesight:tenant-commons:0.31.0-RED10196.0") {
exclude(group = "com.iqser.red.commons", module = "storage-commons")
}
api("com.knecon.fforesight:database-tenant-commons:0.28.0-RED10196.0") {
api("com.knecon.fforesight:database-tenant-commons:0.31.0") {
exclude(group = "com.knecon.fforesight", module = "tenant-commons")
}
api("com.knecon.fforesight:keycloak-commons:0.30.0") {
@ -70,7 +71,6 @@ dependencies {
api("commons-validator:commons-validator:1.7")
api("com.opencsv:opencsv:5.9")
implementation("com.google.protobuf:protobuf-java:4.27.1")
implementation("org.mapstruct:mapstruct:1.6.2")
annotationProcessor("org.mapstruct:mapstruct-processor:1.6.2")

View File

@ -53,6 +53,8 @@ import com.iqser.red.service.persistence.management.v1.processor.service.persist
import com.iqser.red.service.persistence.management.v1.processor.service.persistence.LegalBasisMappingPersistenceService;
import com.iqser.red.service.persistence.management.v1.processor.service.persistence.ReportTemplatePersistenceService;
import com.iqser.red.service.persistence.management.v1.processor.service.persistence.RulesPersistenceService;
import com.iqser.red.service.persistence.management.v1.processor.utils.DossierAttributeConfigMapper;
import com.iqser.red.service.persistence.management.v1.processor.utils.FileAttributeConfigMapper;
import com.iqser.red.service.persistence.management.v1.processor.utils.FileSystemBackedArchiver;
import com.iqser.red.service.persistence.management.v1.processor.utils.StorageIdUtils;
import com.iqser.red.service.persistence.service.v1.api.shared.model.RuleFileType;
@ -183,13 +185,16 @@ public class DossierTemplateExportService {
fileSystemBackedArchiver.addEntries(new FileSystemBackedArchiver.ArchiveModel(folder,
getFilename(ExportFilename.DOSSIER_ATTRIBUTES_CONFIG, JSON_EXT),
objectMapper.writeValueAsBytes(convert(dossierAttributesConfig,
DossierAttributeConfig.class))));
DossierAttributeConfig.class,
new DossierAttributeConfigMapper()))));
// add file attribute configs
List<FileAttributeConfigEntity> fileAttributeConfigList = fileAttributeConfigPersistenceService.getFileAttributes(dossierTemplate.getId());
fileSystemBackedArchiver.addEntries(new FileSystemBackedArchiver.ArchiveModel(folder,
getFilename(ExportFilename.FILE_ATTRIBUTE_CONFIG, JSON_EXT),
objectMapper.writeValueAsBytes(convert(fileAttributeConfigList, FileAttributeConfig.class))));
objectMapper.writeValueAsBytes(convert(fileAttributeConfigList,
FileAttributeConfig.class,
new FileAttributeConfigMapper()))));
// add legal basis mapping
List<LegalBasisEntity> legalBasisMappingList = legalBasisMappingPersistenceService.getLegalBasisMapping(dossierTemplate.getId());

View File

@ -0,0 +1,39 @@
package com.iqser.red.service.persistence.management.v1.processor.entity.configuration;
import java.time.OffsetDateTime;
import jakarta.persistence.Column;
import jakarta.persistence.Entity;
import jakarta.persistence.Id;
import jakarta.persistence.IdClass;
import jakarta.persistence.Table;
import jakarta.validation.constraints.NotNull;
import lombok.AccessLevel;
import lombok.AllArgsConstructor;
import lombok.Builder;
import lombok.Data;
import lombok.NoArgsConstructor;
import lombok.experimental.FieldDefaults;
@Data
@Entity
@Builder
@AllArgsConstructor
@NoArgsConstructor
@IdClass(AppVersionEntityKey.class)
@Table(name = "app_version_history")
@FieldDefaults(level = AccessLevel.PRIVATE)
public class AppVersionEntity {
@Id
@Column
@NotNull String appVersion;
@Id
@Column
@NotNull String layoutParserVersion;
@Column
@NotNull OffsetDateTime date;
}

View File

@ -0,0 +1,27 @@
package com.iqser.red.service.persistence.management.v1.processor.entity.configuration;
import java.io.Serializable;
import jakarta.persistence.Column;
import jakarta.validation.constraints.NotNull;
import lombok.AccessLevel;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;
import lombok.experimental.FieldDefaults;
@Data
@NoArgsConstructor
@AllArgsConstructor
@FieldDefaults(level = AccessLevel.PRIVATE)
public class AppVersionEntityKey implements Serializable {
@Column
@NotNull String appVersion;
@Column
@NotNull String layoutParserVersion;
}

View File

@ -30,6 +30,10 @@ public class DossierAttributeConfigEntity {
@Column
private boolean editable;
@Column
private boolean filterable;
@Column
private boolean displayedInDossierList;
@Column
private String placeholder;
@Column

View File

@ -41,6 +41,8 @@ public class FileAttributeConfigEntity {
@Column
private String placeholder;
@Column
private boolean includeInCsvExport;
@Column
@Enumerated(EnumType.STRING)
@Builder.Default
private FileAttributeType type = FileAttributeType.TEXT;

View File

@ -9,6 +9,7 @@ import java.util.Set;
import org.hibernate.annotations.Fetch;
import org.hibernate.annotations.FetchMode;
import com.iqser.red.service.persistence.management.v1.processor.entity.download.DownloadStatusEntity;
import com.iqser.red.service.persistence.management.v1.processor.utils.JSONIntegerSetConverter;
import com.iqser.red.service.persistence.service.v1.api.shared.model.dossiertemplate.dossier.file.ErrorCode;
import com.iqser.red.service.persistence.service.v1.api.shared.model.dossiertemplate.dossier.file.ProcessingStatus;
@ -22,8 +23,10 @@ import jakarta.persistence.Entity;
import jakarta.persistence.EnumType;
import jakarta.persistence.Enumerated;
import jakarta.persistence.FetchType;
import jakarta.persistence.ForeignKey;
import jakarta.persistence.Id;
import jakarta.persistence.JoinColumn;
import jakarta.persistence.ManyToOne;
import jakarta.persistence.OneToMany;
import jakarta.persistence.Table;
import lombok.AllArgsConstructor;
@ -74,6 +77,9 @@ public class FileEntity {
@Column
private OffsetDateTime lastLayoutProcessed;
@Column
private String layoutParserVersion;
@Column
private OffsetDateTime lastIndexed;
@ -218,6 +224,11 @@ public class FileEntity {
@Column
private boolean protobufMigrationDone;
@ManyToOne(fetch = FetchType.EAGER)
@JoinColumn(name = "last_download", referencedColumnName = "storage_id", foreignKey = @ForeignKey(name = "fk_file_last_download"))
private DownloadStatusEntity lastDownload;
public OffsetDateTime getLastOCRTime() {
return this.ocrStartTime;

View File

@ -43,6 +43,7 @@ import lombok.experimental.FieldDefaults;
public class DownloadStatusEntity {
@Id
@Column(name = "storage_id")
String storageId;
@Column
String uuid;

View File

@ -1,35 +0,0 @@
package com.iqser.red.service.persistence.management.v1.processor.entity.migration;
import com.iqser.red.service.persistence.service.v1.api.shared.model.dossiertemplate.dossier.file.SaasMigrationStatus;
import jakarta.persistence.*;
import lombok.AllArgsConstructor;
import lombok.Builder;
import lombok.Data;
import lombok.NoArgsConstructor;
@Data
@Builder
@AllArgsConstructor
@NoArgsConstructor
@Entity
@Table(name = "saas_migration_status")
public class SaasMigrationStatusEntity {
@Id
private String fileId;
@Column
private String dossierId;
@Column
@Enumerated(EnumType.STRING)
private SaasMigrationStatus status;
@Column
private Integer processingErrorCounter;
@Column
private String errorCause;
}

View File

@ -0,0 +1,61 @@
package com.iqser.red.service.persistence.management.v1.processor.lifecycle;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.boot.context.event.ApplicationReadyEvent;
import org.springframework.context.event.EventListener;
import org.springframework.stereotype.Component;
import com.iqser.red.service.persistence.management.v1.processor.service.persistence.AppVersionPersistenceService;
import com.iqser.red.service.persistence.management.v1.processor.utils.TenantUtils;
import com.knecon.fforesight.tenantcommons.TenantContext;
import com.knecon.fforesight.tenantcommons.TenantProvider;
import jakarta.validation.ConstraintViolationException;
import lombok.RequiredArgsConstructor;
import lombok.extern.slf4j.Slf4j;
@Slf4j
@Component
@RequiredArgsConstructor
public class AppVersionTracker {
@Value("${BACKEND_APP_VERSION:}")
private String appVersion;
@Value("${LAYOUT_PARSER_VERSION:}")
private String layoutParserVersion;
private final AppVersionPersistenceService appVersionPersistenceService;
private final TenantProvider tenantProvider;
@EventListener(ApplicationReadyEvent.class)
public void trackAppVersion() {
tenantProvider.getTenants()
.forEach(tenant -> {
if (!TenantUtils.isTenantReadyForPersistence(tenant)) {
return;
}
if (appVersion.isBlank() || layoutParserVersion.isBlank()) {
log.info("No app version or layout parser version was provided. Version tracking skipped.");
return;
}
TenantContext.setTenantId(tenant.getTenantId());
try {
if (appVersionPersistenceService.insertIfNotExists(appVersion, layoutParserVersion)) {
log.info("Started with new app version {} / layout parser version {}.", appVersion, layoutParserVersion);
}
} catch (ConstraintViolationException cve) {
log.error("Validation failed for app version {} / layout parser version {}.", appVersion, layoutParserVersion, cve);
} catch (Exception e) {
log.error("Failed to track app version {} / layout parser version {}.", appVersion, layoutParserVersion, e);
}
});
}
}

View File
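The `AppVersionTracker` above depends on `insertIfNotExists` being idempotent per (appVersion, layoutParserVersion) pair, so that a restart with unchanged versions records nothing new. A sketch of those semantics, with an in-memory set standing in for the JPA-backed `app_version_history` table (the version strings are made up for illustration):

```java
import java.util.HashSet;
import java.util.Set;

// Sketch of the insert-if-not-exists semantics AppVersionTracker relies on:
// the (appVersion, layoutParserVersion) pair acts as a composite key, so
// inserting the same pair twice is a no-op. The set models the
// app_version_history table; the real service is backed by JPA.
public class AppVersionHistory {

    private final Set<String> history = new HashSet<>();

    // Returns true only the first time a version pair is seen,
    // mirroring AppVersionPersistenceService.insertIfNotExists.
    public boolean insertIfNotExists(String appVersion, String layoutParserVersion) {
        return history.add(appVersion + "|" + layoutParserVersion);
    }

    public static void main(String[] args) {
        AppVersionHistory h = new AppVersionHistory();
        System.out.println(h.insertIfNotExists("2.5.0", "1.4.2")); // true: new pair, would be logged
        System.out.println(h.insertIfNotExists("2.5.0", "1.4.2")); // false: restart with same versions
        System.out.println(h.insertIfNotExists("2.6.0", "1.4.2")); // true: app version changed
    }
}
```

Delimiting the key with a character that cannot appear in a version string avoids ambiguous concatenations; the real entity avoids the issue entirely by using a two-column `@IdClass` key.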

@ -1,170 +0,0 @@
package com.iqser.red.service.persistence.management.v1.processor.migration;
import com.iqser.red.service.persistence.management.v1.processor.entity.annotations.*;
import com.iqser.red.service.persistence.management.v1.processor.service.persistence.repository.CommentRepository;
import com.iqser.red.service.persistence.management.v1.processor.service.persistence.repository.FileRepository;
import com.iqser.red.service.persistence.management.v1.processor.service.persistence.repository.annotationentity.*;
import com.knecon.fforesight.databasetenantcommons.providers.utils.MagicConverter;
import jakarta.transaction.Transactional;
import lombok.AccessLevel;
import lombok.RequiredArgsConstructor;
import lombok.experimental.FieldDefaults;
import lombok.extern.slf4j.Slf4j;
import org.springframework.stereotype.Service;
@Slf4j
@Service
@Transactional
@RequiredArgsConstructor
@FieldDefaults(makeFinal = true, level = AccessLevel.PRIVATE)
public class SaasAnnotationIdMigrationService {
ManualRedactionRepository manualRedactionRepository;
RemoveRedactionRepository removeRedactionRepository;
ForceRedactionRepository forceRedactionRepository;
ResizeRedactionRepository resizeRedactionRepository;
RecategorizationRepository recategorizationRepository;
LegalBasisChangeRepository legalBasisChangeRepository;
CommentRepository commentRepository;
FileRepository fileRepository;
public int updateManualAddRedaction(AnnotationEntityId oldAnnotationEntityId, AnnotationEntityId newAnnotationEntityId) {
if (oldAnnotationEntityId.equals(newAnnotationEntityId)) {
return 0;
}
var oldEntry = manualRedactionRepository.findById(oldAnnotationEntityId);
if (oldEntry.isPresent()) {
var newEntry = MagicConverter.convert(oldEntry.get(), ManualRedactionEntryEntity.class);
newEntry.setPositions(MagicConverter.convert(oldEntry.get().getPositions(), RectangleEntity.class));
newEntry.setFileStatus(fileRepository.findById(oldAnnotationEntityId.getFileId())
.get());
newEntry.setId(newAnnotationEntityId);
manualRedactionRepository.deleteById(oldAnnotationEntityId);
manualRedactionRepository.save(newEntry);
return 1;
}
return 0;
}
public int updateRemoveRedaction(AnnotationEntityId oldAnnotationEntityId, AnnotationEntityId newAnnotationEntityId) {
if (oldAnnotationEntityId.equals(newAnnotationEntityId)) {
return 0;
}
var oldEntry = removeRedactionRepository.findById(oldAnnotationEntityId);
if (oldEntry.isPresent()) {
var newEntry = MagicConverter.convert(oldEntry.get(), IdRemovalEntity.class);
newEntry.setFileStatus(fileRepository.findById(oldAnnotationEntityId.getFileId())
.get());
newEntry.setId(newAnnotationEntityId);
removeRedactionRepository.deleteById(oldAnnotationEntityId);
removeRedactionRepository.save(newEntry);
return 1;
}
return 0;
}
public int updateForceRedaction(AnnotationEntityId oldAnnotationEntityId, AnnotationEntityId newAnnotationEntityId) {
if (oldAnnotationEntityId.equals(newAnnotationEntityId)) {
return 0;
}
var oldEntry = forceRedactionRepository.findById(oldAnnotationEntityId);
if (oldEntry.isPresent()) {
var newEntry = MagicConverter.convert(oldEntry.get(), ManualForceRedactionEntity.class);
newEntry.setFileStatus(fileRepository.findById(oldAnnotationEntityId.getFileId())
.get());
newEntry.setId(newAnnotationEntityId);
forceRedactionRepository.deleteById(oldAnnotationEntityId);
forceRedactionRepository.save(newEntry);
return 1;
}
return 0;
}
public int updateResizeRedaction(AnnotationEntityId oldAnnotationEntityId, AnnotationEntityId newAnnotationEntityId) {
if (oldAnnotationEntityId.equals(newAnnotationEntityId)) {
return 0;
}
var oldEntry = resizeRedactionRepository.findById(oldAnnotationEntityId);
if (oldEntry.isPresent()) {
var newEntry = MagicConverter.convert(oldEntry.get(), ManualResizeRedactionEntity.class);
newEntry.setId(newAnnotationEntityId);
newEntry.setPositions(MagicConverter.convert(oldEntry.get().getPositions(), RectangleEntity.class));
newEntry.setFileStatus(fileRepository.findById(oldAnnotationEntityId.getFileId())
.get());
resizeRedactionRepository.deleteById(oldAnnotationEntityId);
resizeRedactionRepository.save(newEntry);
return 1;
}
return 0;
}
public int updateRecategorizationRedaction(AnnotationEntityId oldAnnotationEntityId, AnnotationEntityId newAnnotationEntityId) {
if (oldAnnotationEntityId.equals(newAnnotationEntityId)) {
return 0;
}
var oldEntry = recategorizationRepository.findById(oldAnnotationEntityId);
if (oldEntry.isPresent()) {
var newEntry = MagicConverter.convert(oldEntry.get(), ManualRecategorizationEntity.class);
newEntry.setId(newAnnotationEntityId);
newEntry.setFileStatus(fileRepository.findById(oldAnnotationEntityId.getFileId())
.get());
recategorizationRepository.deleteById(oldAnnotationEntityId);
recategorizationRepository.save(newEntry);
return 1;
}
return 0;
}
public int updateLegalBasisChangeRedaction(AnnotationEntityId oldAnnotationEntityId, AnnotationEntityId newAnnotationEntityId) {
if (oldAnnotationEntityId.equals(newAnnotationEntityId)) {
return 0;
}
var oldEntry = legalBasisChangeRepository.findById(oldAnnotationEntityId);
if (oldEntry.isPresent()) {
var newEntry = MagicConverter.convert(oldEntry.get(), ManualLegalBasisChangeEntity.class);
newEntry.setId(newAnnotationEntityId);
newEntry.setFileStatus(fileRepository.findById(oldAnnotationEntityId.getFileId())
.get());
legalBasisChangeRepository.deleteById(oldAnnotationEntityId);
legalBasisChangeRepository.save(newEntry);
return 1;
}
return 0;
}
public int updateCommentIds(String fileId, String key, String value) {
if (key.equals(value)) {
return 0;
}
return commentRepository.saasMigrationUpdateAnnotationIds(fileId, key, value);
}
}

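Each `update…Redaction` method above repeats the same move-by-copy pattern, because the annotation id is the primary key and cannot be updated in place. A dependency-free sketch of that pattern (the `Map` and all names below are illustrative stand-ins for the Spring Data repositories, not the project's API):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

public class MoveByCopy {

    // Since the annotation id is the primary key, an "update" of the id is
    // really: find -> copy under the new id -> delete old -> save new,
    // mirroring the repository calls above. The Map stands in for a
    // Spring Data repository; this is an illustration, not the project's API.
    static <T> int moveId(Map<String, T> repo, String oldId, String newId) {
        if (oldId.equals(newId)) {
            return 0; // same id, nothing to migrate (matches the early return above)
        }
        Optional<T> oldEntry = Optional.ofNullable(repo.get(oldId)); // findById
        if (oldEntry.isPresent()) {
            repo.remove(oldId);              // deleteById(oldAnnotationEntityId)
            repo.put(newId, oldEntry.get()); // save(newEntry)
            return 1;                        // one row migrated
        }
        return 0; // id not present in this table
    }

    public static void main(String[] args) {
        Map<String, String> repo = new HashMap<>();
        repo.put("old-1", "entity");
        System.out.println(moveId(repo, "old-1", "new-1") + " " + repo.keySet());
    }
}
```

The returned 0/1 lets callers sum per-table counts into one total, which is how `updateAnnotationIds` accumulates `numUpdates`.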
View File

@@ -1,83 +0,0 @@
package com.iqser.red.service.persistence.management.v1.processor.migration;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.Collections;
import java.util.HashSet;
import org.springframework.stereotype.Service;
import com.google.common.hash.HashFunction;
import com.google.common.hash.Hashing;
import com.iqser.red.service.persistence.management.v1.processor.entity.annotations.AnnotationEntityId;
import com.iqser.red.service.persistence.management.v1.processor.entity.annotations.ManualRedactionEntryEntity;
import com.iqser.red.service.persistence.management.v1.processor.model.ManualChangesQueryOptions;
import com.iqser.red.service.persistence.management.v1.processor.service.persistence.annotations.AddRedactionPersistenceService;
import com.iqser.red.service.persistence.service.v1.api.shared.model.dossiertemplate.type.DictionaryEntryType;
import lombok.RequiredArgsConstructor;
@Service
@RequiredArgsConstructor
public class SaasMigrationManualChangesUpdateService {
private final AddRedactionPersistenceService addRedactionPersistenceService;
private final HashFunction hashFunction = Hashing.murmur3_128();
public void convertUnprocessedAddToDictionariesToLocalChanges(String fileId) {
var unprocessedManualAdds = addRedactionPersistenceService.findEntriesByFileIdAndOptions(fileId, ManualChangesQueryOptions.unprocessedOnly());
for (var unprocessedManualAdd : unprocessedManualAdds) {
if (!unprocessedManualAdd.getDictionaryEntryType().equals(DictionaryEntryType.ENTRY)) {
continue;
}
if (unprocessedManualAdd.isAddToDictionary() || unprocessedManualAdd.isAddToAllDossiers()) {
// copy pending dict change to a new one with a different id. Can't reuse the same one, as it's the primary key of the table.
// It has no functionality; it's only there so that the UI can show a pending change.
ManualRedactionEntryEntity pendingDictAdd = new ManualRedactionEntryEntity(buildSecondaryId(unprocessedManualAdd.getId(), fileId),
unprocessedManualAdd.getUser(),
unprocessedManualAdd.getTypeId(),
unprocessedManualAdd.getValue(),
unprocessedManualAdd.getReason(),
unprocessedManualAdd.getLegalBasis(),
unprocessedManualAdd.getSection(),
unprocessedManualAdd.isRectangle(),
unprocessedManualAdd.isAddToDictionary(),
unprocessedManualAdd.isAddToAllDossiers(),
unprocessedManualAdd.isAddToDossierDictionary(),
DictionaryEntryType.ENTRY,
unprocessedManualAdd.getRequestDate(),
null,
null,
new ArrayList<>(unprocessedManualAdd.getPositions()),
unprocessedManualAdd.getFileStatus(),
unprocessedManualAdd.getTextBefore(),
unprocessedManualAdd.getTextAfter(),
unprocessedManualAdd.getSourceId(),
new HashSet<>(unprocessedManualAdd.getTypeIdsOfModifiedDictionaries()));
addRedactionPersistenceService.update(pendingDictAdd);
// change existing dict add to unprocessed manual add. The ID must match the prior entry's, so that other unprocessed manual changes can still be applied to it.
unprocessedManualAdd.setAddToDictionary(false);
unprocessedManualAdd.setAddToAllDossiers(false);
unprocessedManualAdd.setLegalBasis("");
unprocessedManualAdd.setTypeIdsOfModifiedDictionaries(Collections.emptySet());
unprocessedManualAdd.setDictionaryEntryType(null);
addRedactionPersistenceService.update(unprocessedManualAdd);
}
}
}
private AnnotationEntityId buildSecondaryId(AnnotationEntityId annotationEntityId, String fileId) {
return new AnnotationEntityId(hashFunction.hashString(annotationEntityId.getAnnotationId(), StandardCharsets.UTF_8).toString(), fileId);
}
}

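`buildSecondaryId` above derives a second, deterministic primary key from the original annotation id, so the copied pending dictionary change can coexist with the reverted local change in the same table. The service uses Guava's `murmur3_128`; the sketch below substitutes a stdlib name-based UUID to show the same idea without the Guava dependency (class and method names are illustrative):

```java
import java.nio.charset.StandardCharsets;
import java.util.UUID;

public class SecondaryIdSketch {

    // Derive a secondary id by hashing the original annotation id.
    // Deterministic: re-running the migration yields the same derived id,
    // so it cannot pile up duplicate pending entries.
    static String secondaryId(String annotationId) {
        byte[] bytes = annotationId.getBytes(StandardCharsets.UTF_8);
        return UUID.nameUUIDFromBytes(bytes).toString();
    }

    public static void main(String[] args) {
        System.out.println(secondaryId("annotation-42"));
    }
}
```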
View File

@@ -1,396 +0,0 @@
package com.iqser.red.service.persistence.management.v1.processor.migration;
import static com.iqser.red.service.persistence.management.v1.processor.configuration.MessagingConfiguration.MIGRATION_REQUEST_QUEUE;
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.concurrent.atomic.AtomicInteger;
import org.springframework.amqp.rabbit.core.RabbitTemplate;
import org.springframework.stereotype.Service;
import com.iqser.red.service.persistence.management.v1.processor.entity.annotations.AnnotationEntityId;
import com.iqser.red.service.persistence.management.v1.processor.exception.InternalServerErrorException;
import com.iqser.red.service.persistence.management.v1.processor.exception.NotFoundException;
import com.iqser.red.service.persistence.management.v1.processor.model.ManualChangesQueryOptions;
import com.iqser.red.service.persistence.management.v1.processor.service.CommentService;
import com.iqser.red.service.persistence.management.v1.processor.service.DossierService;
import com.iqser.red.service.persistence.management.v1.processor.service.IndexingService;
import com.iqser.red.service.persistence.management.v1.processor.service.job.AutomaticAnalysisJob;
import com.iqser.red.service.persistence.management.v1.processor.service.layoutparsing.LayoutParsingRequestFactory;
import com.iqser.red.service.persistence.management.v1.processor.service.manualredactions.ManualRedactionProviderService;
import com.iqser.red.service.persistence.management.v1.processor.service.manualredactions.ManualRedactionService;
import com.iqser.red.service.persistence.management.v1.processor.service.persistence.FileStatusPersistenceService;
import com.iqser.red.service.persistence.management.v1.processor.service.persistence.SaasMigrationStatusPersistenceService;
import com.iqser.red.service.persistence.management.v1.processor.settings.FileManagementServiceSettings;
import com.iqser.red.service.persistence.management.v1.processor.utils.StorageIdUtils;
import com.iqser.red.service.persistence.service.v1.api.shared.model.analysislog.migration.MigratedIds;
import com.iqser.red.service.persistence.service.v1.api.shared.model.annotations.entitymapped.ManualRedactionEntry;
import com.iqser.red.service.persistence.service.v1.api.shared.model.dossiertemplate.dossier.file.FileType;
import com.iqser.red.service.persistence.service.v1.api.shared.model.dossiertemplate.dossier.file.SaasMigrationStatus;
import com.iqser.red.service.persistence.service.v1.api.shared.model.dossiertemplate.dossier.file.WorkflowStatus;
import com.iqser.red.service.redaction.v1.model.MigrationRequest;
import com.iqser.red.storage.commons.exception.StorageException;
import com.iqser.red.storage.commons.exception.StorageObjectDoesNotExist;
import com.iqser.red.storage.commons.service.StorageService;
import com.knecon.fforesight.databasetenantcommons.providers.TenantSyncService;
import com.knecon.fforesight.service.layoutparser.internal.api.queue.LayoutParsingQueueNames;
import com.knecon.fforesight.tenantcommons.TenantContext;
import com.knecon.fforesight.tenantcommons.TenantProvider;
import com.knecon.fforesight.tenantcommons.model.TenantSyncEvent;
import com.knecon.fforesight.tenantcommons.model.UpdateDetailsRequest;
import lombok.AccessLevel;
import lombok.RequiredArgsConstructor;
import lombok.experimental.FieldDefaults;
import lombok.extern.slf4j.Slf4j;
@Slf4j
@Service
@RequiredArgsConstructor
@FieldDefaults(makeFinal = true, level = AccessLevel.PRIVATE)
public class SaasMigrationService implements TenantSyncService {
AutomaticAnalysisJob automaticAnalysisJob;
FileStatusPersistenceService fileStatusPersistenceService;
SaasMigrationStatusPersistenceService saasMigrationStatusPersistenceService;
DossierService dossierService;
ManualRedactionProviderService manualRedactionProviderService;
TenantProvider tenantProvider;
IndexingService indexingService;
LayoutParsingRequestFactory layoutParsingRequestFactory;
RabbitTemplate rabbitTemplate;
FileManagementServiceSettings settings;
StorageService storageService;
SaasAnnotationIdMigrationService saasAnnotationIdMigrationService;
UncompressedFilesMigrationService uncompressedFilesMigrationService;
ManualRedactionService manualRedactionService;
CommentService commentService;
RankDeDuplicationService rankDeDuplicationService;
SaasMigrationManualChangesUpdateService saasMigrationManualChangesUpdateService;
@Override
public synchronized void syncTenant(TenantSyncEvent tenantSyncEvent) {
startMigrationForTenant(tenantSyncEvent.getTenantId());
}
// Persistence-Service needs to be scaled to 1.
public void startMigrationForTenant(String tenantId) {
// TODO migrate rules.
automaticAnalysisJob.stopForTenant(tenantId);
log.info("Starting uncompressed files migration ...");
uncompressedFilesMigrationService.migrateUncompressedFiles(tenantId);
log.info("Finished uncompressed files migration");
rankDeDuplicationService.deduplicate();
int numberOfFiles = 0;
var files = saasMigrationStatusPersistenceService.findAll();
for (var file : files) {
var dossier = dossierService.getDossierById(file.getDossierId());
if (dossier.getHardDeletedTime() != null) {
if (fileStatusPersistenceService.getStatus(file.getFileId()).getHardDeletedTime() != null) {
saasMigrationStatusPersistenceService.updateStatus(file.getFileId(), SaasMigrationStatus.FINISHED);
continue;
} else {
fileStatusPersistenceService.hardDelete(file.getFileId(), dossier.getHardDeletedTime());
saasMigrationStatusPersistenceService.updateStatus(file.getFileId(), SaasMigrationStatus.FINISHED);
continue;
}
}
if (fileStatusPersistenceService.getStatus(file.getFileId()).getHardDeletedTime() != null) {
saasMigrationStatusPersistenceService.updateStatus(file.getFileId(), SaasMigrationStatus.FINISHED);
continue;
}
if (!file.getStatus().equals(SaasMigrationStatus.MIGRATION_REQUIRED)) {
log.info("Skipping {} for tenant {} since migration status is {}", file.getFileId(), TenantContext.getTenantId(), file.getStatus());
continue;
}
// delete NER_ENTITIES since offsets depend on old document structure.
storageService.deleteObject(TenantContext.getTenantId(), StorageIdUtils.getStorageId(file.getDossierId(), file.getFileId(), FileType.NER_ENTITIES));
var layoutParsingRequest = layoutParsingRequestFactory.build(dossier.getDossierTemplate().getId(), file.getDossierId(), file.getFileId(), false);
rabbitTemplate.convertAndSend(LayoutParsingQueueNames.LAYOUT_PARSING_REQUEST_EXCHANGE, TenantContext.getTenantId(), layoutParsingRequest);
numberOfFiles++;
}
log.info("Added {} documents for tenant {} to Layout-Parsing queue for saas migration", numberOfFiles, TenantContext.getTenantId());
if (numberOfFiles == 0) {
finalizeMigration();
}
}
public void startMigrationForFile(String dossierId, String fileId) {
var dossier = dossierService.getDossierById(dossierId);
if (dossier.getHardDeletedTime() != null) {
if (fileStatusPersistenceService.getStatus(fileId).getHardDeletedTime() != null) {
saasMigrationStatusPersistenceService.updateStatus(fileId, SaasMigrationStatus.FINISHED);
return;
} else {
fileStatusPersistenceService.hardDelete(fileId, dossier.getHardDeletedTime());
saasMigrationStatusPersistenceService.updateStatus(fileId, SaasMigrationStatus.FINISHED);
return;
}
}
if (fileStatusPersistenceService.getStatus(fileId).getHardDeletedTime() != null) {
saasMigrationStatusPersistenceService.updateStatus(fileId, SaasMigrationStatus.FINISHED);
return;
}
log.info("Starting Migration for dossierId {} and fileId {}", dossierId, fileId);
saasMigrationStatusPersistenceService.createMigrationRequiredStatus(dossierId, fileId);
var layoutParsingRequest = layoutParsingRequestFactory.build(dossier.getDossierTemplate().getId(), dossierId, fileId, false);
rabbitTemplate.convertAndSend(LayoutParsingQueueNames.LAYOUT_PARSING_REQUEST_EXCHANGE, TenantContext.getTenantId(), layoutParsingRequest);
}
public void handleLayoutParsingFinished(String dossierId, String fileId) {
if (!layoutParsingFilesExist(dossierId, fileId)) {
saasMigrationStatusPersistenceService.updateErrorStatus(fileId, "Layout parsing files not written!");
return;
}
log.info("Layout Parsing finished for saas migration for tenant {} dossier {} and file {}", TenantContext.getTenantId(), dossierId, fileId);
saasMigrationStatusPersistenceService.updateStatus(fileId, SaasMigrationStatus.DOCUMENT_FILES_MIGRATED);
if (fileStatusPersistenceService.getStatus(fileId).getWorkflowStatus().equals(WorkflowStatus.APPROVED)) {
saasMigrationManualChangesUpdateService.convertUnprocessedAddToDictionariesToLocalChanges(fileId);
}
try {
indexingService.reindex(dossierId, Set.of(fileId), false);
String dossierTemplateId = dossierService.getDossierById(dossierId).getDossierTemplateId();
rabbitTemplate.convertAndSend(MIGRATION_REQUEST_QUEUE,
MigrationRequest.builder()
.dossierTemplateId(dossierTemplateId)
.dossierId(dossierId)
.fileId(fileId)
.fileIsApproved(fileStatusPersistenceService.getStatus(fileId).getWorkflowStatus().equals(WorkflowStatus.APPROVED))
.manualRedactions(manualRedactionProviderService.getManualRedactions(fileId, ManualChangesQueryOptions.allWithoutDeleted()))
.entitiesWithComments(commentService.getCommentCounts(fileId).keySet())
.build());
} catch (Exception e) {
log.error("Queuing of entityLog migration failed with {}", e.getMessage());
saasMigrationStatusPersistenceService.updateErrorStatus(fileId, String.format("Queuing of entityLog migration failed with %s", e.getMessage()));
}
}
private boolean layoutParsingFilesExist(String dossierId, String fileId) {
return storageService.objectExists(TenantContext.getTenantId(), StorageIdUtils.getStorageId(dossierId, fileId, FileType.DOCUMENT_STRUCTURE)) //
&& storageService.objectExists(TenantContext.getTenantId(), StorageIdUtils.getStorageId(dossierId, fileId, FileType.DOCUMENT_TEXT)) //
&& storageService.objectExists(TenantContext.getTenantId(), StorageIdUtils.getStorageId(dossierId, fileId, FileType.DOCUMENT_PAGES)) //
&& storageService.objectExists(TenantContext.getTenantId(), StorageIdUtils.getStorageId(dossierId, fileId, FileType.DOCUMENT_POSITION));
}
public void handleEntityLogMigrationFinished(String dossierId, String fileId) {
if (!entityLogMigrationFilesExist(dossierId, fileId)) {
saasMigrationStatusPersistenceService.updateErrorStatus(fileId, "Migration Files not written!");
return;
}
saasMigrationStatusPersistenceService.updateStatus(fileId, SaasMigrationStatus.REDACTION_LOGS_MIGRATED);
log.info("EntityLog migration finished for saas migration for tenant {} dossier {} and file {}", TenantContext.getTenantId(), dossierId, fileId);
migrateAnnotationIdsAndAddManualAddRedactionsAndDeleteSectionGrid(dossierId, fileId);
}
private boolean entityLogMigrationFilesExist(String dossierId, String fileId) {
return storageService.objectExists(TenantContext.getTenantId(), StorageIdUtils.getStorageId(dossierId, fileId, FileType.ENTITY_LOG)) && storageService.objectExists(
TenantContext.getTenantId(),
StorageIdUtils.getStorageId(dossierId, fileId, FileType.MIGRATED_IDS));
}
public void handleError(String dossierId, String fileId, String errorCause, String retryExchange) {
var migrationEntry = saasMigrationStatusPersistenceService.findById(fileId);
Integer numErrors = migrationEntry.getProcessingErrorCounter();
if (numErrors != null && numErrors <= settings.getMaxErrorRetries()) {
saasMigrationStatusPersistenceService.updateErrorCounter(fileId, numErrors + 1, errorCause);
rabbitTemplate.convertAndSend(retryExchange, TenantContext.getTenantId(), MigrationRequest.builder().dossierId(dossierId).fileId(fileId).build());
log.error("Retrying error during saas migration for tenant {} dossier {} and file {}, cause {}", TenantContext.getTenantId(), dossierId, fileId, errorCause);
} else {
saasMigrationStatusPersistenceService.updateErrorStatus(fileId, errorCause);
log.error("Error during saas migration for tenant {} dossier {} and file {}, cause {}", TenantContext.getTenantId(), dossierId, fileId, errorCause);
}
}
public void requeueErrorFiles() {
automaticAnalysisJob.stopForTenant(TenantContext.getTenantId());
saasMigrationStatusPersistenceService.findAllByStatus(SaasMigrationStatus.ERROR)
.forEach(migrationStatus -> startMigrationForFile(migrationStatus.getDossierId(), migrationStatus.getFileId()));
}
private void migrateAnnotationIdsAndAddManualAddRedactionsAndDeleteSectionGrid(String dossierId, String fileId) {
MigratedIds migratedIds = getMigratedIds(dossierId, fileId);
Map<String, String> oldToNewMapping = migratedIds.buildOldToNewMapping();
updateAnnotationIds(dossierId, fileId, oldToNewMapping);
List<String> forceRedactionIdsToDelete = migratedIds.getForceRedactionIdsToDelete();
softDeleteForceRedactions(fileId, forceRedactionIdsToDelete);
log.info("Soft-deleted force redactions.");
List<ManualRedactionEntry> manualRedactionEntriesToAdd = migratedIds.getManualRedactionEntriesToAdd();
int count = addManualRedactionEntries(manualRedactionEntriesToAdd);
log.info("Added {} additional manual entries.", count);
deleteSectionGridAndNerEntitiesFiles(dossierId, fileId);
saasMigrationStatusPersistenceService.updateStatus(fileId, SaasMigrationStatus.FINISHED);
log.info("AnnotationIds migration finished for saas migration for tenant {} dossier {} and file {}", TenantContext.getTenantId(), dossierId, fileId);
finalizeMigration(); // AutomaticAnalysisJob should be re-enabled by re-starting the persistence service pod after a rule change
}
private void deleteSectionGridAndNerEntitiesFiles(String dossierId, String fileId) {
try {
storageService.deleteObject(TenantContext.getTenantId(), StorageIdUtils.getStorageId(dossierId, fileId, FileType.SECTION_GRID));
} catch (StorageObjectDoesNotExist e) {
log.info("No section grid found for {}, {}, ignoring ...", dossierId, fileId);
}
try {
storageService.deleteObject(TenantContext.getTenantId(), StorageIdUtils.getStorageId(dossierId, fileId, FileType.NER_ENTITIES));
} catch (StorageObjectDoesNotExist e) {
log.info("No NER entities file found for {}, {}, ignoring ...", dossierId, fileId);
}
}
private void softDeleteForceRedactions(String fileId, List<String> forceRedactionIdsToDelete) {
manualRedactionService.softDeleteForceRedactions(fileId, forceRedactionIdsToDelete);
}
private int addManualRedactionEntries(List<ManualRedactionEntry> manualRedactionEntriesToAdd) {
manualRedactionEntriesToAdd.forEach(add -> {
if (add.getSection() != null && add.getSection().length() > 254) {
add.setSection(add.getSection().substring(0, 254));
}
});
return manualRedactionService.addManualRedactionEntries(manualRedactionEntriesToAdd, true);
}
public void revertMigrationForFile(String dossierId, String fileId) {
log.info("Reverting Migration for dossierId {} and fileId {}", dossierId, fileId);
MigratedIds migratedIds = getMigratedIds(dossierId, fileId);
Map<String, String> newToOldMapping = migratedIds.buildNewToOldMapping();
updateAnnotationIds(dossierId, fileId, newToOldMapping);
deleteManualRedactionEntries(migratedIds.getManualRedactionEntriesToAdd());
undeleteForceRedactions(fileId, migratedIds.getForceRedactionIdsToDelete());
saasMigrationStatusPersistenceService.createMigrationRequiredStatus(dossierId, fileId);
}
private void undeleteForceRedactions(String fileId, List<String> forceRedactionIdsToDelete) {
manualRedactionService.undeleteForceRedactions(fileId, forceRedactionIdsToDelete);
}
private void deleteManualRedactionEntries(List<ManualRedactionEntry> manualRedactionEntriesToAdd) {
manualRedactionService.deleteManualRedactionEntries(manualRedactionEntriesToAdd);
}
private void updateAnnotationIds(String dossierId, String fileId, Map<String, String> idMapping) {
try {
updateAnnotationIds(fileId, idMapping);
} catch (Exception e) {
String message = String.format("Error during annotation id migration for tenant %s dossier %s and file %s, cause %s",
TenantContext.getTenantId(),
dossierId,
fileId,
e.getMessage());
saasMigrationStatusPersistenceService.updateErrorStatus(fileId, message);
log.error(message);
throw e;
}
}
private void finalizeMigration() {
if (saasMigrationStatusPersistenceService.countByStatus(SaasMigrationStatus.FINISHED) == saasMigrationStatusPersistenceService.countAll()) {
// automaticAnalysisJob.startForTenant(TenantContext.getTenantId()); // AutomaticAnalysisJob should be re-enabled by re-starting the persistence service pod after a rule change
tenantProvider.updateDetails(TenantContext.getTenantId(), UpdateDetailsRequest.builder().key("persistence-service-ready").value(true).build());
log.info("Saas migration finished for tenantId {}", TenantContext.getTenantId());
}
}
public void updateAnnotationIds(String fileId, Map<String, String> idMapping) {
AtomicInteger numUpdates = new AtomicInteger(0);
AtomicInteger numCommentUpdates = new AtomicInteger(0);
idMapping.forEach((key, value) -> {
AnnotationEntityId oldAnnotationEntityId = buildAnnotationId(fileId, key);
AnnotationEntityId newAnnotationEntityId = buildAnnotationId(fileId, value);
numUpdates.getAndAdd(saasAnnotationIdMigrationService.updateManualAddRedaction(oldAnnotationEntityId, newAnnotationEntityId));
numUpdates.getAndAdd(saasAnnotationIdMigrationService.updateRemoveRedaction(oldAnnotationEntityId, newAnnotationEntityId));
numUpdates.getAndAdd(saasAnnotationIdMigrationService.updateForceRedaction(oldAnnotationEntityId, newAnnotationEntityId));
numUpdates.getAndAdd(saasAnnotationIdMigrationService.updateResizeRedaction(oldAnnotationEntityId, newAnnotationEntityId));
numUpdates.getAndAdd(saasAnnotationIdMigrationService.updateRecategorizationRedaction(oldAnnotationEntityId, newAnnotationEntityId));
numUpdates.getAndAdd(saasAnnotationIdMigrationService.updateLegalBasisChangeRedaction(oldAnnotationEntityId, newAnnotationEntityId));
numCommentUpdates.getAndAdd(saasAnnotationIdMigrationService.updateCommentIds(fileId, key, value));
});
log.info("Migrated {} annotationIds and {} comments for file {}", numUpdates.get(), numCommentUpdates.get(), fileId);
}
private AnnotationEntityId buildAnnotationId(String fileId, String annotationId) {
return new AnnotationEntityId(annotationId, fileId);
}
private MigratedIds getMigratedIds(String dossierId, String fileId) {
try {
return storageService.readJSONObject(TenantContext.getTenantId(), StorageIdUtils.getStorageId(dossierId, fileId, FileType.MIGRATED_IDS), MigratedIds.class);
} catch (StorageObjectDoesNotExist e) {
throw new NotFoundException(String.format("MigratedIds does not exist for Dossier ID \"%s\" and File ID \"%s\"!", dossierId, fileId));
} catch (StorageException e) {
throw new InternalServerErrorException(e.getMessage());
}
}
}

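`handleError` above requeues a failed file while its error counter is at or below the configured maximum, then parks it in ERROR status. A minimal sketch isolating that bounded-retry decision (`MAX_RETRIES` and all names are illustrative stand-ins, not the service's API):

```java
public class BoundedRetry {

    static final int MAX_RETRIES = 3; // stands in for settings.getMaxErrorRetries()

    // Returns true if the file should be requeued for another attempt,
    // false if it should be parked in ERROR status.
    static boolean shouldRetry(Integer errorCounter) {
        // A null counter is treated as "do not retry", matching the
        // null check in handleError.
        return errorCounter != null && errorCounter <= MAX_RETRIES;
    }

    public static void main(String[] args) {
        System.out.println(shouldRetry(0) + " " + shouldRetry(4) + " " + shouldRetry(null));
    }
}
```

On a retry the caller increments the counter before requeuing, so each file is attempted at most `MAX_RETRIES + 1` times overall.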
View File

@@ -0,0 +1,35 @@
package com.iqser.red.service.persistence.management.v1.processor.migration.migrations;
import org.springframework.stereotype.Service;
import com.iqser.red.service.persistence.management.v1.processor.migration.Migration;
import com.iqser.red.service.persistence.management.v1.processor.migration.RankDeDuplicationService;
import lombok.extern.slf4j.Slf4j;
@Slf4j
@Service
public class V32RankDeduplicationMigration extends Migration {
private static final String NAME = "Add rank de-duplication to the migration";
private static final long VERSION = 32;
private final RankDeDuplicationService rankDeDuplicationService;
public V32RankDeduplicationMigration(RankDeDuplicationService rankDeDuplicationService) {
super(NAME, VERSION);
this.rankDeDuplicationService = rankDeDuplicationService;
}
@Override
protected void migrate() {
log.info("Migration: Checking for duplicate ranks");
rankDeDuplicationService.deduplicate();
}
}

View File

@@ -25,8 +25,7 @@ public final class ApplicationRoles {
UPDATE_LICENSE,
GET_TENANTS,
CREATE_TENANT,
READ_USERS,
READ_ALL_USERS,
READ_USERS, READ_ALL_USERS, READ_USER_STATS,
WRITE_USERS,
READ_SMTP_CONFIGURATION,
WRITE_SMTP_CONFIGURATION,
@@ -62,8 +61,7 @@ public final class ApplicationRoles {
PROCESS_MANUAL_REDACTION_REQUEST,
READ_COLORS,
READ_DICTIONARY_TYPES,
READ_DIGITAL_SIGNATURE,
READ_DOSSIER,
READ_DIGITAL_SIGNATURE, READ_DOSSIER,
READ_DOSSIER_ATTRIBUTES,
READ_DOSSIER_ATTRIBUTES_CONFIG,
READ_DOSSIER_TEMPLATES,
@@ -118,8 +116,7 @@ public final class ApplicationRoles {
READ_DOSSIER_TEMPLATES,
READ_FILE_ATTRIBUTES_CONFIG,
READ_LEGAL_BASIS,
READ_LICENSE_REPORT,
READ_NOTIFICATIONS,
READ_LICENSE_REPORT, READ_NOTIFICATIONS, READ_USER_STATS,
READ_RULES,
READ_DATA_FORMATS,
READ_SMTP_CONFIGURATION,
@@ -153,9 +150,8 @@ public final class ApplicationRoles {
READ_APP_CONFIG,
READ_GENERAL_CONFIGURATION,
READ_GENERAL_CONFIGURATION,
GET_SIMILAR_IMAGES,
READ_NOTIFICATIONS,
READ_USERS,
GET_SIMILAR_IMAGES, READ_NOTIFICATIONS,
READ_USERS, READ_USER_STATS,
UPDATE_MY_PROFILE,
UPDATE_NOTIFICATIONS,
WRITE_USERS,

View File

@@ -1,6 +1,9 @@
package com.iqser.red.service.persistence.management.v1.processor.service;
import java.util.Set;
import org.springframework.http.HttpStatus;
import org.springframework.security.access.AccessDeniedException;
import org.springframework.security.access.prepost.PostAuthorize;
import org.springframework.security.acls.AclPermissionEvaluator;
import org.springframework.security.core.context.SecurityContextHolder;
@@ -203,4 +206,19 @@ public class AccessControlService {
}
}
public void verifyUserIsDossierOwnerOrApproverOrAssignedReviewer(String dossierId, Set<String> fileIds) {
try {
verifyUserIsDossierOwnerOrApprover(dossierId);
} catch (AccessDeniedException e1) {
try {
for (String fileId : fileIds) {
verifyUserIsReviewer(dossierId, fileId);
}
} catch (NotAllowedException e2) {
throw new NotAllowedException("User must be dossier owner, approver or assigned reviewer of the file.");
}
}
}
}

View File

@@ -24,6 +24,7 @@ import com.iqser.red.service.persistence.management.v1.processor.mapper.Componen
import com.iqser.red.service.persistence.management.v1.processor.model.ComponentMapping;
import com.iqser.red.service.persistence.management.v1.processor.model.ComponentMappingDownloadModel;
import com.iqser.red.service.persistence.management.v1.processor.service.persistence.DossierTemplatePersistenceService;
import com.iqser.red.service.persistence.management.v1.processor.utils.StringEncodingUtils;
import com.iqser.red.service.persistence.service.v1.api.shared.model.component.ComponentMappingMetadata;
import com.opencsv.CSVParserBuilder;
import com.opencsv.CSVReader;
@@ -107,7 +108,7 @@ public class ComponentMappingService {
String fileName,
char quoteChar) {
Charset charset = resolveCharset(encoding);
Charset charset = StringEncodingUtils.resolveCharset(encoding);
CsvStats stats = sortCSVFile(delimiter, mappingFile, charset, quoteChar);
@@ -126,20 +127,6 @@ public class ComponentMappingService {
}
private static Charset resolveCharset(String encoding) {
try {
return Charset.forName(encoding);
} catch (IllegalCharsetNameException e) {
throw new BadRequestException("Invalid character encoding: " + encoding);
} catch (UnsupportedCharsetException e) {
throw new BadRequestException("Unsupported character encoding: " + encoding);
} catch (IllegalArgumentException e) {
throw new BadRequestException("Encoding can't be null.");
}
}
private static CsvStats sortCSVFile(char delimiter, File mappingFile, Charset charset, char quoteChar) throws BadRequestException, IOException {
Path tempFile = Files.createTempFile("mapping", ".tmp");

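The `resolveCharset` helper removed above (now centralized in `StringEncodingUtils`) relies on catch ordering: both `IllegalCharsetNameException` and `UnsupportedCharsetException` extend `IllegalArgumentException`, so the broad catch must come last or it would swallow the specific cases. A self-contained probe of that behavior (the class and method names are illustrative):

```java
import java.nio.charset.Charset;
import java.nio.charset.IllegalCharsetNameException;
import java.nio.charset.UnsupportedCharsetException;

public class CharsetProbe {

    // Classify an encoding string the same way resolveCharset distinguishes
    // its error cases. The catch order matters: the two specific exceptions
    // are subclasses of IllegalArgumentException, which Charset.forName
    // throws for a null name.
    static String classify(String encoding) {
        try {
            return Charset.forName(encoding).name();
        } catch (IllegalCharsetNameException e) {
            return "invalid";      // name contains illegal characters
        } catch (UnsupportedCharsetException e) {
            return "unsupported";  // legal name, but no such charset on this JVM
        } catch (IllegalArgumentException e) {
            return "null";         // Charset.forName(null) ends up here
        }
    }

    public static void main(String[] args) {
        System.out.println(classify("UTF-8") + " " + classify("not a charset!") + " " + classify(null));
    }
}
```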
View File

@@ -494,9 +494,9 @@ public class DictionaryService {
}
@PreAuthorize("hasAuthority('" + ADD_UPDATE_DICTIONARY_TYPE + "')")
public void changeAddToDictionary(String type, String dossierTemplateId, String dossierId, boolean addToDictionary) {
public void changeAddToDictionary(String type, String dossierTemplateId, String dossierId, boolean addToDictionaryAction) {
dictionaryPersistenceService.updateAddToDictionary(toTypeId(type, dossierTemplateId, dossierId), addToDictionary);
dictionaryPersistenceService.updateAddToDictionary(toTypeId(type, dossierTemplateId, dossierId), addToDictionaryAction);
}

View File

@@ -156,7 +156,7 @@ public class DossierTemplateCloneService {
private void cloneComponents(String dossierTemplate, String clonedDossierTemplateId) {
List<ComponentDefinitionEntity> componentDefinitionEntities = componentDefinitionPersistenceService.findComponentsByDossierTemplateId(dossierTemplate);
List<ComponentDefinitionEntity> componentDefinitionEntities = componentDefinitionPersistenceService.findByDossierTemplateIdAndNotSoftDeleted(dossierTemplate);
for (ComponentDefinitionEntity componentDefinitionEntity : componentDefinitionEntities) {
ComponentDefinitionAddRequest componentDefinitionAddRequest = ComponentDefinitionAddRequest.builder()

View File

@@ -1,5 +1,7 @@
package com.iqser.red.service.persistence.management.v1.processor.service;
import static com.iqser.red.service.persistence.service.v1.api.external.resource.FileAttributesResource.ASCII_ENCODING;
import static com.iqser.red.service.persistence.service.v1.api.external.resource.FileAttributesResource.ISO_ENCODING;
import static com.iqser.red.service.persistence.service.v1.api.shared.model.dossiertemplate.dossier.file.FileAttributeTypeFormats.FILE_ATTRIBUTE_TYPE_DATE_FORMAT;
import static com.iqser.red.service.persistence.service.v1.api.shared.model.dossiertemplate.dossier.file.FileAttributeTypeFormats.FILE_ATTRIBUTE_TYPE_NUMBER_REGEX;
@@ -32,6 +34,7 @@ import com.iqser.red.service.persistence.management.v1.processor.exception.Confl
import com.iqser.red.service.persistence.management.v1.processor.service.persistence.DossierPersistenceService;
import com.iqser.red.service.persistence.management.v1.processor.service.persistence.FileAttributeConfigPersistenceService;
import com.iqser.red.service.persistence.management.v1.processor.service.persistence.FileStatusPersistenceService;
import com.iqser.red.service.persistence.management.v1.processor.utils.StringEncodingUtils;
import com.iqser.red.service.persistence.service.v1.api.shared.model.dossiertemplate.ImportCsvRequest;
import com.iqser.red.service.persistence.service.v1.api.shared.model.dossiertemplate.ImportCsvResponse;
import com.iqser.red.service.persistence.service.v1.api.shared.model.dossiertemplate.dossier.file.FileAttributeType;
@@ -59,10 +62,6 @@ public class FileAttributesManagementService {
private final DossierPersistenceService dossierPersistenceService;
private final IndexingService indexingService;
public static String UTF_ENCODING = "UTF-8";
public static String ASCII_ENCODING = "ASCII";
public static String ISO_ENCODING = "ISO";
@Transactional
public ImportCsvResponse importCsv(String dossierId, ImportCsvRequest importCsvRequest) {
@@ -144,7 +143,7 @@ public class FileAttributesManagementService {
throw new IllegalArgumentException("Delimiter must be a single character.");
}
char delimiterChar = delimiter.charAt(0);
Charset charset = Charset.forName(encoding);
Charset charset = StringEncodingUtils.resolveCharset(encoding);
try (CSVReader csvReader = new CSVReaderBuilder(new InputStreamReader(new ByteArrayInputStream(csvFileBytes), charset)).withCSVParser(new CSVParserBuilder().withSeparator(
delimiterChar).build()).build()) {
@@ -214,7 +213,7 @@ public class FileAttributesManagementService {
if (ASCII_ENCODING.equalsIgnoreCase(encoding) || StandardCharsets.US_ASCII.name().equalsIgnoreCase(encoding)) {
return StandardCharsets.US_ASCII;
}
// accept both "ISO" (non-unique name) and the actual name "US-ASCII" of the charset
// accept both the "ISO" shorthand (non-unique name) and the charset's actual name "ISO-8859-1"
if (ISO_ENCODING.equalsIgnoreCase(encoding) || StandardCharsets.ISO_8859_1.name().equalsIgnoreCase(encoding)) {
return StandardCharsets.ISO_8859_1;
}

View File

@@ -1,6 +1,5 @@
package com.iqser.red.service.persistence.management.v1.processor.service;
import static com.knecon.fforesight.service.layoutparser.internal.api.data.redaction.DocumentStructureProto.DocumentStructure;
import java.io.BufferedInputStream;
import java.io.File;
@@ -18,15 +17,15 @@ import com.iqser.red.service.persistence.management.v1.processor.utils.StorageId
import com.iqser.red.service.persistence.service.v1.api.shared.model.analysislog.componentlog.ComponentLog;
import com.iqser.red.service.persistence.service.v1.api.shared.model.analysislog.entitylog.EntityLog;
import com.iqser.red.service.persistence.service.v1.api.shared.model.analysislog.entitylog.imported.ImportedLegalBases;
import com.iqser.red.service.persistence.service.v1.api.shared.model.analysislog.entitylog.imported.ImportedRedactions;
import com.iqser.red.service.persistence.service.v1.api.shared.model.analysislog.entitylog.imported.ImportedRedactionsPerPage;
import com.iqser.red.service.persistence.service.v1.api.shared.model.dossiertemplate.dossier.file.FileType;
import com.iqser.red.service.persistence.service.v1.api.shared.model.redactionlog.section.SectionGrid;
import com.iqser.red.service.persistence.service.v1.api.shared.mongo.service.ComponentLogMongoService;
import com.iqser.red.service.persistence.service.v1.api.shared.mongo.service.EntityLogMongoService;
import com.iqser.red.service.redaction.v1.server.data.DocumentStructureProto;
import com.iqser.red.service.redaction.v1.server.data.DocumentStructureWrapper;
import com.iqser.red.storage.commons.exception.StorageObjectDoesNotExist;
import com.iqser.red.storage.commons.service.StorageService;
import com.knecon.fforesight.service.layoutparser.internal.api.data.redaction.DocumentStructureWrapper;
import com.knecon.fforesight.tenantcommons.TenantContext;
import lombok.RequiredArgsConstructor;
@ -321,7 +320,7 @@ public class FileManagementStorageService {
return new DocumentStructureWrapper(storageService.readProtoObject(TenantContext.getTenantId(),
StorageIdUtils.getStorageId(dossierId, fileId, FileType.DOCUMENT_STRUCTURE),
DocumentStructure.parser()));
DocumentStructureProto.DocumentStructure.parser()));
}
}

View File

@ -125,7 +125,7 @@ public class FileStatusManagementService {
if (userId != null) {
assignee = userId;
}
fileStatusService.setStatusSuccessful(fileId, assignee != null ? WorkflowStatus.UNDER_REVIEW : WorkflowStatus.NEW);
fileStatusService.setStatusSuccessful(fileId, assignee != null ? WorkflowStatus.UNDER_REVIEW : WorkflowStatus.NEW, fileStatus.getWorkflowStatus());
fileStatusService.setAssignee(fileId, assignee);
indexingService.addToIndexingQueue(IndexMessageType.UPDATE, null, dossierId, fileId, 2);
}
@ -139,7 +139,7 @@ public class FileStatusManagementService {
assignee = approverId;
}
fileStatusService.setStatusSuccessful(fileId, assignee != null ? WorkflowStatus.UNDER_APPROVAL : WorkflowStatus.NEW);
fileStatusService.setStatusSuccessful(fileId, assignee != null ? WorkflowStatus.UNDER_APPROVAL : WorkflowStatus.NEW, fileStatus.getWorkflowStatus());
fileStatusService.setAssignee(fileId, approverId);
indexingService.addToIndexingQueue(IndexMessageType.UPDATE, null, dossierId, fileId, 2);
}
@ -169,7 +169,7 @@ public class FileStatusManagementService {
throw new BadRequestException("Allowed transition not possible from: " + fileStatus.getWorkflowStatus() + " to status NEW");
}
fileStatusService.setAssignee(fileId, null);
fileStatusService.setStatusSuccessful(fileId, WorkflowStatus.NEW);
fileStatusService.setStatusSuccessful(fileId, WorkflowStatus.NEW, fileStatus.getWorkflowStatus());
}

View File

@ -50,6 +50,7 @@ public class FileStatusMapper {
.legalBasisVersion(status.getLegalBasisVersion())
.lastProcessed(status.getLastProcessed())
.lastLayoutProcessed(status.getLastLayoutProcessed())
.layoutParserVersion(status.getLayoutParserVersion())
.approvalDate(status.getApprovalDate())
.lastUploaded(status.getLastUploaded())
.analysisDuration(status.getAnalysisDuration())
@ -69,6 +70,7 @@ public class FileStatusMapper {
.fileSize(status.getFileSize())
.fileErrorInfo(status.getFileErrorInfo())
.componentMappingVersions(status.getComponentMappingVersions())
.lastDownload(status.getLastDownloadDate())
.build();
}

View File

@ -122,7 +122,7 @@ public class FileStatusProcessingUpdateService {
} else {
fileStatusService.setStatusOcrProcessing(fileId,
fileEntity.getProcessingStatus().equals(ProcessingStatus.OCR_PROCESSING) ? fileEntity.getProcessingErrorCounter() + 1 : 0);
fileStatusService.addToOcrQueue(dossierId, fileId, 2);
fileStatusService.addToOcrQueue(dossierId, fileId, 2, false);
}
}

View File

@ -316,7 +316,7 @@ public class FileStatusService {
}
log.info("Add file: {} from dossier {} to OCR queue", fileId, dossierId);
setStatusOcrQueued(dossierId, fileId);
setStatusOcrQueued(dossierId, fileId, false);
sendReadOnlyAnalysisEvent(dossierId, fileId, fileEntity);
return;
}
@ -352,7 +352,6 @@ public class FileStatusService {
return;
}
boolean forceAnalysis = false;
if (settings.isLlmNerServiceEnabled()) {
boolean objectExists = fileManagementStorageService.objectExists(dossierId, fileId, FileType.LLM_NER_ENTITIES);
@ -386,7 +385,7 @@ public class FileStatusService {
boolean reanalyse = fileModel.isReanalysisRequired() || analysisType.equals(AnalysisType.MANUAL_REDACTION_REANALYZE);
MessageType messageType = calculateMessageType(reanalyse, fileModel.getProcessingStatus(), fileModel);
if(analysisType == AnalysisType.FORCE_ANALYSE || forceAnalysis) {
if (analysisType == AnalysisType.FORCE_ANALYSE || forceAnalysis) {
messageType = MessageType.ANALYSE;
}
@ -567,7 +566,7 @@ public class FileStatusService {
}
public void setStatusOcrQueued(String dossierId, String fileId) {
public void setStatusOcrQueued(String dossierId, String fileId, boolean allPages) {
FileEntity fileStatus = fileStatusPersistenceService.getStatus(fileId);
@ -579,7 +578,7 @@ public class FileStatusService {
updateOCRStartTime(fileId);
fileStatusPersistenceService.updateProcessingStatus(fileId, ProcessingStatus.OCR_PROCESSING_QUEUED);
websocketService.sendAnalysisEvent(dossierId, fileId, AnalyseStatus.OCR_PROCESSING, fileStatus.getNumberOfAnalyses() + 1);
addToOcrQueue(dossierId, fileId, 2);
addToOcrQueue(dossierId, fileId, 2, allPages);
}
@ -760,13 +759,16 @@ public class FileStatusService {
}
public void addToOcrQueue(String dossierId, String fileId, int priority) {
public void addToOcrQueue(String dossierId, String fileId, int priority, boolean allPages) {
var removeWatermark = dossierTemplatePersistenceService.getDossierTemplate(dossierPersistenceService.getDossierTemplateId(dossierId)).isRemoveWatermark();
Set<AzureOcrFeature> features = new HashSet<>();
if (removeWatermark) {
features.add(AzureOcrFeature.REMOVE_WATERMARKS);
}
if (allPages) {
features.add(AzureOcrFeature.ALL_PAGES);
}
if (currentApplicationTypeProvider.isDocuMine()) {
features.add(AzureOcrFeature.ROTATION_CORRECTION);
features.add(AzureOcrFeature.FONT_STYLE_DETECTION);
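The `allPages` flag threaded through `addToOcrQueue` ultimately toggles one more entry in the OCR feature set. A self-contained sketch of the assembly logic shown in this hunk (the enum constants mirror the `AzureOcrFeature` values used in the diff; this is an illustration, not the service code itself):

```java
import java.util.EnumSet;
import java.util.Set;

// Sketch of how the OCR request's feature set is built after this change:
// the new allPages flag maps onto ALL_PAGES, alongside the existing
// watermark-removal and DocuMine-specific features.
public class OcrFeatureSketch {

    enum AzureOcrFeature { REMOVE_WATERMARKS, ALL_PAGES, ROTATION_CORRECTION, FONT_STYLE_DETECTION }

    static Set<AzureOcrFeature> buildFeatures(boolean removeWatermark, boolean allPages, boolean isDocuMine) {
        Set<AzureOcrFeature> features = EnumSet.noneOf(AzureOcrFeature.class);
        if (removeWatermark) {
            features.add(AzureOcrFeature.REMOVE_WATERMARKS);
        }
        // new in this merge request: full OCR over every page of the file
        if (allPages) {
            features.add(AzureOcrFeature.ALL_PAGES);
        }
        if (isDocuMine) {
            features.add(AzureOcrFeature.ROTATION_CORRECTION);
            features.add(AzureOcrFeature.FONT_STYLE_DETECTION);
        }
        return features;
    }
}
```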
@ -816,9 +818,13 @@ public class FileStatusService {
}
public void setStatusSuccessful(String fileId, WorkflowStatus workflowStatus) {
public void setStatusSuccessful(String fileId, WorkflowStatus newWorkflowStatus, WorkflowStatus oldWorkflowStatus) {
fileStatusPersistenceService.updateWorkflowStatus(fileId, workflowStatus, false);
fileStatusPersistenceService.updateWorkflowStatus(fileId, newWorkflowStatus, false);
if (oldWorkflowStatus == WorkflowStatus.APPROVED && newWorkflowStatus != WorkflowStatus.APPROVED) {
fileStatusPersistenceService.clearLastDownload(fileId);
}
}
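The extra `oldWorkflowStatus` parameter exists so `setStatusSuccessful` can detect when a file leaves the APPROVED state and clear its last-download marker. The rule in isolation, as a small sketch (enum values taken from the statuses appearing in this diff):

```java
// Sketch of the new rule: the last-download date survives only while the file
// stays APPROVED; any transition out of APPROVED clears it.
public class LastDownloadRuleSketch {

    enum WorkflowStatus { NEW, UNDER_REVIEW, UNDER_APPROVAL, APPROVED }

    static boolean shouldClearLastDownload(WorkflowStatus oldStatus, WorkflowStatus newStatus) {
        return oldStatus == WorkflowStatus.APPROVED && newStatus != WorkflowStatus.APPROVED;
    }
}
```

This matches the call sites above, which now pass `fileStatus.getWorkflowStatus()` (the status before the transition) as the second argument.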
@ -973,7 +979,7 @@ public class FileStatusService {
if (runOcr) {
fileStatusPersistenceService.resetOcrStartAndEndDate(fileId);
setStatusOcrQueued(dossierId, fileId);
setStatusOcrQueued(dossierId, fileId, false);
return;
}
@ -1060,6 +1066,7 @@ public class FileStatusService {
addToAnalysisQueue(dossierId, fileId, priority, Sets.newHashSet(), AnalysisType.DEFAULT);
}
@Transactional
public void setStatusForceAnalyse(String dossierId, String fileId, boolean priority) {
@ -1075,9 +1082,9 @@ public class FileStatusService {
}
public void updateLayoutProcessedTime(String fileId) {
public void updateLayoutParserVersionAndProcessedTime(String fileId, String version) {
fileStatusPersistenceService.updateLayoutProcessedTime(fileId);
fileStatusPersistenceService.updateLayoutParserVersionAndProcessedTime(fileId, version);
}

View File

@ -12,8 +12,8 @@ import com.fasterxml.jackson.databind.ObjectMapper;
import com.iqser.red.service.persistence.service.v1.api.shared.model.utils.Scope;
import com.iqser.red.service.persistence.service.v1.api.shared.mongo.document.ImageDocument;
import com.iqser.red.service.persistence.service.v1.api.shared.mongo.service.ImageMongoService;
import com.knecon.fforesight.service.layoutparser.internal.api.data.redaction.DocumentStructureWrapper;
import com.knecon.fforesight.service.layoutparser.internal.api.data.redaction.NodeTypeProto.NodeType;
import com.iqser.red.service.redaction.v1.server.data.DocumentStructureWrapper;
import com.iqser.red.service.redaction.v1.server.data.NodeTypeProto;
import lombok.Getter;
import lombok.extern.slf4j.Slf4j;
@ -35,7 +35,7 @@ public class ImageSimilarityService {
DocumentStructureWrapper documentStructure = fileManagementStorageService.getDocumentStructure(dossierId, fileId);
List<ImageDocument> imageDocuments = new ArrayList<>();
documentStructure.streamAllEntries()
.filter(entry -> entry.getType().equals(NodeType.IMAGE))
.filter(entry -> entry.getType().equals(NodeTypeProto.NodeType.IMAGE))
.forEach(i -> {
Map<String, String> properties = i.getPropertiesMap();
ImageDocument imageDocument = new ImageDocument();

View File

@ -25,6 +25,7 @@ import com.iqser.red.service.persistence.management.v1.processor.service.persist
import com.iqser.red.service.persistence.management.v1.processor.service.persistence.DossierPersistenceService;
import com.iqser.red.service.persistence.management.v1.processor.service.persistence.LegalBasisMappingPersistenceService;
import com.iqser.red.service.persistence.management.v1.processor.service.persistence.RulesPersistenceService;
import com.iqser.red.service.persistence.management.v1.processor.settings.FileManagementServiceSettings;
import com.iqser.red.service.persistence.service.v1.api.shared.model.RuleFileType;
import com.iqser.red.service.persistence.service.v1.api.shared.model.component.ComponentMappingMetadata;
import com.iqser.red.service.persistence.service.v1.api.shared.model.dossiertemplate.dossier.file.ErrorCode;
@ -50,6 +51,7 @@ public class ReanalysisRequiredStatusService {
DossierPersistenceService dossierPersistenceService;
LegalBasisMappingPersistenceService legalBasisMappingPersistenceService;
ComponentMappingService componentMappingService;
FileManagementServiceSettings settings;
public FileModel enhanceFileStatusWithAnalysisRequirements(FileModel fileModel) {
@ -65,7 +67,12 @@ public class ReanalysisRequiredStatusService {
Map<String, DossierEntity> dossierMap = new HashMap<>();
Map<String, Map<String, Integer>> componentMappingVersionMap = new HashMap<>();
fileModels.forEach(entry -> {
var analysisRequiredResult = computeAnalysisRequired(entry, ignoreProcessingStates, dossierTemplateVersionMap, dossierVersionMap, dossierMap, componentMappingVersionMap);
var analysisRequiredResult = computeAnalysisRequired(entry,
ignoreProcessingStates,
dossierTemplateVersionMap,
dossierVersionMap,
dossierMap,
componentMappingVersionMap);
entry.setReanalysisRequired(analysisRequiredResult.isReanalysisRequired());
entry.setFullAnalysisRequired(analysisRequiredResult.isFullAnalysisRequired());
entry.setComponentReanalysisRequired(analysisRequiredResult.isComponentReanalysisRequired());
@ -123,7 +130,7 @@ public class ReanalysisRequiredStatusService {
if (fileStatus.getLastFileAttributeChange() != null && fileStatus.getLastProcessed().isBefore(fileStatus.getLastFileAttributeChange())) {
return new AnalysisRequiredResult(true, false);
} else {
return requiresReanalysisBasedOnVersionDifference(fileStatus, dossier, dossierTemplateVersionMap, dossierVersionMap,componentMappingVersionMap);
return requiresReanalysisBasedOnVersionDifference(fileStatus, dossier, dossierTemplateVersionMap, dossierVersionMap, componentMappingVersionMap);
}
default:
return new AnalysisRequiredResult(false, false);
@ -134,14 +141,18 @@ public class ReanalysisRequiredStatusService {
}
private AnalysisRequiredResult computeAnalysisRequiredForErrorState(DossierEntity dossier,FileModel fileStatus, Map<String, Map<VersionType, Long>> dossierTemplateVersionMap){
private AnalysisRequiredResult computeAnalysisRequiredForErrorState(DossierEntity dossier,
FileModel fileStatus,
Map<String, Map<VersionType, Long>> dossierTemplateVersionMap) {
Map<VersionType, Long> dossierTemplateVersions = dossierTemplateVersionMap.computeIfAbsent(dossier.getDossierTemplateId(),
k -> buildVersionData(dossier.getDossierTemplateId()));
var rulesVersionMatches = fileStatus.getRulesVersion() == dossierTemplateVersions.getOrDefault(RULES, -1L);
var componentRulesVersionMatches = fileStatus.getComponentRulesVersion() == dossierTemplateVersions.getOrDefault(COMPONENT_RULES, -1L);
var isRuleExecutionTimeout = fileStatus.getFileErrorInfo() != null && fileStatus.getFileErrorInfo().getErrorCode() != null && fileStatus.getFileErrorInfo().getErrorCode().equals(
ErrorCode.RULES_EXECUTION_TIMEOUT);
var isRuleExecutionTimeout = fileStatus.getFileErrorInfo() != null && fileStatus.getFileErrorInfo().getErrorCode() != null && fileStatus.getFileErrorInfo()
.getErrorCode()
.equals(ErrorCode.RULES_EXECUTION_TIMEOUT);
var fullAnalysisRequired = (!rulesVersionMatches || !componentRulesVersionMatches) && !isRuleExecutionTimeout;
@ -153,7 +164,7 @@ public class ReanalysisRequiredStatusService {
DossierEntity dossier,
Map<String, Map<VersionType, Long>> dossierTemplateVersionMap,
Map<String, Long> dossierVersionMap,
Map<String, Map<String,Integer>> componentMappingVersionMap) {
Map<String, Map<String, Integer>> componentMappingVersionMap) {
// get relevant versions
Map<VersionType, Long> dossierTemplateVersions = dossierTemplateVersionMap.computeIfAbsent(dossier.getDossierTemplateId(),
@ -162,7 +173,8 @@ public class ReanalysisRequiredStatusService {
Map<String, Integer> componentMappingVersions = componentMappingVersionMap.computeIfAbsent(dossier.getDossierTemplateId(),
k -> componentMappingService.getMetaDataByDossierTemplateId(dossier.getDossierTemplateId())
.stream()
.collect(Collectors.toMap(ComponentMappingMetadata::getName, ComponentMappingMetadata::getVersion)));
.collect(Collectors.toMap(ComponentMappingMetadata::getName,
ComponentMappingMetadata::getVersion)));
Long dossierDictionaryVersion = dossierVersionMap.computeIfAbsent(fileStatus.getDossierId(), k -> getDossierVersionData(fileStatus.getDossierId()));
@ -177,22 +189,23 @@ public class ReanalysisRequiredStatusService {
var dossierDictionaryVersionMatches = Math.max(fileStatus.getDossierDictionaryVersion(), 0) == dossierDictionaryVersion;
var reanalysisRequired = !dictionaryVersionMatches || !dossierDictionaryVersionMatches || !mappingVersionAllMatch;
var fullAnalysisRequired = !rulesVersionMatches || !componentRulesVersionMatches || !legalBasisVersionMatches || !aiVersionMatches;
var fullAnalysisRequired = !rulesVersionMatches || !componentRulesVersionMatches || !legalBasisVersionMatches || (settings.isLlmNerServiceEnabled() && !aiVersionMatches);
var componentReanalysisRequired = !dateFormatsVersionMatches;
if (reanalysisRequired || fullAnalysisRequired || componentReanalysisRequired) {
log.debug(buildMessage(fileStatus,
rulesVersionMatches,
dossierTemplateVersions,
componentRulesVersionMatches,
dateFormatsVersionMatches,
dictionaryVersionMatches,
legalBasisVersionMatches,
dossierDictionaryVersionMatches,
dossierDictionaryVersion,
mappingVersionAllMatch,
componentMappingVersions));
rulesVersionMatches,
dossierTemplateVersions,
componentRulesVersionMatches,
dateFormatsVersionMatches,
dictionaryVersionMatches,
legalBasisVersionMatches,
dossierDictionaryVersionMatches,
dossierDictionaryVersion,
mappingVersionAllMatch,
componentMappingVersions,
aiVersionMatches));
}
return new AnalysisRequiredResult(reanalysisRequired, fullAnalysisRequired, componentReanalysisRequired);
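The change to `fullAnalysisRequired` gates the AI-version check behind the LLM NER settings flag, so a stale `aiVersion` no longer forces a full re-analysis on installations where the service is disabled. The decision in isolation, as a pure-function sketch:

```java
// Sketch of the adjusted full-analysis decision from this hunk: the
// AI-version mismatch only counts when the LLM NER service is enabled
// (the FileManagementServiceSettings flag added in this merge request).
public class FullAnalysisDecisionSketch {

    static boolean fullAnalysisRequired(boolean rulesVersionMatches,
                                        boolean componentRulesVersionMatches,
                                        boolean legalBasisVersionMatches,
                                        boolean aiVersionMatches,
                                        boolean llmNerServiceEnabled) {
        return !rulesVersionMatches || !componentRulesVersionMatches || !legalBasisVersionMatches
                || (llmNerServiceEnabled && !aiVersionMatches);
    }
}
```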
@ -202,14 +215,15 @@ public class ReanalysisRequiredStatusService {
private String buildMessage(FileModel fileStatus,
boolean rulesVersionMatches,
Map<VersionType, Long> dossierTemplateVersions,
boolean versionMatches,
boolean componentRulesVersionMatches,
boolean dateFormatsVersionMatches,
boolean dictionaryVersionMatches,
boolean legalBasisVersionMatches,
boolean dossierDictionaryVersionMatches,
Long dossierDictionaryVersion,
boolean mappingVersionAllMatch,
Map<String, Integer> componentMappingVersions) {
Map<String, Integer> componentMappingVersions,
boolean aiVersionMatches) {
StringBuilder messageBuilder = new StringBuilder();
messageBuilder.append("For file: ").append(fileStatus.getId()).append("-").append(fileStatus.getFilename()).append(" analysis is required because -> ");
@ -264,6 +278,21 @@ public class ReanalysisRequiredStatusService {
needComma = true;
}
if (!dateFormatsVersionMatches) {
if (needComma) {
messageBuilder.append(", ");
}
messageBuilder.append("dateFormatsVersions: ").append(fileStatus.getDateFormatsVersion()).append("/").append(dossierTemplateVersions.getOrDefault(DATE_FORMATS, -1L));
needComma = true;
}
if (settings.isLlmNerServiceEnabled() && !aiVersionMatches) {
if (needComma) {
messageBuilder.append(", ");
}
messageBuilder.append("aiVersions: ").append(fileStatus.getAiCreationVersion()).append("/").append(dossierTemplateVersions.getOrDefault(AI_CREATION, 0L));
}
return messageBuilder.toString();
}

View File

@ -10,7 +10,6 @@ import com.iqser.red.service.persistence.management.v1.processor.exception.Confl
import com.iqser.red.service.persistence.management.v1.processor.exception.NotFoundException;
import com.iqser.red.service.persistence.management.v1.processor.service.persistence.DossierPersistenceService;
import com.iqser.red.service.persistence.service.v1.api.shared.model.ReanalysisSettings;
import com.iqser.red.service.persistence.service.v1.api.shared.model.analysislog.entitylog.imported.ImportedRedactions;
import com.iqser.red.service.persistence.service.v1.api.shared.model.annotations.DeleteImportedRedactionsRequest;
import com.iqser.red.service.persistence.service.v1.api.shared.model.dossiertemplate.dossier.file.FileModel;
import com.iqser.red.service.persistence.service.v1.api.shared.model.dossiertemplate.dossier.file.FileType;
@ -178,11 +177,11 @@ public class ReanalysisService {
relevantFiles.stream()
.filter(fileStatus -> fileStatus.getOcrStartTime() == null)
.filter(fileStatus -> fileStatus.getProcessingStatus().equals(ProcessingStatus.PROCESSED))
.forEach(fileStatus -> fileStatusService.setStatusOcrQueued(dossierId, fileStatus.getId()));
.forEach(fileStatus -> fileStatusService.setStatusOcrQueued(dossierId, fileStatus.getId(), false));
}
public void ocrFile(String dossierId, String fileId, boolean force) {
public void ocrFile(String dossierId, String fileId, boolean force, boolean allPages) {
dossierPersistenceService.getAndValidateDossier(dossierId);
FileModel dossierFile = fileStatusService.getStatus(fileId);
@ -202,18 +201,18 @@ public class ReanalysisService {
}
if (force) {
fileStatusService.setStatusOcrQueued(dossierId, fileId);
fileStatusService.setStatusOcrQueued(dossierId, fileId, allPages);
} else {
if (dossierFile.getOcrStartTime() != null) {
throw new ConflictException("File has already been OCR processed");
}
ocrFiles(dossierId, Sets.newHashSet(fileId));
ocrFiles(dossierId, Sets.newHashSet(fileId), allPages);
}
}
public void ocrFiles(String dossierId, Set<String> fileIds) {
public void ocrFiles(String dossierId, Set<String> fileIds, boolean allPages) {
var relevantFiles = getRelevantFiles(dossierId, fileIds);
@ -225,7 +224,7 @@ public class ReanalysisService {
relevantFiles.stream()
.filter(fileStatus -> fileStatus.getOcrStartTime() == null)
.forEach(fileStatus -> fileStatusService.setStatusOcrQueued(dossierId, fileStatus.getId()));
.forEach(fileStatus -> fileStatusService.setStatusOcrQueued(dossierId, fileStatus.getId(), allPages));
}

View File

@ -14,7 +14,6 @@ import org.springframework.stereotype.Service;
import com.iqser.red.service.persistence.management.v1.processor.configuration.MessagingConfiguration;
import com.iqser.red.service.persistence.management.v1.processor.service.FileStatusService;
import com.iqser.red.service.persistence.management.v1.processor.service.persistence.SaasMigrationStatusPersistenceService;
import com.iqser.red.service.persistence.management.v1.processor.settings.FileManagementServiceSettings;
import com.iqser.red.service.persistence.management.v1.processor.utils.TenantUtils;
import com.iqser.red.service.persistence.service.v1.api.shared.model.dossiertemplate.dossier.file.FileModel;
@ -38,7 +37,6 @@ public class AutomaticAnalysisJob implements Job {
private final FileStatusService fileStatusService;
private final TenantProvider tenantProvider;
private final ObservationRegistry observationRegistry;
private final SaasMigrationStatusPersistenceService saasMigrationStatusPersistenceService;
@Setter
private boolean schedulingStopped;
@ -69,11 +67,6 @@ public class AutomaticAnalysisJob implements Job {
TenantContext.setTenantId(tenant.getTenantId());
if (!saasMigrationStatusPersistenceService.migrationFinishedForTenant()) {
log.info("[Tenant:{}] Skipping scheduling as there are files that require migration.", tenant.getTenantId());
return;
}
String queueName = MessagingConfiguration.REDACTION_REQUEST_QUEUE_PREFIX + "_" + tenant.getTenantId();
var redactionQueueInfo = amqpAdmin.getQueueInfo(queueName);
if (redactionQueueInfo != null) {

View File

@ -21,24 +21,23 @@ import com.iqser.red.service.persistence.management.v1.processor.service.persist
import com.iqser.red.service.persistence.management.v1.processor.utils.StorageIdUtils;
import com.iqser.red.service.persistence.management.v1.processor.utils.TenantUtils;
import com.iqser.red.service.persistence.service.v1.api.shared.model.dossiertemplate.dossier.file.FileType;
import com.iqser.red.service.redaction.v1.server.data.DocumentPageProto;
import com.iqser.red.service.redaction.v1.server.data.DocumentPositionDataProto;
import com.iqser.red.service.redaction.v1.server.data.DocumentStructureProto;
import com.iqser.red.service.redaction.v1.server.data.DocumentTextDataProto;
import com.iqser.red.service.redaction.v1.server.data.EntryDataProto;
import com.iqser.red.service.redaction.v1.server.data.LayoutEngineProto;
import com.iqser.red.service.redaction.v1.server.data.NodeTypeProto;
import com.iqser.red.service.redaction.v1.server.data.old.DocumentPage;
import com.iqser.red.service.redaction.v1.server.data.old.DocumentPositionData;
import com.iqser.red.service.redaction.v1.server.data.old.DocumentStructure;
import com.iqser.red.service.redaction.v1.server.data.old.DocumentTextData;
import com.iqser.red.service.redaction.v1.server.data.old.LayoutEngine;
import com.iqser.red.storage.commons.exception.StorageObjectDoesNotExist;
import com.iqser.red.storage.commons.service.StorageService;
import com.knecon.fforesight.service.layoutparser.internal.api.data.redaction.DocumentPage;
import com.knecon.fforesight.service.layoutparser.internal.api.data.redaction.DocumentPageProto;
import com.knecon.fforesight.service.layoutparser.internal.api.data.redaction.DocumentPositionData;
import com.knecon.fforesight.service.layoutparser.internal.api.data.redaction.DocumentPositionDataProto;
import com.knecon.fforesight.service.layoutparser.internal.api.data.redaction.DocumentStructure;
import com.knecon.fforesight.service.layoutparser.internal.api.data.redaction.DocumentStructureProto;
import com.knecon.fforesight.service.layoutparser.internal.api.data.redaction.DocumentTextData;
import com.knecon.fforesight.service.layoutparser.internal.api.data.redaction.DocumentTextDataProto;
import com.knecon.fforesight.service.layoutparser.internal.api.data.redaction.EntryDataProto;
import com.knecon.fforesight.service.layoutparser.internal.api.data.redaction.LayoutEngine;
import com.knecon.fforesight.service.layoutparser.internal.api.data.redaction.LayoutEngineProto;
import com.knecon.fforesight.service.layoutparser.internal.api.data.redaction.NodeTypeProto;
import com.knecon.fforesight.tenantcommons.TenantContext;
import com.knecon.fforesight.tenantcommons.TenantProvider;
import jakarta.transaction.Transactional;
import lombok.RequiredArgsConstructor;
import lombok.SneakyThrows;
import lombok.extern.slf4j.Slf4j;

View File

@ -0,0 +1,57 @@
package com.iqser.red.service.persistence.management.v1.processor.service.persistence;
import java.time.OffsetDateTime;
import java.util.Optional;
import org.springframework.stereotype.Service;
import org.springframework.validation.annotation.Validated;
import com.iqser.red.service.persistence.management.v1.processor.entity.configuration.AppVersionEntity;
import com.iqser.red.service.persistence.management.v1.processor.entity.configuration.AppVersionEntityKey;
import com.iqser.red.service.persistence.management.v1.processor.service.persistence.repository.AppVersionRepository;
import jakarta.transaction.Transactional;
import jakarta.validation.ConstraintViolationException;
import jakarta.validation.constraints.NotBlank;
import lombok.RequiredArgsConstructor;
@Service
@Validated
@RequiredArgsConstructor
public class AppVersionPersistenceService {
private final AppVersionRepository appVersionRepository;
/**
* Inserts a new AppVersionEntity if it does not already exist in the repository.
*
* @param appVersion the version of the application, must not be blank
* @param layoutParserVersion the version of the layout parser, must not be blank
* @return {@code true} if the entity was inserted; {@code false} if it already exists
* @throws ConstraintViolationException if the input parameters fail validation
*/
@Transactional
public boolean insertIfNotExists(@NotBlank String appVersion, @NotBlank String layoutParserVersion) throws ConstraintViolationException {
AppVersionEntityKey key = new AppVersionEntityKey(appVersion, layoutParserVersion);
if (!appVersionRepository.existsById(key)) {
AppVersionEntity newAppVersion = new AppVersionEntity(appVersion, layoutParserVersion, OffsetDateTime.now());
appVersionRepository.save(newAppVersion);
return true;
}
return false;
}
/**
* Retrieves the latest AppVersionEntity based on the date field.
*
* @return an Optional containing the latest AppVersionEntity if present, otherwise empty
*/
public Optional<AppVersionEntity> getLatestAppVersion() {
return appVersionRepository.findTopByOrderByDateDesc();
}
}

View File

@ -12,6 +12,7 @@ import org.springframework.beans.BeanUtils;
import org.springframework.stereotype.Service;
import com.iqser.red.service.persistence.management.v1.processor.entity.configuration.TypeEntity;
import com.iqser.red.service.persistence.management.v1.processor.exception.BadRequestException;
import com.iqser.red.service.persistence.management.v1.processor.exception.ConflictException;
import com.iqser.red.service.persistence.management.v1.processor.exception.NotFoundException;
import com.iqser.red.service.persistence.management.v1.processor.service.persistence.repository.DossierRepository;
@ -151,6 +152,7 @@ public class DictionaryPersistenceService {
type.setAiDescription(typeValueRequest.getAiDescription());
type.setLabel(typeValueRequest.getLabel());
type.setAddToDictionaryAction(typeValueRequest.isAddToDictionaryAction());
type.setRank(typeValueRequest.getRank());
} else {
BeanUtils.copyProperties(typeValueRequest,
type,
@ -356,12 +358,14 @@ public class DictionaryPersistenceService {
@Transactional
public void updateAddToDictionary(String typeId, boolean addToDictionary) {
public void updateAddToDictionary(String typeId, boolean addToDictionaryAction) {
var typeEntity = getType(typeId);
if (typeEntity.isDossierDictionaryOnly()) {
typeEntity.setAddToDictionaryAction(addToDictionary);
typeEntity.setAddToDictionaryAction(addToDictionaryAction);
typeRepository.saveAndFlush(typeEntity);
} else {
throw new BadRequestException("The addToDictionaryAction flag can only be adjusted for dossierDictionaryOnly-types.");
}
}

View File

@ -60,7 +60,7 @@ public class DossierAttributeConfigPersistenceService {
dossierAttributeConfig.setDossierTemplate(dossierTemplate);
if (dossierAttributeConfig.getId() == null) {
dossierAttributeConfig.setId(UUID.randomUUID().toString());
setPlaceholder(dossierAttributeConfig);
checkAndSetPlaceholder(dossierAttributeConfig);
uniqueLabelAndPlaceholder(dossierAttributeConfig);
return dossierAttributeConfigRepository.save(dossierAttributeConfig);
} else {
@ -72,12 +72,15 @@ public class DossierAttributeConfigPersistenceService {
config.setLabel(dossierAttributeConfig.getLabel());
config.setType(dossierAttributeConfig.getType());
config.setEditable(dossierAttributeConfig.isEditable());
setPlaceholder(config);
config.setDisplayedInDossierList(dossierAttributeConfig.isDisplayedInDossierList());
config.setFilterable(dossierAttributeConfig.isFilterable());
checkAndSetPlaceholder(dossierAttributeConfig);
config.setPlaceholder(dossierAttributeConfig.getPlaceholder());
uniqueLabelAndPlaceholder(dossierAttributeConfig);
return dossierAttributeConfigRepository.save(config);
} else {
dossierAttributeConfig.setId(UUID.randomUUID().toString());
setPlaceholder(dossierAttributeConfig);
checkAndSetPlaceholder(dossierAttributeConfig);
uniqueLabelAndPlaceholder(dossierAttributeConfig);
return dossierAttributeConfigRepository.save(dossierAttributeConfig);
}
@ -92,12 +95,25 @@ public class DossierAttributeConfigPersistenceService {
}
private void setPlaceholder(DossierAttributeConfigEntity dossierAttributeConfig) {
private void checkAndSetPlaceholder(DossierAttributeConfigEntity dossierAttributeConfig) {
String placeholder = "{{dossier.attribute." + StringUtils.remove(WordUtils.capitalizeFully(dossierAttributeConfig.getLabel(), ' '), " ") + "}}";
dossierAttributeConfig.setPlaceholder(placeholder);
if (StringUtils.isEmpty(dossierAttributeConfig.getPlaceholder())) {
String placeholder = "{{dossier.attribute." + StringUtils.remove(WordUtils.capitalizeFully(dossierAttributeConfig.getLabel(), ' '), " ") + "}}";
dossierAttributeConfig.setPlaceholder(placeholder);
} else {
uniquePlaceholder(dossierAttributeConfig);
}
}
private void uniquePlaceholder(DossierAttributeConfigEntity dossierAttributesConfig) {
getDossierAttributes(dossierAttributesConfig.getDossierTemplate().getId()).stream()
.filter(d -> !d.getId().equals(dossierAttributesConfig.getId()))
.forEach(other -> {
if (other.getPlaceholder().equalsIgnoreCase(dossierAttributesConfig.getPlaceholder())) {
throw new ConflictException("Placeholder already exists. " + dossierAttributesConfig.getPlaceholder());
}
});
}
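When no placeholder is supplied, `checkAndSetPlaceholder` derives one from the label via `WordUtils.capitalizeFully` plus `StringUtils.remove`. A dependency-free re-implementation for illustration (the derivation rule is taken from the hunk above; the helper name is hypothetical):

```java
// Sketch of the placeholder derivation: capitalize each word of the label,
// strip the spaces, and wrap the result in the template syntax.
public class PlaceholderSketch {

    static String derivePlaceholder(String label) {
        StringBuilder joined = new StringBuilder();
        for (String word : label.split(" ")) {
            if (word.isEmpty()) {
                continue;
            }
            // capitalizeFully semantics: first letter upper, rest lower
            joined.append(Character.toUpperCase(word.charAt(0)))
                  .append(word.substring(1).toLowerCase());
        }
        return "{{dossier.attribute." + joined + "}}";
    }
}
```

For example, a label of `invoice date` would yield `{{dossier.attribute.InvoiceDate}}`; a user-supplied placeholder skips this derivation and is instead checked for uniqueness by `uniquePlaceholder`.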
private void uniqueLabelAndPlaceholder(DossierAttributeConfigEntity dossierAttributesConfig) {

View File

@ -34,19 +34,47 @@ public class DownloadStatusPersistenceService {
// use this to create a status for the dossier template export.
public void createStatus(String userId, String storageId, String filename, String mimeType) {
this.createStatus(userId, storageId, null, filename, mimeType, null, null, null, null);
this.saveDownloadStatus(userId, storageId, null, filename, mimeType, null, null, null, null);
}
@Transactional
public DownloadStatusEntity createStatus(String userId,
String storageId,
DossierEntity dossier,
String filename,
String mimeType,
List<String> fileIds,
Set<DownloadFileType> downloadFileTypes,
List<String> reportTemplateIds,
String redactionPreviewColor) {
String storageId,
DossierEntity dossier,
String filename,
String mimeType,
List<String> fileIds,
Set<DownloadFileType> downloadFileTypes,
List<String> reportTemplateIds,
String redactionPreviewColor) {
DownloadStatusEntity savedDownloadStatus = saveDownloadStatus(userId,
storageId,
dossier,
filename,
mimeType,
fileIds,
downloadFileTypes,
reportTemplateIds,
redactionPreviewColor);
if (fileIds != null && !fileIds.isEmpty()) {
fileRepository.updateLastDownloadForApprovedFiles(fileIds, savedDownloadStatus);
}
return savedDownloadStatus;
}
private DownloadStatusEntity saveDownloadStatus(String userId,
String storageId,
DossierEntity dossier,
String filename,
String mimeType,
List<String> fileIds,
Set<DownloadFileType> downloadFileTypes,
List<String> reportTemplateIds,
String redactionPreviewColor) {
DownloadStatusEntity downloadStatus = new DownloadStatusEntity();
downloadStatus.setUserId(userId);
@ -62,7 +90,7 @@ public class DownloadStatusPersistenceService {
downloadStatus.setRedactionPreviewColor(redactionPreviewColor);
downloadStatus.setStatus(DownloadStatusValue.QUEUED);
return downloadStatusRepository.save(downloadStatus);
return downloadStatusRepository.saveAndFlush(downloadStatus);
}
@ -134,7 +162,8 @@ public class DownloadStatusPersistenceService {
@Transactional
public DownloadStatusEntity getStatusesByUuid(String uuid) {
return downloadStatusRepository.findByUuid(uuid).orElseThrow(() -> new NotFoundException(String.format("DownloadStatus not found for uuid: %s", uuid)));
return downloadStatusRepository.findByUuid(uuid)
.orElseThrow(() -> new NotFoundException(String.format("DownloadStatus not found for uuid: %s", uuid)));
}

View File

@ -1,8 +1,8 @@
package com.iqser.red.service.persistence.management.v1.processor.service.persistence;
import static com.iqser.red.service.persistence.management.v1.processor.service.FileAttributesManagementService.ASCII_ENCODING;
import static com.iqser.red.service.persistence.management.v1.processor.service.FileAttributesManagementService.ISO_ENCODING;
import static com.iqser.red.service.persistence.management.v1.processor.service.FileAttributesManagementService.UTF_ENCODING;
import static com.iqser.red.service.persistence.service.v1.api.external.resource.FileAttributesResource.ASCII_ENCODING;
import static com.iqser.red.service.persistence.service.v1.api.external.resource.FileAttributesResource.ISO_ENCODING;
import static com.iqser.red.service.persistence.service.v1.api.external.resource.FileAttributesResource.UTF_ENCODING;
import java.util.List;
import java.util.Objects;
@ -81,7 +81,7 @@ public class FileAttributeConfigPersistenceService {
@Transactional
public List<FileAttributeConfigEntity> setFileAttributesConfig(String dossierTemplateId, List<FileAttributeConfigEntity> fileAttributesConfig) {
public void setFileAttributesConfig(String dossierTemplateId, List<FileAttributeConfigEntity> fileAttributesConfig) {
Set<String> toSetIds = fileAttributesConfig.stream()
.map(FileAttributeConfigEntity::getId)
@ -98,8 +98,6 @@ public class FileAttributeConfigPersistenceService {
fileAttributesRepository.deleteByFileAttributeConfigId(ctr.getId());
fileAttributeConfigRepository.deleteById(ctr.getId());
});
return getFileAttributes(dossierTemplateId);
}
@ -117,7 +115,7 @@ public class FileAttributeConfigPersistenceService {
}
if (fileAttributeConfig.getId() == null) {
fileAttributeConfig.setId(UUID.randomUUID().toString());
setPlaceholder(fileAttributeConfig);
checkAndSetPlaceholder(fileAttributeConfig);
uniqueLabelAndPlaceholder(fileAttributeConfig);
return fileAttributeConfigRepository.save(fileAttributeConfig);
} else {
@ -133,12 +131,14 @@ public class FileAttributeConfigPersistenceService {
config.setDisplayedInFileList(fileAttributeConfig.isDisplayedInFileList());
config.setPrimaryAttribute(fileAttributeConfig.isPrimaryAttribute());
config.setFilterable(fileAttributeConfig.isFilterable());
setPlaceholder(config);
config.setIncludeInCsvExport(fileAttributeConfig.isIncludeInCsvExport());
checkAndSetPlaceholder(fileAttributeConfig);
config.setPlaceholder(fileAttributeConfig.getPlaceholder());
uniqueLabelAndPlaceholder(fileAttributeConfig);
return fileAttributeConfigRepository.save(config);
} else {
fileAttributeConfig.setId(UUID.randomUUID().toString());
setPlaceholder(fileAttributeConfig);
checkAndSetPlaceholder(fileAttributeConfig);
uniqueLabelAndPlaceholder(fileAttributeConfig);
return fileAttributeConfigRepository.save(fileAttributeConfig);
}
@ -153,12 +153,25 @@ public class FileAttributeConfigPersistenceService {
}
private void setPlaceholder(FileAttributeConfigEntity fileAttributeConfig) {
private void checkAndSetPlaceholder(FileAttributeConfigEntity fileAttributeConfig) {
String placeholder = "{{file.attribute." + StringUtils.remove(WordUtils.capitalizeFully(fileAttributeConfig.getLabel(), ' '), " ") + "}}";
fileAttributeConfig.setPlaceholder(placeholder);
if (StringUtils.isEmpty(fileAttributeConfig.getPlaceholder())) {
String placeholder = "{{file.attribute." + StringUtils.remove(WordUtils.capitalizeFully(fileAttributeConfig.getLabel(), ' '), " ") + "}}";
fileAttributeConfig.setPlaceholder(placeholder);
} else {
uniquePlaceholder(fileAttributeConfig);
}
}
private void uniquePlaceholder(FileAttributeConfigEntity fileAttributeConfig) {
getFileAttributes(fileAttributeConfig.getDossierTemplate().getId()).stream()
.filter(d -> !d.getId().equals(fileAttributeConfig.getId()))
.forEach(other -> {
if (other.getPlaceholder().equalsIgnoreCase(fileAttributeConfig.getPlaceholder())) {
throw new ConflictException("Placeholder already exists. " + fileAttributeConfig.getPlaceholder());
}
});
}
private void uniqueLabelAndPlaceholder(FileAttributeConfigEntity fileAttributeConfig) {

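The `uniquePlaceholder` guard above can be exercised in isolation. A minimal sketch with simplified stand-in types (`AttrConfig` and the local `ConflictException` are illustrative; the real entity and exception live in the service module) — the rule is: another config with a different id but the same placeholder, compared case-insensitively, is a conflict:

```java
import java.util.List;

public class PlaceholderCheckDemo {

    // Simplified stand-ins for FileAttributeConfigEntity / ConflictException.
    record AttrConfig(String id, String placeholder) {}
    static class ConflictException extends RuntimeException {
        ConflictException(String msg) { super(msg); }
    }

    // Same rule as uniquePlaceholder above: skip the candidate's own row,
    // then compare placeholders case-insensitively against the siblings.
    static void uniquePlaceholder(AttrConfig candidate, List<AttrConfig> siblings) {
        siblings.stream()
                .filter(other -> !other.id().equals(candidate.id()))
                .forEach(other -> {
                    if (other.placeholder().equalsIgnoreCase(candidate.placeholder())) {
                        throw new ConflictException("Placeholder already exists.");
                    }
                });
    }

    public static void main(String[] args) {
        List<AttrConfig> existing = List.of(
                new AttrConfig("a1", "{{file.attribute.InvoiceDate}}"));
        // Re-saving the same config is fine: its own row is filtered out.
        uniquePlaceholder(new AttrConfig("a1", "{{file.attribute.invoicedate}}"), existing);
        // A different config reusing the placeholder (any case) conflicts.
        try {
            uniquePlaceholder(new AttrConfig("a2", "{{FILE.ATTRIBUTE.INVOICEDATE}}"), existing);
        } catch (ConflictException e) {
            System.out.println(e.getMessage()); // prints "Placeholder already exists."
        }
    }
}
```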
View File

@ -3,7 +3,6 @@ package com.iqser.red.service.persistence.management.v1.processor.service.persis
import java.time.OffsetDateTime;
import java.time.temporal.ChronoUnit;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Collections;
import java.util.HashSet;
import java.util.List;
@ -603,7 +602,9 @@ public class FileStatusPersistenceService {
public int getNumberOfAssignedFiles(String userId) {
List<FileEntity> files = fileRepository.findFilesByAssignee(userId);
return files.size();
return (int) files.stream()
.filter(fileEntity -> fileEntity.getHardDeletedTime() == null)
.count();
}
@ -691,9 +692,9 @@ public class FileStatusPersistenceService {
}
public void updateLayoutProcessedTime(String fileId) {
public void updateLayoutParserVersionAndProcessedTime(String fileId, String version) {
fileRepository.updateLayoutProcessedTime(fileId, OffsetDateTime.now().truncatedTo(ChronoUnit.MILLIS));
fileRepository.updateLayoutProcessedTime(fileId, version, OffsetDateTime.now().truncatedTo(ChronoUnit.MILLIS));
}
@ -738,11 +739,14 @@ public class FileStatusPersistenceService {
fileRepository.setAiCreationVersion(fileId, aiCreationVersion);
}
public List<FileEntity> findAllByIds(Set<String> fileIds) {
return fileRepository.findAllById(fileIds);
}
public List<FileEntity> findAllDossierIdAndIds(String dossierId, Set<String> fileIds) {
return fileRepository.findAllDossierIdAndIds(dossierId, fileIds);
@ -754,4 +758,11 @@ public class FileStatusPersistenceService {
return fileRepository.findAllByDossierId(dossierId, includeDeleted);
}
@Transactional
public void clearLastDownload(String fileId) {
fileRepository.updateLastDownloadForFile(fileId, null);
}
}
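`updateLayoutParserVersionAndProcessedTime` above stores the processing time truncated to milliseconds. A small sketch of what `truncatedTo(ChronoUnit.MILLIS)` does (the sample instant is invented): sub-millisecond precision is dropped, so the value survives a round trip through a millisecond-precision TIMESTAMP column unchanged.

```java
import java.time.OffsetDateTime;
import java.time.temporal.ChronoUnit;

public class TruncateDemo {
    public static void main(String[] args) {
        // Nanosecond-precision input, as OffsetDateTime.now() can produce.
        OffsetDateTime t = OffsetDateTime.parse("2025-01-16T14:38:17.123456789+01:00");
        // Truncation zeroes everything below milliseconds.
        OffsetDateTime ms = t.truncatedTo(ChronoUnit.MILLIS);
        System.out.println(ms); // prints "2025-01-16T14:38:17.123+01:00"
    }
}
```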

View File

@ -1,96 +0,0 @@
package com.iqser.red.service.persistence.management.v1.processor.service.persistence;
import com.iqser.red.service.persistence.management.v1.processor.entity.annotations.AnnotationEntityId;
import com.iqser.red.service.persistence.management.v1.processor.entity.migration.SaasMigrationStatusEntity;
import com.iqser.red.service.persistence.management.v1.processor.exception.NotFoundException;
import com.iqser.red.service.persistence.management.v1.processor.service.persistence.repository.SaasMigrationStatusRepository;
import com.iqser.red.service.persistence.service.v1.api.shared.model.dossiertemplate.dossier.file.SaasMigrationStatus;
import jakarta.transaction.Transactional;
import lombok.RequiredArgsConstructor;
import org.springframework.stereotype.Service;
import java.util.List;
@Service
@RequiredArgsConstructor
public class SaasMigrationStatusPersistenceService {
private final SaasMigrationStatusRepository saasMigrationStatusRepository;
public List<SaasMigrationStatusEntity> findAllByStatus(SaasMigrationStatus status) {
return saasMigrationStatusRepository.findAllByStatus(status);
}
public SaasMigrationStatusEntity findById(String fileId) {
var migrationStatusOptional = saasMigrationStatusRepository.findById(fileId);
if (migrationStatusOptional.isPresent()) {
return migrationStatusOptional.get();
}
throw new NotFoundException("No migration entry found for fileId: " + fileId);
}
public boolean isMigrating(String fileId) {
var migrationStatusOptional = saasMigrationStatusRepository.findById(fileId);
return migrationStatusOptional.isPresent() && migrationStatusOptional.get().getStatus() != SaasMigrationStatus.FINISHED;
}
public boolean migrationFinishedForTenant() {
return saasMigrationStatusRepository.findAllWhereStatusNotFinishedAndNotError() == 0;
}
@Transactional
public void createMigrationRequiredStatus(String dossierId, String fileId) {
saasMigrationStatusRepository.save(SaasMigrationStatusEntity.builder().fileId(fileId).dossierId(dossierId).status(SaasMigrationStatus.MIGRATION_REQUIRED).build());
}
@Transactional
public void updateStatus(String fileId, SaasMigrationStatus status) {
saasMigrationStatusRepository.updateStatus(fileId, status);
}
@Transactional
public void updateErrorStatus(String fileId, String errorCause) {
saasMigrationStatusRepository.updateErrorStatus(fileId, SaasMigrationStatus.ERROR, errorCause);
}
@Transactional
public void updateErrorCounter(String fileId, Integer processingErrorCounter, String errorCause) {
saasMigrationStatusRepository.updateErrorCounter(fileId, processingErrorCounter, errorCause);
}
public int countByStatus(SaasMigrationStatus status) {
return saasMigrationStatusRepository.countByStatus(status);
}
public int countAll() {
return saasMigrationStatusRepository.countAll();
}
public List<SaasMigrationStatusEntity> findAll() {
return saasMigrationStatusRepository.findAll();
}
}

View File

@ -0,0 +1,14 @@
package com.iqser.red.service.persistence.management.v1.processor.service.persistence.repository;
import java.util.Optional;
import org.springframework.data.jpa.repository.JpaRepository;
import com.iqser.red.service.persistence.management.v1.processor.entity.configuration.AppVersionEntity;
import com.iqser.red.service.persistence.management.v1.processor.entity.configuration.AppVersionEntityKey;
public interface AppVersionRepository extends JpaRepository<AppVersionEntity, AppVersionEntityKey> {
Optional<AppVersionEntity> findTopByOrderByDateDesc();
}

View File

@ -15,7 +15,7 @@ public interface FileAttributeConfigRepository extends JpaRepository<FileAttribu
List<FileAttributeConfigEntity> findByDossierTemplateId(String dossierTemplateId);
@Modifying
@Modifying(clearAutomatically = true)
@Query("update FileAttributeConfigEntity e set e.primaryAttribute = false where e.dossierTemplate.id = :dossierTemplateId")
void updateAllPrimaryAttributeValuesToFalse(@Param("dossierTemplateId") String dossierTemplateId);

View File

@ -12,8 +12,8 @@ import org.springframework.data.jpa.repository.Query;
import org.springframework.data.repository.query.Param;
import com.iqser.red.service.persistence.management.v1.processor.entity.dossier.FileEntity;
import com.iqser.red.service.persistence.management.v1.processor.entity.download.DownloadStatusEntity;
import com.iqser.red.service.persistence.management.v1.processor.entity.projection.DossierStatsFileProjection;
import com.iqser.red.service.persistence.management.v1.processor.service.persistence.projection.DossierChangeProjection;
import com.iqser.red.service.persistence.management.v1.processor.service.persistence.projection.FileChangeProjection;
import com.iqser.red.service.persistence.management.v1.processor.service.persistence.projection.FilePageCountsProjection;
import com.iqser.red.service.persistence.management.v1.processor.service.persistence.projection.FileProcessingStatusProjection;
@ -142,8 +142,7 @@ public interface FileRepository extends JpaRepository<FileEntity, String> {
@Modifying(clearAutomatically = true)
@Query("update FileEntity f set f.processingErrorCounter = :processingErrorCounter "
+ "where f.dossierId in (select fe.dossierId from FileEntity fe inner join DossierEntity d on d.id = fe.dossierId where d.dossierTemplateId = :dossierTemplateId) and f.processingStatus = 'ERROR'")
void updateErrorCounter(@Param("dossierTemplateId") String dossierTemplateId,
@Param("processingErrorCounter") int processingErrorCounter);
void updateErrorCounter(@Param("dossierTemplateId") String dossierTemplateId, @Param("processingErrorCounter") int processingErrorCounter);
@Modifying
@ -401,8 +400,8 @@ public interface FileRepository extends JpaRepository<FileEntity, String> {
@Transactional
@Modifying(clearAutomatically = true)
@Query(value = "update FileEntity f set f.lastLayoutProcessed = :offsetDateTime where f.id = :fileId")
void updateLayoutProcessedTime(@Param("fileId") String fileId, @Param("offsetDateTime") OffsetDateTime offsetDateTime);
@Query(value = "update FileEntity f set f.layoutParserVersion = :version, f.lastLayoutProcessed = :offsetDateTime where f.id = :fileId")
void updateLayoutProcessedTime(@Param("fileId") String fileId, @Param("version") String version, @Param("offsetDateTime") OffsetDateTime offsetDateTime);
@Query("select f.filename from FileEntity f where f.id = :fileId")
@ -467,6 +466,16 @@ public interface FileRepository extends JpaRepository<FileEntity, String> {
List<String> findAllByDossierId(@Param("dossierId") String dossierId, @Param("includeDeleted") boolean includeDeleted);
@Modifying
@Query("UPDATE FileEntity f SET f.lastDownload = :lastDownload WHERE f.id IN :fileIds AND f.workflowStatus = 'APPROVED'")
void updateLastDownloadForApprovedFiles(@Param("fileIds") List<String> fileIds, @Param("lastDownload") DownloadStatusEntity lastDownload);
@Modifying
@Query("UPDATE FileEntity f SET f.lastDownload = :lastDownload WHERE f.id = :fileId")
void updateLastDownloadForFile(@Param("fileId") String fileId, @Param("lastDownload") DownloadStatusEntity lastDownload);
@Query("SELECT f FROM FileEntity f WHERE f.id in :fileIds AND f.dossierId = :dossierId")
List<FileEntity> findAllDossierIdAndIds(@Param("dossierId") String dossierId, @Param("fileIds") Set<String> fileIds);

View File

@ -1,44 +0,0 @@
package com.iqser.red.service.persistence.management.v1.processor.service.persistence.repository;
import com.iqser.red.service.persistence.management.v1.processor.entity.migration.SaasMigrationStatusEntity;
import com.iqser.red.service.persistence.service.v1.api.shared.model.dossiertemplate.dossier.file.SaasMigrationStatus;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.data.jpa.repository.Modifying;
import org.springframework.data.jpa.repository.Query;
import org.springframework.data.repository.query.Param;
import java.util.List;
public interface SaasMigrationStatusRepository extends JpaRepository<SaasMigrationStatusEntity, String> {
List<SaasMigrationStatusEntity> findAllByStatus(SaasMigrationStatus status);
@Modifying
@Query("update SaasMigrationStatusEntity e set e.status = :status where e.fileId = :fileId")
void updateStatus(@Param("fileId") String fileId, @Param("status") SaasMigrationStatus status);
@Modifying
@Query("update SaasMigrationStatusEntity e set e.status = :status, e.errorCause = :errorCause where e.fileId = :fileId")
void updateErrorStatus(@Param("fileId") String fileId, @Param("status") SaasMigrationStatus status, @Param("errorCause") String errorCause);
@Modifying
@Query("update SaasMigrationStatusEntity e set e.processingErrorCounter = :processingErrorCounter, e.errorCause = :errorCause where e.fileId = :fileId")
void updateErrorCounter(@Param("fileId") String fileId, @Param("processingErrorCounter") Integer processingErrorCounter, @Param("errorCause") String errorCause);
@Query("select count(*) from SaasMigrationStatusEntity e where e.status = :status")
int countByStatus(@Param("status") SaasMigrationStatus status);
@Query("select count(*) from SaasMigrationStatusEntity")
int countAll();
@Query("select count(*) from SaasMigrationStatusEntity e where e.status != 'FINISHED' and e.status != 'ERROR'")
int findAllWhereStatusNotFinishedAndNotError();
}

View File

@ -11,14 +11,12 @@ import org.springframework.stereotype.Service;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.iqser.red.service.persistence.management.v1.processor.configuration.MessagingConfiguration;
import com.iqser.red.service.persistence.management.v1.processor.migration.SaasMigrationService;
import com.iqser.red.service.persistence.management.v1.processor.model.websocket.AnalyseStatus;
import com.iqser.red.service.persistence.management.v1.processor.service.FileStatusProcessingUpdateService;
import com.iqser.red.service.persistence.management.v1.processor.service.FileStatusService;
import com.iqser.red.service.persistence.management.v1.processor.service.ImageSimilarityService;
import com.iqser.red.service.persistence.management.v1.processor.service.websocket.WebsocketService;
import com.iqser.red.service.persistence.management.v1.processor.service.layoutparsing.QueueMessageIdentifierService;
import com.iqser.red.service.persistence.management.v1.processor.service.persistence.SaasMigrationStatusPersistenceService;
import com.iqser.red.service.persistence.management.v1.processor.utils.StorageIdUtils;
import com.iqser.red.service.persistence.service.v1.api.shared.model.dossiertemplate.dossier.file.FileErrorInfo;
import com.iqser.red.service.persistence.service.v1.api.shared.model.dossiertemplate.dossier.file.FileType;
@ -40,8 +38,6 @@ public class LayoutParsingFinishedMessageReceiver {
private final FileStatusService fileStatusService;
private final FileStatusProcessingUpdateService fileStatusProcessingUpdateService;
private final ObjectMapper objectMapper;
private final SaasMigrationStatusPersistenceService saasMigrationStatusPersistenceService;
private final SaasMigrationService saasMigrationService;
private final ImageSimilarityService imageSimilarityService;
private final WebsocketService websocketService;
@ -53,24 +49,18 @@ public class LayoutParsingFinishedMessageReceiver {
var dossierId = QueueMessageIdentifierService.parseDossierId(response.identifier());
var fileId = QueueMessageIdentifierService.parseFileId(response.identifier());
log.info("Layout parsing has finished for {}/{} in {}", dossierId, fileId, LayoutParsingQueueNames.LAYOUT_PARSING_RESPONSE_EXCHANGE);
if (saasMigrationStatusPersistenceService.isMigrating(QueueMessageIdentifierService.parseFileId(response.identifier()))) {
saasMigrationService.handleLayoutParsingFinished(QueueMessageIdentifierService.parseDossierId(response.identifier()),
QueueMessageIdentifierService.parseFileId(response.identifier()));
return;
}
var templateId = "";
var storageId = StorageIdUtils.getStorageId(dossierId, fileId, FileType.DOCUMENT_STRUCTURE);
fileStatusService.setStatusAnalyse(QueueMessageIdentifierService.parseDossierId(response.identifier()),
QueueMessageIdentifierService.parseFileId(response.identifier()),
QueueMessageIdentifierService.parsePriority(response.identifier()));
fileStatusService.updateLayoutProcessedTime(QueueMessageIdentifierService.parseFileId(response.identifier()));
fileStatusService.updateLayoutParserVersionAndProcessedTime(QueueMessageIdentifierService.parseFileId(response.identifier()), response.layoutParserVersion());
websocketService.sendAnalysisEvent(dossierId, fileId, AnalyseStatus.LAYOUT_UPDATE, fileStatusService.getStatus(fileId).getNumberOfAnalyses() + 1);
try {
imageSimilarityService.saveImages(templateId, dossierId, fileId);
imageSimilarityService.saveImages("", dossierId, fileId);
} catch (Exception e) {
log.error("Error occurred during save images: {}", e.getMessage(), e);
throw e;
@ -90,13 +80,6 @@ public class LayoutParsingFinishedMessageReceiver {
if (errorCause == null) {
errorCause = "Error occurred during layout parsing!";
}
if (saasMigrationStatusPersistenceService.isMigrating(QueueMessageIdentifierService.parseFileId(analyzeRequest.identifier()))) {
saasMigrationService.handleError(QueueMessageIdentifierService.parseDossierId(analyzeRequest.identifier()),
QueueMessageIdentifierService.parseFileId(analyzeRequest.identifier()),
errorCause,
LayoutParsingQueueNames.LAYOUT_PARSING_REQUEST_EXCHANGE);
return;
}
OffsetDateTime timestamp = failedMessage.getMessageProperties().getHeader(MessagingConfiguration.X_ERROR_INFO_TIMESTAMP_HEADER);
timestamp = timestamp != null ? timestamp : OffsetDateTime.now().truncatedTo(ChronoUnit.MILLIS);

View File

@ -1,53 +0,0 @@
package com.iqser.red.service.persistence.management.v1.processor.service.queue;
import static com.iqser.red.service.persistence.management.v1.processor.configuration.MessagingConfiguration.MIGRATION_DLQ;
import static com.iqser.red.service.persistence.management.v1.processor.configuration.MessagingConfiguration.MIGRATION_REQUEST_QUEUE;
import static com.iqser.red.service.persistence.management.v1.processor.configuration.MessagingConfiguration.MIGRATION_RESPONSE_QUEUE;
import static com.iqser.red.service.persistence.management.v1.processor.configuration.MessagingConfiguration.X_ERROR_INFO_HEADER;
import org.springframework.amqp.core.Message;
import org.springframework.amqp.rabbit.annotation.RabbitListener;
import org.springframework.stereotype.Service;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.iqser.red.service.persistence.management.v1.processor.migration.SaasMigrationService;
import com.iqser.red.service.redaction.v1.model.MigrationRequest;
import com.iqser.red.service.redaction.v1.model.MigrationResponse;
import lombok.RequiredArgsConstructor;
import lombok.SneakyThrows;
import lombok.extern.slf4j.Slf4j;
@Slf4j
@Service
@RequiredArgsConstructor
public class RedactionServiceSaasMigrationMessageReceiver {
private final SaasMigrationService saasMigrationService;
private final ObjectMapper objectMapper;
@SneakyThrows
@RabbitListener(queues = MIGRATION_RESPONSE_QUEUE)
public void receive(MigrationResponse response) {
saasMigrationService.handleEntityLogMigrationFinished(response.getDossierId(), response.getFileId());
log.info("Received message {} in {}", response, MIGRATION_RESPONSE_QUEUE);
}
@SneakyThrows
@RabbitListener(queues = MIGRATION_DLQ)
public void handleDLQMessage(Message failedMessage) {
var migrationRequest = objectMapper.readValue(failedMessage.getBody(), MigrationRequest.class);
String errorCause = failedMessage.getMessageProperties().getHeader(X_ERROR_INFO_HEADER);
if (errorCause == null) {
errorCause = "Error occurred during entityLog migration!";
}
saasMigrationService.handleError(migrationRequest.getDossierId(), migrationRequest.getFileId(), errorCause, MIGRATION_REQUEST_QUEUE);
}
}

View File

@ -0,0 +1,16 @@
package com.iqser.red.service.persistence.management.v1.processor.utils;
import java.util.function.BiConsumer;
import com.iqser.red.service.persistence.management.v1.processor.entity.dossier.DossierAttributeConfigEntity;
import com.iqser.red.service.persistence.service.v1.api.shared.model.dossiertemplate.DossierAttributeConfig;
public class DossierAttributeConfigMapper implements BiConsumer<DossierAttributeConfigEntity, DossierAttributeConfig> {
@Override
public void accept(DossierAttributeConfigEntity dossierAttributeConfigEntity, DossierAttributeConfig dossierAttributeConfig) {
dossierAttributeConfig.setDossierTemplateId(dossierAttributeConfigEntity.getDossierTemplate().getId());
}
}

View File

@ -0,0 +1,16 @@
package com.iqser.red.service.persistence.management.v1.processor.utils;
import java.util.function.BiConsumer;
import com.iqser.red.service.persistence.management.v1.processor.entity.dossier.FileAttributeConfigEntity;
import com.iqser.red.service.persistence.service.v1.api.shared.model.dossiertemplate.dossier.file.FileAttributeConfig;
public class FileAttributeConfigMapper implements BiConsumer<FileAttributeConfigEntity, FileAttributeConfig> {
@Override
public void accept(FileAttributeConfigEntity fileAttributeConfigEntity, FileAttributeConfig fileAttributeConfig) {
fileAttributeConfig.setDossierTemplateId(fileAttributeConfigEntity.getDossierTemplate().getId());
}
}
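Both mapper classes above are `BiConsumer` post-processing hooks: after a generic entity-to-model mapping runs, they copy fields the generic step cannot derive (here, the template id reached through a relation). A minimal sketch with stand-in types (`Template`, `Entity`, `Model` are simplified placeholders for the real classes):

```java
import java.util.function.BiConsumer;

public class MapperHookDemo {

    // Stand-ins for the entity (with a nested relation) and the flat API model.
    record Template(String id) {}
    record Entity(Template dossierTemplate) {}
    static class Model { String dossierTemplateId; }

    // Mirrors FileAttributeConfigMapper.accept: pull the template id out of
    // the relation and set it on the model after the generic mapping ran.
    static final BiConsumer<Entity, Model> MAPPER =
            (entity, model) -> model.dossierTemplateId = entity.dossierTemplate().id();

    public static void main(String[] args) {
        Model model = new Model();
        MAPPER.accept(new Entity(new Template("tmpl-1")), model);
        System.out.println(model.dossierTemplateId); // prints "tmpl-1"
    }
}
```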

View File

@ -20,6 +20,9 @@ public class FileModelMapper implements BiConsumer<FileEntity, FileModel> {
.forEach(fa -> fileModel.getFileAttributes().put(fa.getFileAttributeId().getFileAttributeConfigId(), fa.getValue()));
fileModel.setFileErrorInfo(new FileErrorInfo(fileEntity.getErrorCause(), fileEntity.getErrorQueue(), fileEntity.getErrorService(), fileEntity.getErrorTimestamp(), fileEntity.getErrorCode()));
fileModel.setComponentMappingVersions(getComponentMappingVersions(fileEntity));
if (fileEntity.getLastDownload() != null) {
fileModel.setLastDownloadDate(fileEntity.getLastDownload().getCreationDate());
}
}

View File

@ -1,10 +1,15 @@
package com.iqser.red.service.persistence.management.v1.processor.utils;
import java.net.URLEncoder;
import java.nio.charset.Charset;
import java.nio.charset.IllegalCharsetNameException;
import java.nio.charset.StandardCharsets;
import java.nio.charset.UnsupportedCharsetException;
import org.apache.commons.lang3.StringUtils;
import com.iqser.red.service.persistence.management.v1.processor.exception.BadRequestException;
import lombok.experimental.UtilityClass;
@UtilityClass
@ -30,4 +35,17 @@ public final class StringEncodingUtils {
return result.toString();
}
public static Charset resolveCharset(String encoding) {
try {
return Charset.forName(encoding);
} catch (IllegalCharsetNameException e) {
throw new BadRequestException("Invalid character encoding: " + encoding);
} catch (UnsupportedCharsetException e) {
throw new BadRequestException("Unsupported character encoding: " + encoding);
} catch (IllegalArgumentException e) {
throw new BadRequestException("Encoding can't be null.");
}
}
}
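The catch order in `resolveCharset` above matters: both `IllegalCharsetNameException` and `UnsupportedCharsetException` extend `IllegalArgumentException`, so the broad catch must come last, where it only handles the null-argument case. A runnable sketch, with a plain `RuntimeException` standing in for the service's `BadRequestException`:

```java
import java.nio.charset.Charset;
import java.nio.charset.IllegalCharsetNameException;
import java.nio.charset.UnsupportedCharsetException;

public class CharsetResolveDemo {

    // Mirrors resolveCharset above. Specific subclasses are caught first;
    // the trailing IllegalArgumentException catch is reached only for null.
    static Charset resolve(String encoding) {
        try {
            return Charset.forName(encoding);
        } catch (IllegalCharsetNameException e) {
            throw new RuntimeException("Invalid character encoding: " + encoding);
        } catch (UnsupportedCharsetException e) {
            throw new RuntimeException("Unsupported character encoding: " + encoding);
        } catch (IllegalArgumentException e) {
            throw new RuntimeException("Encoding can't be null.");
        }
    }

    public static void main(String[] args) {
        System.out.println(resolve("UTF-8").name()); // prints "UTF-8"
        try {
            resolve("no such charset!"); // space and '!' are illegal name characters
        } catch (RuntimeException e) {
            System.out.println(e.getMessage());
        }
        try {
            resolve(null);
        } catch (RuntimeException e) {
            System.out.println(e.getMessage()); // prints "Encoding can't be null."
        }
    }
}
```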

View File

@ -242,10 +242,22 @@ databaseChangeLog:
- include:
file: db/changelog/tenant/149-remove-based-on-dict-annotation-id-columns.yaml
- include:
file: db/changelog/tenant/150-add-indexes-across-tables-for-performance.yaml
file: db/changelog/tenant/149-add-indexes-across-tables-for-performance.yaml
- include:
file: db/changelog/tenant/151-add-component-mapping-indexes.yaml
file: db/changelog/tenant/150-add-component-mapping-indexes.yaml
- include:
file: db/changelog/tenant/151-drop-saas-migration-table.changelog.yaml
- include:
file: db/changelog/tenant/152-add-ai-fields-to-entity.yaml
- include:
file: db/changelog/tenant/153-custom-technical-name-change.yaml
- include:
file: db/changelog/tenant/155-add-displayed-and-filterable-to-dossier-attribute-config.yaml
- include:
file: db/changelog/tenant/156-add-last-download-to-file.yaml
- include:
file: db/changelog/tenant/157-add-included-to-csv-export-field.yaml
- include:
file: db/changelog/tenant/158-add-app-version-history-table-and-layout-parser-version-field-to-file.yaml
- include:
file: db/changelog/tenant/159-cleanup-truncated-indices.yaml

View File

@ -2,6 +2,12 @@ databaseChangeLog:
- changeSet:
id: add-indexes-to-file-table
author: timo
preConditions:
- onFail: MARK_RAN
- not:
indexExists:
indexName: file_dossier_id_last_updated_idx
tableName: file
changes:
- createIndex:
columns:
@ -12,6 +18,16 @@ databaseChangeLog:
indexName: file_dossier_id_last_updated_idx
unique: false
tableName: file
- changeSet:
id: add-indexes-to-file-table2
author: timo
preConditions:
- onFail: MARK_RAN
- not:
indexExists:
indexName: file_dossier_id_deleted_hard_deleted_time_idx
tableName: file
changes:
- createIndex:
columns:
- column:

View File

@ -3,6 +3,7 @@ databaseChangeLog:
id: create_index_if_not_exists_idx_file_last_updated
author: Timo
preConditions:
- onFail: MARK_RAN
- not:
indexExists:
indexName: idx_file_last_updated
@ -19,6 +20,7 @@ databaseChangeLog:
id: create_index_if_not_exists_idx_file_deleted_hard_deleted_time
author: Timo
preConditions:
- onFail: MARK_RAN
- not:
indexExists:
indexName: idx_file_deleted_hard_deleted_time
@ -37,6 +39,7 @@ databaseChangeLog:
id: create_index_if_not_exists_idx_file_dossier_id_deleted_hard_deleted_time
author: Timo
preConditions:
- onFail: MARK_RAN
- not:
indexExists:
indexName: idx_file_dossier_id_deleted_hard_deleted_time
@ -57,6 +60,7 @@ databaseChangeLog:
id: create_index_if_not_exists_idx_dossier_last_updated
author: Timo
preConditions:
- onFail: MARK_RAN
- not:
indexExists:
indexName: idx_dossier_last_updated
@ -73,6 +77,7 @@ databaseChangeLog:
id: create_index_if_not_exists_idx_dossier_soft_deleted_time_hard_deleted_time
author: Timo
preConditions:
- onFail: MARK_RAN
- not:
indexExists:
indexName: idx_dossier_soft_deleted_time_hard_deleted_time
@ -91,6 +96,7 @@ databaseChangeLog:
id: create_index_if_not_exists_idx_dossier_id_soft_deleted_time_hard_deleted_time
author: Timo
preConditions:
- onFail: MARK_RAN
- not:
indexExists:
indexName: idx_dossier_id_soft_deleted_time_hard_deleted_time
@ -111,6 +117,7 @@ databaseChangeLog:
id: create_index_if_not_exists_idx_dossier_soft_deleted_time_hard_deleted_time_archived_time
author: Timo
preConditions:
- onFail: MARK_RAN
- not:
indexExists:
indexName: idx_dossier_soft_deleted_time_hard_deleted_time_archived_time
@ -131,6 +138,7 @@ databaseChangeLog:
id: create_index_if_not_exists_idx_dossier_dossier_template_id_soft_deleted_time_hard_deleted_time_archived_time
author: Timo
preConditions:
- onFail: MARK_RAN
- not:
indexExists:
indexName: idx_dossier_dossier_template_id_soft_deleted_time_hard_deleted_time_archived_time
@ -153,6 +161,7 @@ databaseChangeLog:
id: create_index_if_not_exists_idx_notification_preference_user_id_in_app_notifications_enabled
author: Timo
preConditions:
- onFail: MARK_RAN
- not:
indexExists:
indexName: idx_notification_preference_user_id_in_app_notifications_enabled
@ -171,6 +180,7 @@ databaseChangeLog:
id: create_index_if_not_exists_idx_notification_user_id_creation_date_soft_deleted
author: Timo
preConditions:
- onFail: MARK_RAN
- not:
indexExists:
indexName: idx_notification_user_id_creation_date_soft_deleted
@ -191,6 +201,7 @@ databaseChangeLog:
id: create_index_if_not_exists_idx_entity_dossier_template_id
author: Timo
preConditions:
- onFail: MARK_RAN
- not:
indexExists:
indexName: idx_entity_dossier_template_id
@ -207,6 +218,7 @@ databaseChangeLog:
id: create_index_if_not_exists_idx_entity_dossier_id
author: Timo
preConditions:
- onFail: MARK_RAN
- not:
indexExists:
indexName: idx_entity_dossier_id

View File

@ -3,6 +3,7 @@ databaseChangeLog:
id: create_index_if_not_exists_idx_component_mappings_dossier_template_id
author: Timo
preConditions:
- onFail: MARK_RAN
- not:
indexExists:
indexName: idx_component_mappings_dossier_template_id
@ -19,6 +20,7 @@ databaseChangeLog:
id: create_index_if_not_exists_idx_component_mappings_dossier_template_id_name
author: Timo
preConditions:
- onFail: MARK_RAN
- not:
indexExists:
indexName: idx_component_mappings_dossier_template_id_name

View File

@ -0,0 +1,12 @@
databaseChangeLog:
- changeSet:
id: drop-saas_migration_status-if-exists
author: dom
comment: drop saas_migration_status if exists
preConditions:
- onFail: MARK_RAN
- tableExists:
tableName: saas_migration_status
changes:
- dropTable:
tableName: saas_migration_status

View File

@ -11,6 +11,7 @@ databaseChangeLog:
id: technical-name-change-index
author: maverick
preConditions:
- onFail: MARK_RAN
- not:
indexExists:
indexName: idx_legal_basis_mapping_entity_dossier_template_id

View File

@ -0,0 +1,16 @@
databaseChangeLog:
- changeSet:
id: 155-add-displayed-and-filterable-to-dossier-attribute-config
author: maverick
changes:
- addColumn:
columns:
- column:
name: displayed_in_dossier_list
type: BOOLEAN
defaultValueBoolean: false
- column:
name: filterable
type: BOOLEAN
defaultValueBoolean: false
tableName: dossier_attribute_config

View File

@ -0,0 +1,28 @@
databaseChangeLog:
- changeSet:
id: add-last-download-to-file
author: maverick
changes:
- addColumn:
tableName: file
columns:
- column:
name: last_download
type: VARCHAR(255)
constraints:
nullable: true
- changeSet:
id: add-fk-last-download
author: maverick
changes:
- addForeignKeyConstraint:
baseTableName: file
baseColumnNames: last_download
constraintName: fk_file_last_download
referencedTableName: download_status
referencedColumnNames: storage_id
onDelete: SET NULL
onUpdate: NO ACTION
deferrable: false
initiallyDeferred: false
validate: true

View File

@ -0,0 +1,12 @@
databaseChangeLog:
- changeSet:
id: add-included-to-csv-export-field
author: colariu
changes:
- addColumn:
columns:
- column:
name: include_in_csv_export
type: BOOLEAN
defaultValueBoolean: false
tableName: file_attribute_config

View File

@ -0,0 +1,37 @@
databaseChangeLog:
- changeSet:
id: add-app-version-history-table
author: maverick
changes:
- createTable:
tableName: app_version_history
columns:
- column:
name: app_version
type: VARCHAR(128)
constraints:
nullable: false
primaryKey: true
primaryKeyName: app_version_history_pkey
- column:
name: layout_parser_version
type: VARCHAR(128)
constraints:
nullable: false
primaryKey: true
primaryKeyName: app_version_history_pkey
- column:
name: date
type: TIMESTAMP WITHOUT TIME ZONE
constraints:
nullable: false
- changeSet:
id: add-layout-parser-version-to-file
author: maverick
changes:
- addColumn:
tableName: file
columns:
- column:
name: layout_parser_version
type: VARCHAR(128)
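Note that `app_version` and `layout_parser_version` above share `primaryKeyName: app_version_history_pkey`, forming a composite primary key: the pair identifies a row, not either column alone. A rough sketch of the pair-as-key semantic (hypothetical version values, plain Java):

```java
import java.time.Instant;
import java.util.HashMap;
import java.util.Map;

public class CompositeKeySketch {

    // (app_version, layout_parser_version) together identify one history row,
    // mirroring the composite app_version_history_pkey above.
    record VersionKey(String appVersion, String layoutParserVersion) {}

    public static void main(String[] args) {
        Map<VersionKey, Instant> history = new HashMap<>();
        history.put(new VersionKey("2.4.0", "1.7.0"), Instant.parse("2025-01-01T00:00:00Z"));
        // The same app version with a different parser version is a distinct row.
        history.put(new VersionKey("2.4.0", "1.8.0"), Instant.parse("2025-01-15T00:00:00Z"));
        // Re-inserting an existing pair replaces it, as a duplicate PK would be rejected.
        history.put(new VersionKey("2.4.0", "1.7.0"), Instant.parse("2025-01-20T00:00:00Z"));

        System.out.println(history.size()); // two distinct (app, parser) pairs
    }
}
```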

View File

@ -0,0 +1,68 @@
databaseChangeLog:
- changeSet:
id: drop_truncated_idx_dossier_dossier_template_id_soft_deleted_time_hard_deleted_time
author: maverick
preConditions:
- onFail: MARK_RAN
- indexExists:
indexName: idx_dossier_dossier_template_id_soft_deleted_time_hard_deleted_
tableName: dossier
changes:
- dropIndex:
indexName: idx_dossier_dossier_template_id_soft_deleted_time_hard_deleted_
tableName: dossier
- changeSet:
id: create_idx_dossiertemplid_softdeltime_harddeltime_archivtime
author: maverick
preConditions:
- onFail: MARK_RAN
- not:
indexExists:
indexName: idx_dossiertemplid_softdeltime_harddeltime_archivtime
tableName: dossier
changes:
- createIndex:
tableName: dossier
indexName: idx_dossiertemplid_softdeltime_harddeltime_archivtime
columns:
- column:
name: dossier_template_id
- column:
name: soft_deleted_time
- column:
name: hard_deleted_time
- column:
name: archived_time
- changeSet:
id: drop_truncated_idx_notification_preference_user_id_in_app_notifications_enabled
author: maverick
preConditions:
- onFail: MARK_RAN
- indexExists:
indexName: idx_notification_preference_user_id_in_app_notifications_enable
tableName: notification_preference
changes:
- dropIndex:
indexName: idx_notification_preference_user_id_in_app_notifications_enable
tableName: notification_preference
- changeSet:
id: create_idx_notify_pref_userid_app_notif_enabled
author: maverick
preConditions:
- onFail: MARK_RAN
- not:
indexExists:
indexName: idx_notify_pref_userid_app_notif_enabled
tableName: notification_preference
changes:
- createIndex:
tableName: notification_preference
indexName: idx_notify_pref_userid_app_notif_enabled
columns:
- column:
name: user_id
- column:
name: in_app_notifications_enabled
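The indexes dropped above have names that end abruptly (`..._hard_deleted_`, `..._enable`) because PostgreSQL silently truncates identifiers longer than 63 bytes; the replacement changesets create shorter names that survive intact. A sketch of the truncation rule:

```java
public class IdentifierTruncation {

    // PostgreSQL's default NAMEDATALEN of 64 leaves room for 63-byte identifiers;
    // anything longer is cut off without an error.
    static final int MAX_IDENTIFIER_LENGTH = 63;

    static String truncate(String identifier) {
        return identifier.length() <= MAX_IDENTIFIER_LENGTH
                ? identifier
                : identifier.substring(0, MAX_IDENTIFIER_LENGTH);
    }

    public static void main(String[] args) {
        String requested = "idx_dossier_dossier_template_id_soft_deleted_time_hard_deleted_time";
        String actual = truncate(requested);
        System.out.println(actual);           // ends in "hard_deleted_", the name dropped above
        System.out.println(actual.length());  // 63
    }
}
```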

View File

@ -28,13 +28,13 @@ dependencies {
api("org.apache.logging.log4j:log4j-slf4j-impl:2.20.0")
api("net.logstash.logback:logstash-logback-encoder:7.4")
implementation("ch.qos.logback:logback-classic")
implementation("org.liquibase:liquibase-core:4.29.2") // Must be set explicitly, otherwise Spring dependency management downgrades it to 4.20.0
testImplementation("org.springframework.amqp:spring-rabbit-test:3.0.2")
testImplementation("org.springframework.security:spring-security-test:6.0.2")
testImplementation("org.testcontainers:postgresql:1.17.1")
testImplementation("org.springframework.boot:spring-boot-starter-test:3.0.4")
testImplementation("com.yannbriancon:spring-hibernate-query-utils:2.0.0")
testImplementation("com.google.protobuf:protobuf-java:4.27.1")
}
description = "persistence-service-server-v1"

View File

@ -135,11 +135,14 @@ public class Application implements ApplicationContextAware {
@Override
public KeyValues getHighCardinalityKeyValues(ServerRequestObservationContext context) {
// Make sure that KeyValues entries are already sorted by name for better performance
return super.getHighCardinalityKeyValues(context)
.and(getValueFromPathVariableOrRequestParam(context, "dossierId"),
getValueFromPathVariableOrRequestParam(context, "dossierTemplateId"),
getValueFromPathVariableOrRequestParam(context, "fileId"))
.and(getValueFromRequestAttribute(context, "dossierId"),
getValueFromRequestAttribute(context, "dossierTemplateId"),
getValueFromRequestAttribute(context, "fileId"));
}
@ -164,6 +167,13 @@ public class Application implements ApplicationContextAware {
.orElse(""));
}
private KeyValue getValueFromRequestAttribute(ServerRequestObservationContext context, String name) {
Object value = context.getCarrier().getAttribute(name);
return KeyValue.of(name, value == null ? "" : value.toString());
}
};
}

View File

@ -0,0 +1,94 @@
package com.iqser.red.service.peristence.v1.server;
import java.lang.reflect.Type;
import java.util.Map;
import org.springframework.core.MethodParameter;
import org.springframework.http.HttpInputMessage;
import org.springframework.http.converter.HttpMessageConverter;
import org.springframework.web.bind.annotation.ControllerAdvice;
import org.springframework.web.context.request.RequestContextHolder;
import org.springframework.web.context.request.ServletRequestAttributes;
import org.springframework.web.servlet.mvc.method.annotation.RequestBodyAdvice;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;
import jakarta.servlet.http.HttpServletRequest;
import lombok.NonNull;
import lombok.extern.slf4j.Slf4j;
@Slf4j
@ControllerAdvice
public class GenericRequestBodyAdvice implements RequestBodyAdvice {
private final ObjectMapper objectMapper = new ObjectMapper();
@Override
public boolean supports(@NonNull MethodParameter methodParameter, @NonNull Type targetType, @NonNull Class<? extends HttpMessageConverter<?>> converterType) {
return true;
}
@Override
public @NonNull HttpInputMessage beforeBodyRead(@NonNull HttpInputMessage inputMessage,
@NonNull MethodParameter parameter,
@NonNull Type targetType,
@NonNull Class<? extends HttpMessageConverter<?>> converterType) {
return inputMessage;
}
@Override
public @NonNull Object afterBodyRead(@NonNull Object body,
@NonNull HttpInputMessage inputMessage,
@NonNull MethodParameter parameter,
@NonNull Type targetType,
@NonNull Class<? extends HttpMessageConverter<?>> converterType) {
ServletRequestAttributes attrs = (ServletRequestAttributes) RequestContextHolder.getRequestAttributes();
if (attrs == null) {
return body;
}
HttpServletRequest request = attrs.getRequest();
try {
Map<String, Object> bodyMap = objectMapper.convertValue(body, new TypeReference<>() {
});
storeFieldIfPresent(bodyMap, "dossierId", request);
storeFieldIfPresent(bodyMap, "dossierTemplateId", request);
storeFieldIfPresent(bodyMap, "fileId", request);
} catch (Exception ignored) {
// Body could not be converted to a map (e.g. arrays or primitives); skip attribute extraction.
}
return body;
}
@Override
public Object handleEmptyBody(Object body,
@NonNull HttpInputMessage inputMessage,
@NonNull MethodParameter parameter,
@NonNull Type targetType,
@NonNull Class<? extends HttpMessageConverter<?>> converterType) {
return body;
}
private void storeFieldIfPresent(Map<String, Object> bodyMap, String fieldName, HttpServletRequest request) {
if (bodyMap.containsKey(fieldName)) {
Object value = bodyMap.get(fieldName);
if (value != null) {
request.setAttribute(fieldName, value.toString());
}
}
}
}
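The advice above promotes selected body fields to request attributes so the observation convention in `Application` can tag metrics with them. A dependency-free sketch of the `storeFieldIfPresent` idea, assuming a plain `Map` in place of `HttpServletRequest` (hypothetical helper, not from this repository):

```java
import java.util.HashMap;
import java.util.Map;

public class AttributePromotionSketch {

    // Mirrors storeFieldIfPresent above: copy selected body fields into request attributes.
    static void promote(Map<String, Object> body, Map<String, String> requestAttributes, String... fields) {
        for (String field : fields) {
            Object value = body.get(field);
            if (value != null) {
                requestAttributes.put(field, value.toString());
            }
        }
    }

    public static void main(String[] args) {
        Map<String, Object> body = new HashMap<>();
        body.put("dossierId", "d-42");
        body.put("unrelated", "x");

        Map<String, String> attrs = new HashMap<>();
        promote(body, attrs, "dossierId", "dossierTemplateId", "fileId");

        // Only the whitelisted field that was present in the body is promoted.
        System.out.println(attrs);
    }
}
```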

View File

@ -9,6 +9,9 @@ tenant-user-management-service.url: "http://tenant-user-management-service:8080/
logging.pattern.level: "%5p [${spring.application.name},%X{traceId:-},%X{spanId:-}]"
documine:
components:
filesLimit: 150
logging.type: ${LOGGING_TYPE:CONSOLE}
kubernetes.namespace: ${NAMESPACE:default}
@ -139,7 +142,8 @@ fforesight:
ignored-endpoints: [ '/redaction-gateway-v1', '/actuator/health/**',"/redaction-gateway-v1/websocket","/redaction-gateway-v1/websocket/**", '/redaction-gateway-v1/async/download/with-ott/**',
'/internal-api/**', '/redaction-gateway-v1/docs/swagger-ui',
'/redaction-gateway-v1/docs/**','/redaction-gateway-v1/docs',
'/api', '/api/','/api/docs/**','/api/docs','/api/docs/swagger-ui',
'/actuator/prometheus']
enabled: true
springdoc:
base-path: '/api'

View File

@ -1,13 +1,18 @@
package com.iqser.red.service.peristence.v1.server.integration.tests;
import static org.junit.Assert.assertThrows;
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertFalse;
import static org.junit.jupiter.api.Assertions.assertTrue;
import static org.mockito.ArgumentMatchers.any;
import static org.mockito.ArgumentMatchers.anyBoolean;
import static org.mockito.ArgumentMatchers.anyString;
import static org.mockito.Mockito.doAnswer;
import static org.mockito.Mockito.when;
import java.util.ArrayList;
import java.util.List;
import java.util.Set;
import org.apache.commons.lang3.StringUtils;
import org.junit.jupiter.api.Test;
@ -18,8 +23,8 @@ import com.iqser.red.service.peristence.v1.server.integration.client.FileClient;
import com.iqser.red.service.peristence.v1.server.integration.service.DossierTemplateTesterAndProvider;
import com.iqser.red.service.peristence.v1.server.integration.service.DossierTesterAndProvider;
import com.iqser.red.service.peristence.v1.server.integration.service.FileTesterAndProvider;
import com.iqser.red.service.peristence.v1.server.integration.service.TypeProvider;
import com.iqser.red.service.peristence.v1.server.integration.utils.AbstractPersistenceServerServiceTest;
import com.iqser.red.service.persistence.management.v1.processor.acl.custom.dossier.DossierACLService;
import com.iqser.red.service.persistence.management.v1.processor.entity.configuration.LegalBasisEntity;
import com.iqser.red.service.persistence.management.v1.processor.service.persistence.LegalBasisMappingPersistenceService;
import com.iqser.red.service.persistence.service.v1.api.shared.model.DossierTemplateModel;
@ -33,6 +38,8 @@ import com.iqser.red.service.persistence.service.v1.api.shared.model.dossiertemp
import com.iqser.red.service.persistence.service.v1.api.shared.model.warning.ApproveResponse;
import com.iqser.red.service.persistence.service.v1.api.shared.model.warning.WarningType;
import feign.FeignException;
public class ApprovalTest extends AbstractPersistenceServerServiceTest {
@Autowired
@ -44,15 +51,15 @@ public class ApprovalTest extends AbstractPersistenceServerServiceTest {
@Autowired
private DossierTesterAndProvider dossierTesterAndProvider;
@Autowired
private TypeProvider typeProvider;
@Autowired
private FileClient fileClient;
@SpyBean
private LegalBasisMappingPersistenceService legalBasisMappingPersistenceService;
@SpyBean
private DossierACLService dossierACLService;
@Test
public void testApprovalNoWarnings() {
@ -181,4 +188,54 @@ public class ApprovalTest extends AbstractPersistenceServerServiceTest {
assertTrue(approveResponse.getFileWarnings().isEmpty());
}
@Test
void testApprovalWhenDossierHasNoOwner() {
DossierTemplateModel dossierTemplateModel = dossierTemplateTesterAndProvider.provideTestTemplate();
Dossier dossier = dossierTesterAndProvider.provideTestDossier(dossierTemplateModel);
FileStatus file = fileTesterAndProvider.testAndProvideFile(dossier, "some-file");
fileTesterAndProvider.markFileAsProcessed(dossier.getId(), file.getFileId());
EntityLog entityLog = new EntityLog();
when(entityLogService.getEntityLog(anyString(), anyString(), anyBoolean())).thenReturn(entityLog);
List<com.iqser.red.service.persistence.management.v1.processor.service.users.model.User> allUsers = new ArrayList<>();
allUsers.add(com.iqser.red.service.persistence.management.v1.processor.service.users.model.User.builder()
.userId("manageradmin1@test.com")
.email("manageradmin1@test.com")
.isActive(true)
.roles(Set.of(getAllRoles()))
.build());
allUsers.add(com.iqser.red.service.persistence.management.v1.processor.service.users.model.User.builder()
.userId("manageradmin2@test.com")
.email("manageradmin2@test.com")
.isActive(true)
.roles(Set.of("RED_USER"))
.build());
when(usersClient.getAllUsers(false)).thenReturn(allUsers);
when(usersClient.getAllUsers(true)).thenReturn(allUsers);
doAnswer(invocation -> {
Dossier arg = invocation.getArgument(0);
if (dossier.getId().equals(arg.getId())) {
Dossier emptyDossier = new Dossier();
emptyDossier.setId(arg.getId());
return emptyDossier;
} else {
return invocation.callRealMethod();
}
}).when(dossierACLService).enhanceDossierWithACLData(any(Dossier.class));
FeignException ex = assertThrows(FeignException.Conflict.class, () -> {
fileClient.setStatusApproved(dossier.getId(), file.getFileId(), false);
});
assertTrue(ex.getMessage().contains("Dossier has no owner!"));
}
}

View File

@ -19,6 +19,7 @@ import org.mockito.MockedStatic;
import com.iqser.red.persistence.service.v1.external.api.impl.controller.StatusController;
import com.iqser.red.persistence.service.v2.external.api.impl.controller.ComponentControllerV2;
import com.iqser.red.service.persistence.management.v1.processor.exception.NotAllowedException;
import com.iqser.red.service.persistence.management.v1.processor.service.AccessControlService;
import com.iqser.red.service.persistence.management.v1.processor.service.ComponentLogService;
import com.iqser.red.service.persistence.management.v1.processor.service.CurrentApplicationTypeProvider;
import com.iqser.red.service.persistence.management.v1.processor.service.FileStatusService;
@ -59,6 +60,8 @@ public class ComponentControllerV2Test {
private ComponentLog mockComponentLog;
@Mock
private CurrentApplicationTypeProvider currentApplicationTypeProvider;
@Mock
private AccessControlService accessControlService;
@BeforeEach

View File

@ -11,6 +11,7 @@ import java.util.List;
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import com.iqser.red.service.peristence.v1.server.integration.client.DossierTemplateClient;
import com.iqser.red.service.peristence.v1.server.integration.client.DossierTemplateExternalClient;
import com.iqser.red.service.peristence.v1.server.integration.service.DossierTemplateTesterAndProvider;
import com.iqser.red.service.peristence.v1.server.integration.utils.AbstractPersistenceServerServiceTest;
@ -19,6 +20,7 @@ import com.iqser.red.service.persistence.service.v1.api.shared.model.DossierTemp
import com.iqser.red.service.persistence.service.v1.api.shared.model.component.ComponentDefinition;
import com.iqser.red.service.persistence.service.v1.api.shared.model.component.ComponentDefinitionAddRequest;
import com.iqser.red.service.persistence.service.v1.api.shared.model.component.ComponentDefinitionUpdateRequest;
import com.iqser.red.service.persistence.service.v1.api.shared.model.dossiertemplate.CloneDossierTemplateRequest;
import feign.FeignException;
@ -33,6 +35,9 @@ public class ComponentDefinitionTests extends AbstractPersistenceServerServiceTe
@Autowired
private ComponentDefinitionPersistenceService componentDefinitionPersistenceService;
@Autowired
private DossierTemplateClient dossierTemplateClient;
@Test
public void testCreateComponentDefinition() {
@ -294,4 +299,50 @@ public class ComponentDefinitionTests extends AbstractPersistenceServerServiceTe
assertEquals(newOrder.get(2).getRank(), 3);
}
@Test
public void testSoftDeleteComponentDefinitionAndReOrder() { //RED-10628
var dossierTemplate = dossierTemplateTesterAndProvider.provideTestTemplate();
var componentDefinitionAddRequest1 = ComponentDefinitionAddRequest.builder().technicalName("Component 1").displayName("Component 1").description("Description").build();
var componentDefinitionAddRequest2 = ComponentDefinitionAddRequest.builder().technicalName("Component 2").displayName("Component 2").description("Description").build();
var componentDefinitionAddRequest3 = ComponentDefinitionAddRequest.builder().technicalName("Component 3").displayName("Component 3").description("Description").build();
var componentDefinitionAddRequest4 = ComponentDefinitionAddRequest.builder().technicalName("Component 4").displayName("Component 4").description("Description").build();
var componentDefinitionAddRequest5 = ComponentDefinitionAddRequest.builder().technicalName("Component 5").displayName("Component 5").description("Description").build();
var componentDefinitionAddRequest6 = ComponentDefinitionAddRequest.builder().technicalName("Component 6").displayName("Component 6").description("Description").build();
var componentDefinitionAddRequest7 = ComponentDefinitionAddRequest.builder().technicalName("Component 7").displayName("Component 7").description("Description").build();
var response = dossierTemplateExternalClient.createComponents(dossierTemplate.getId(),
List.of(componentDefinitionAddRequest1, componentDefinitionAddRequest2, componentDefinitionAddRequest3,
componentDefinitionAddRequest4, componentDefinitionAddRequest5, componentDefinitionAddRequest6, componentDefinitionAddRequest7));
assertEquals(response.size(), 7);
// delete first component
dossierTemplateExternalClient.deleteComponents(dossierTemplate.getId(), List.of(response.get(0).getId()));
var softDeletedComponent = dossierTemplateExternalClient.getComponent(dossierTemplate.getId(), response.get(0).getId());
assertNotNull(softDeletedComponent.getSoftDeleteTime());
var response1 = dossierTemplateExternalClient.getComponents(dossierTemplate.getId(), false);
assertEquals(response1.size(), 6);
String firstComponentId = response1.get(0).getId();
String secondComponentId = response1.get(1).getId();
String thirdComponentId = response1.get(2).getId();
var newOrder = dossierTemplateExternalClient.reorderComponents(dossierTemplate.getId(), List.of(secondComponentId));
assertEquals(newOrder.size(), 6);
assertEquals(newOrder.get(0).getId(), secondComponentId);
assertEquals(newOrder.get(0).getRank(), 1);
assertEquals(newOrder.get(1).getId(), firstComponentId);
assertEquals(newOrder.get(1).getRank(), 2);
assertEquals(newOrder.get(2).getId(), thirdComponentId);
assertEquals(newOrder.get(2).getRank(), 3);
// clone dossier template
CloneDossierTemplateRequest cdtr = CloneDossierTemplateRequest.builder().name("Clone of " + dossierTemplate.getName()).cloningUserId("user").build();
var clonedDT = dossierTemplateClient.cloneDossierTemplate(dossierTemplate.getId(), cdtr);
assertEquals(clonedDT.getName(), "Clone of " + dossierTemplate.getName());
var response2 = dossierTemplateExternalClient.getComponents(clonedDT.getId(), false);
assertEquals(response2.size(), 6);
}
}

View File

@ -1,5 +1,6 @@
package com.iqser.red.service.peristence.v1.server.integration.tests;
import static com.iqser.red.service.persistence.service.v1.api.external.resource.FileAttributesResource.UTF_ENCODING;
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertThrows;
import static org.junit.jupiter.api.Assertions.assertTrue;
@ -56,7 +57,7 @@ public class ComponentMappingTest extends AbstractPersistenceServerServiceTest {
ComponentMappingMetadataModel componentMappingMetadataModel = dossierTemplateExternalClient.uploadMapping(dossierTemplate.getId(),
mockMultipartFile,
"file",
UTF_ENCODING,
",",
"\"");
@ -81,7 +82,7 @@ public class ComponentMappingTest extends AbstractPersistenceServerServiceTest {
componentMappingMetadataModel.getId(),
updateMockMultipartFile,
"file",
UTF_ENCODING,
",",
"\"");
@ -101,7 +102,7 @@ public class ComponentMappingTest extends AbstractPersistenceServerServiceTest {
IOUtils.toByteArray(new ClassPathResource("files/componentmapping/empty.csv").getInputStream()));
String id = dossierTemplate.getId();
var result = assertThrows(FeignException.class, () -> dossierTemplateExternalClient.uploadMapping(id, mockMultipartFile, "file", UTF_ENCODING, ",", "\""));
assertTrue(result.getMessage().contains("CSV file can not be empty!"));
}

View File

@ -9,17 +9,17 @@ import java.io.IOException;
import java.util.List;
import java.util.Set;
import org.junit.jupiter.api.Disabled;
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.core.io.ClassPathResource;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.iqser.red.persistence.service.v2.external.api.impl.controller.ComponentControllerV2;
import com.iqser.red.service.peristence.v1.server.integration.client.ComponentClient;
import com.iqser.red.service.peristence.v1.server.integration.client.DossierTemplateClient;
import com.iqser.red.service.peristence.v1.server.integration.client.FileClient;
import com.iqser.red.service.peristence.v1.server.integration.service.DossierTesterAndProvider;
import com.iqser.red.service.peristence.v1.server.integration.service.FileTesterAndProvider;
import com.iqser.red.service.peristence.v1.server.integration.service.UserProvider;
import com.iqser.red.service.peristence.v1.server.integration.utils.AbstractPersistenceServerServiceTest;
import com.iqser.red.service.persistence.management.v1.processor.service.FileService;
import com.iqser.red.service.persistence.service.v1.api.shared.model.analysislog.componentlog.ComponentLog;
@ -27,6 +27,7 @@ import com.iqser.red.service.persistence.service.v1.api.shared.model.component.R
import com.iqser.red.service.persistence.service.v2.api.external.model.Component;
import com.iqser.red.service.persistence.service.v2.api.external.model.ComponentValue;
import com.iqser.red.service.persistence.service.v2.api.external.model.EntityReference;
import com.knecon.fforesight.keycloakcommons.security.KeycloakSecurity;
import com.knecon.fforesight.tenantcommons.TenantApplicationType;
import feign.FeignException;
@ -53,7 +54,10 @@ public class ComponentOverrideTest extends AbstractPersistenceServerServiceTest
private FileService fileService;
@Autowired
private ComponentControllerV2 componentControllerV2;
@Autowired
private FileClient fileClient;
@Autowired
private UserProvider userProvider;
@Override
@ -72,6 +76,8 @@ public class ComponentOverrideTest extends AbstractPersistenceServerServiceTest
var file = fileTesterAndProvider.testAndProvideFile(dossier, "filename1");
fileClient.setStatusUnderReview(dossier.getId(), file.getId(), userProvider.getUserId());
System.out.println("DOSSIER TEMPLATE ID: " + dossierTemplate.getId());
System.out.println("DOSSIER ID: " + dossier.getId());
System.out.println("FILE ID: " + file.getId());
@ -193,6 +199,8 @@ public class ComponentOverrideTest extends AbstractPersistenceServerServiceTest
var file = fileTesterAndProvider.testAndProvideFile(dossier, "filename2");
fileClient.setStatusUnderReview(dossier.getId(), file.getId(), userProvider.getUserId());
System.out.println("DOSSIER TEMPLATE ID: " + dossierTemplate.getId());
System.out.println("DOSSIER ID: " + dossier.getId());
System.out.println("FILE ID: " + file.getId());
@ -255,6 +263,8 @@ public class ComponentOverrideTest extends AbstractPersistenceServerServiceTest
// case 2: re-upload file that was deleted before overrides should also not be returned anymore
fileTesterAndProvider.testAndProvideFile(dossier, "filename2");
fileClient.setStatusUnderReview(dossier.getId(), file.getId(), userProvider.getUserId());
var e = assertThrows(FeignException.class, () -> componentClient.getOverrides(dossierTemplate.getId(), dossier.getId(), file.getId()));
assertEquals(e.status(), 404);
fileManagementStorageService.saveComponentLog(dossier.getId(), file.getId(), componentLog);

View File

@ -1183,7 +1183,7 @@ public class DictionaryTest extends AbstractPersistenceServerServiceTest {
var type = dictionaryClient.addType(createTypeValue);
List<TypeValue> customTypes = dictionaryClient.getAllTypes(dossierTemplate.getId(), dossier.getId(), false).getTypes()
.stream()
.filter(t -> !t.isSystemManaged())
.toList();
@ -1213,7 +1213,7 @@ public class DictionaryTest extends AbstractPersistenceServerServiceTest {
@Test
void testRecreateTypeAndCheckMergedDictionary() {
var dossierTemplate = dossierTemplateTesterAndProvider.provideTestTemplate();
var dossier = dossierTesterAndProvider.provideTestDossier(dossierTemplate, "TestDossier");
@ -1240,7 +1240,10 @@ public class DictionaryTest extends AbstractPersistenceServerServiceTest {
UpdateEntries updateEntries = new UpdateEntries(newDossierEntries, allEntries);
dictionaryClient.updateEntries(type.getType(), type.getDossierTemplateId(), updateEntries, dossier.getId(), DictionaryEntryType.ENTRY);
List<DictionaryEntryEntity> deleted = entryRepository.findAll()
.stream()
.filter(DictionaryEntryEntity::isDeleted)
.toList();
assertEquals(3, deleted.size());
var updatedDossierDictionary = dictionaryClient.getDictionaryForType(type.getType(), dossierTemplate.getId(), dossier.getId());
@ -1257,7 +1260,10 @@ public class DictionaryTest extends AbstractPersistenceServerServiceTest {
assertEquals(2, deletedType1Entities.size());
deleted = entryRepository.findAll()
.stream()
.filter(DictionaryEntryEntity::isDeleted)
.toList();
assertEquals(2, deleted.size());
// recreation
@ -1265,7 +1271,10 @@ public class DictionaryTest extends AbstractPersistenceServerServiceTest {
dictionaryClient.addEntry(type2.getType(), type2.getDossierTemplateId(), templateEntries, false, null, DictionaryEntryType.ENTRY);
deleted = entryRepository.findAll()
.stream()
.filter(DictionaryEntryEntity::isDeleted)
.toList();
assertEquals(0, deleted.size());
deletedType1Entities = typeRepository.findAllTypesByDossierTemplateIdOrDossierId(dossierTemplate.getId(), dossier.getId())
@ -1292,6 +1301,72 @@ public class DictionaryTest extends AbstractPersistenceServerServiceTest {
}
@Test
public void testChangeFlagsSuccessful() {
var dossierTemplate = dossierTemplateTesterAndProvider.provideTestTemplate();
var dossier1 = dossierTesterAndProvider.provideTestDossier(dossierTemplate, "Dossier1");
var type1 = dictionaryClient.addType(CreateTypeValue.builder()
.type("test_change_flags_true")
.label("Test Change Flags True")
.hexColor("#123456")
.rank(100)
.hint(false)
.hasDictionary(true)
.caseInsensitive(false)
.recommendation(false)
.addToDictionaryAction(false)
.dossierTemplateId(dossierTemplate.getId())
.dossierDictionaryOnly(true)
.build());
dictionaryClient.getAllTypes(dossierTemplate.getId(), dossier1.getId(), false);
dictionaryClient.changeFlags(type1.getType(), dossierTemplate.getId(), dossier1.getId(), true);
String compositeTypeId1 = type1.getType() + ":" + dossierTemplate.getId() + ":" + dossier1.getId();
var updatedTypeEntity1 = typeRepository.findById(compositeTypeId1)
.orElseThrow(() -> new AssertionError("Type entity not found in repository"));
assertThat(updatedTypeEntity1.isAddToDictionaryAction()).isTrue();
}
@Test
public void testChangeFlagsUnsuccessful() {
var dossierTemplate = dossierTemplateTesterAndProvider.provideTestTemplate();
var dossier2 = dossierTesterAndProvider.provideTestDossier(dossierTemplate, "Dossier2");
var type2 = dictionaryClient.addType(CreateTypeValue.builder()
.type("test_change_flags_false")
.label("Test Change Flags False")
.hexColor("#654321")
.rank(101)
.hint(false)
.hasDictionary(true)
.caseInsensitive(false)
.recommendation(false)
.addToDictionaryAction(false)
.dossierTemplateId(dossierTemplate.getId())
.dossierDictionaryOnly(false)
.build());
dictionaryClient.getAllTypes(dossierTemplate.getId(), dossier2.getId(), false);
assertThatThrownBy(() -> dictionaryClient.changeFlags(type2.getType(), dossierTemplate.getId(), dossier2.getId(), true))
.isInstanceOf(FeignException.BadRequest.class)
.hasMessageContaining("The addToDictionaryAction flag can only be adjusted for dossierDictionaryOnly-types.");
String compositeTypeId2 = type2.getType() + ":" + dossierTemplate.getId() + ":" + dossier2.getId();
var typeEntity2 = typeRepository.findById(compositeTypeId2)
.orElseThrow(() -> new AssertionError("Type entity not found in repository"));
assertThat(typeEntity2.isAddToDictionaryAction()).isFalse();
}
private static final class ListContentWithoutOrderAndWithoutDuplicatesComparator<T> implements Comparator<List<? extends T>> {
@SuppressWarnings("SuspiciousMethodCalls")

View File

@ -51,15 +51,24 @@ public class DossierAttributeTest extends AbstractPersistenceServerServiceTest {
attribute.setEditable(true);
attribute.setType(DossierAttributeType.TEXT);
var created = dossierAttributeConfigClient.addOrUpdateDossierAttributeConfig(dossier.getDossierTemplateId(), attribute);
assertThat(created.getPlaceholder()).isEqualTo("{{dossier.attribute.Test}}");
loadedAttributes = dossierAttributeConfigClient.getDossierAttributesConfig(dossier.getDossierTemplateId());
assertThat(loadedAttributes.getDossierAttributeConfigs()).isNotEmpty();
assertThat(loadedAttributes.getDossierAttributeConfigs().get(0).getDossierTemplateId()).isEqualTo(dossier.getDossierTemplateId());
attribute.setId(created.getId());
attribute.setLabel("updated test");
var updated = dossierAttributeConfigClient.addOrUpdateDossierAttributeConfig(dossier.getDossierTemplateId(), attribute);
assertThat(updated.getLabel()).isEqualTo("updated test");
assertThat(updated.getPlaceholder()).isEqualTo("{{dossier.attribute.UpdatedTest}}");
attribute.setPlaceholder("{{dossier.attribute.MyOwnPlaceholder}}");
updated = dossierAttributeConfigClient.addOrUpdateDossierAttributeConfig(dossier.getDossierTemplateId(), attribute);
assertThat(updated.getLabel()).isEqualTo("updated test");
assertThat(updated.getPlaceholder()).isEqualTo("{{dossier.attribute.MyOwnPlaceholder}}");
dossierAttributeConfigClient.setDossierAttributesConfig(dossier.getDossierTemplateId(), new DossierAttributesConfig());
loadedAttributes = dossierAttributeConfigClient.getDossierAttributesConfig(dossier.getDossierTemplateId());
assertThat(loadedAttributes.getDossierAttributeConfigs()).isEmpty();

View File

@ -4,6 +4,7 @@ import static org.assertj.core.api.Assertions.assertThat;
import static org.mockito.Mockito.when;
import java.io.ByteArrayInputStream;
import java.time.OffsetDateTime;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
@ -57,6 +58,7 @@ import org.junit.jupiter.api.Test;
public class DownloadPreparationTest extends AbstractPersistenceServerServiceTest {
protected static final String USER_ID = "1";
@Autowired
DownloadReportMessageReceiver downloadReportMessageReceiver;
@ -87,8 +89,6 @@ public class DownloadPreparationTest extends AbstractPersistenceServerServiceTes
@Autowired
FileClient fileClient;
@Autowired
DownloadReadyJob downloadReadyJob;
@ -98,14 +98,13 @@ public class DownloadPreparationTest extends AbstractPersistenceServerServiceTes
@Autowired
DownloadCompressionMessageReceiver downloadCompressionMessageReceiver;
DossierWithSingleFile testData;
@BeforeEach
public void createTestData() {
testData = new DossierWithSingleFile();
}
@Test
@SneakyThrows
public void testReceiveDownloadPackage() {
@ -159,7 +158,7 @@ public class DownloadPreparationTest extends AbstractPersistenceServerServiceTes
when(this.tenantsClient.getTenants()).thenReturn(List.of(TenantResponse.builder().tenantId("redaction").details(Map.of("persistence-service-ready", true)).build()));
downloadReadyJob.execute(null); // Will be called by scheduler in prod.
var firstStatus = getFirstStatus();
assertThat(firstStatus.getStatus()).isEqualTo(DownloadStatusValue.COMPRESSING);
downloadCompressionMessageReceiver.receive(DownloadJob.builder().storageId(firstStatus.getStorageId()).build());
firstStatus = getFirstStatus();
@ -169,6 +168,133 @@ public class DownloadPreparationTest extends AbstractPersistenceServerServiceTes
clearTenantContext();
}
@Test
@SneakyThrows
public void testLastDownloadSetForApprovedFile() {
// Arrange
testData.forwardFileToApprovedState();
uploadMockReportTemplate(testData);
var availableTemplates = reportTemplateClient.getAvailableReportTemplates(testData.getDossierTemplateId());
assertThat(availableTemplates).isNotEmpty();
createDossierInService(testData, availableTemplates);
// Act
downloadClient.prepareDownload(PrepareDownloadWithOptionRequest.builder()
.dossierId(testData.getDossierId())
.downloadFileTypes(Set.of(DownloadFileType.ORIGINAL))
.fileIds(Collections.singletonList(testData.file.getId()))
.redactionPreviewColor("#aaaaaa")
.build());
// Trigger the download processing job
setupTenantContext();
when(this.tenantsClient.getTenants()).thenReturn(List.of(TenantResponse.builder().tenantId("redaction").details(Map.of("persistence-service-ready", true)).build()));
downloadReadyJob.execute(null);
clearTenantContext();
// Assert
FileStatus persistedFileStatus = fileClient.getFileStatus(testData.getDossierId(), testData.getFileId());
assertThat(persistedFileStatus.getLastDownload()).isNotNull();
}
@Test
@SneakyThrows
public void testLastDownloadNotSetForNonApprovedFile() {
// Arrange
testData.forwardFile();
uploadMockReportTemplate(testData);
var availableTemplates = reportTemplateClient.getAvailableReportTemplates(testData.getDossierTemplateId());
assertThat(availableTemplates).isNotEmpty();
createDossierInService(testData, availableTemplates);
// Act
downloadClient.prepareDownload(PrepareDownloadWithOptionRequest.builder()
.dossierId(testData.getDossierId())
.downloadFileTypes(Set.of(DownloadFileType.ORIGINAL))
.fileIds(Collections.singletonList(testData.file.getId()))
.redactionPreviewColor("#aaaaaa")
.build());
// Trigger the download processing job
setupTenantContext();
when(this.tenantsClient.getTenants()).thenReturn(List.of(TenantResponse.builder().tenantId("redaction").details(Map.of("persistence-service-ready", true)).build()));
downloadReadyJob.execute(null);
clearTenantContext();
// Assert
FileStatus persistedFileStatus = fileClient.getFileStatus(testData.getDossierId(), testData.getFileId());
assertThat(persistedFileStatus.getLastDownload()).isNull();
}
@Test
@SneakyThrows
public void testLastDownloadResetWhenStatusChangedFromApprovedToDifferent() {
// Arrange
testData.forwardFileToApprovedState();
uploadMockReportTemplate(testData);
var availableTemplates = reportTemplateClient.getAvailableReportTemplates(testData.getDossierTemplateId());
assertThat(availableTemplates).isNotEmpty();
createDossierInService(testData, availableTemplates);
// Prepare and process download to set 'last_download'
downloadClient.prepareDownload(PrepareDownloadWithOptionRequest.builder()
.dossierId(testData.getDossierId())
.downloadFileTypes(Set.of(DownloadFileType.ORIGINAL))
.fileIds(Collections.singletonList(testData.file.getId()))
.redactionPreviewColor("#aaaaaa")
.build());
setupTenantContext();
when(this.tenantsClient.getTenants()).thenReturn(List.of(TenantResponse.builder().tenantId("redaction").details(Map.of("persistence-service-ready", true)).build()));
downloadReadyJob.execute(null);
clearTenantContext();
// Verify that 'last_download' is set
FileStatus persistedFileStatus = fileClient.getFileStatus(testData.getDossierId(), testData.getFileId());
assertThat(persistedFileStatus.getLastDownload()).isNotNull();
// Change status from approved to reviewed
fileClient.setStatusUnderReview(testData.getDossierId(), testData.getFileId(), null);
// Assert that 'last_download' is reset
FileStatus updatedFileStatus = fileClient.getFileStatus(testData.getDossierId(), testData.getFileId());
assertThat(updatedFileStatus.getLastDownload()).isNull();
}
@Test
@SneakyThrows
public void testLastDownloadRemainsWhenStatusChangedFromApprovedToApproved() {
// Arrange
testData.forwardFileToApprovedState();
uploadMockReportTemplate(testData);
var availableTemplates = reportTemplateClient.getAvailableReportTemplates(testData.getDossierTemplateId());
assertThat(availableTemplates).isNotEmpty();
createDossierInService(testData, availableTemplates);
// Prepare and process download to set 'last_download'
downloadClient.prepareDownload(PrepareDownloadWithOptionRequest.builder()
.dossierId(testData.getDossierId())
.downloadFileTypes(Set.of(DownloadFileType.ORIGINAL))
.fileIds(Collections.singletonList(testData.file.getId()))
.redactionPreviewColor("#aaaaaa")
.build());
setupTenantContext();
when(this.tenantsClient.getTenants()).thenReturn(List.of(TenantResponse.builder().tenantId("redaction").details(Map.of("persistence-service-ready", true)).build()));
downloadReadyJob.execute(null);
clearTenantContext();
// Verify that 'last_download' is set
FileStatus persistedFileStatus = fileClient.getFileStatus(testData.getDossierId(), testData.getFileId());
assertThat(persistedFileStatus.getLastDownload()).isNotNull();
// Change status from approved to approved
fileClient.setStatusApproved(testData.getDossierId(), testData.getFileId(), true);
// Assert that 'last_download' remains set
FileStatus updatedFileStatus = fileClient.getFileStatus(testData.getDossierId(), testData.getFileId());
assertThat(updatedFileStatus.getLastDownload()).isNotNull();
}
private void createDossierInService(DossierWithSingleFile testData, List<ReportTemplate> availableTemplates) {
@@ -189,18 +315,16 @@ public class DownloadPreparationTest extends AbstractPersistenceServerServiceTes
.build());
}
private void uploadMockReportTemplate(DossierWithSingleFile testData) {
var template = new MockMultipartFile("test.docx", "zzz".getBytes());
reportTemplateClient.uploadTemplate(template, testData.getDossierTemplateId(), true, true);
}
@SneakyThrows
private void addStoredFileInformationToStorage(FileStatus file, List<ReportTemplate> availableTemplates, String downloadId) {
var storedFileInformationstorageId = downloadId.substring(0, downloadId.length() - 3) + "/REPORT_INFO.json";
var storedFileInformationStorageId = downloadId.substring(0, downloadId.length() - 3) + "/REPORT_INFO.json";
String reportStorageId = "XYZ";
var sivList = new ArrayList<StoredFileInformation>();
@@ -210,11 +334,10 @@ public class DownloadPreparationTest extends AbstractPersistenceServerServiceTes
siv.setTemplateId(availableTemplates.iterator().next().getTemplateId());
sivList.add(siv);
storageService.storeObject(TenantContext.getTenantId(), storedFileInformationstorageId, new ByteArrayInputStream(new ObjectMapper().writeValueAsBytes(sivList)));
storageService.storeObject(TenantContext.getTenantId(), storedFileInformationStorageId, new ByteArrayInputStream(new ObjectMapper().writeValueAsBytes(sivList)));
storageService.storeObject(TenantContext.getTenantId(), reportStorageId, new ByteArrayInputStream(new byte[]{1, 2, 3, 4}));
}
private DownloadStatus getFirstStatus() {
List<DownloadStatus> finalDownloadStatuses = downloadClient.getDownloadStatus().getDownloadStatus();
@@ -232,21 +355,15 @@ public class DownloadPreparationTest extends AbstractPersistenceServerServiceTes
FileStatus file = fileTesterAndProvider.testAndProvideFile(dossier);
public String getDossierTemplateId() {
return dossierTemplate.getId();
}
public String getDossierId() {
return dossier.getId();
}
public String getFileId() {
return file.getFileId();
}
@@ -260,6 +377,12 @@ public class DownloadPreparationTest extends AbstractPersistenceServerServiceTes
assertThatTestFileIsApproved();
}
public void forwardFile() {
fileTesterAndProvider.markFileAsProcessed(getDossierId(), getFileId());
assertThatTestFileIsNew();
}
private void assertThatTestFileIsApproved() {
@@ -267,6 +390,12 @@ public class DownloadPreparationTest extends AbstractPersistenceServerServiceTes
assertThat(persistedFileStatus.getWorkflowStatus()).isEqualTo(WorkflowStatus.APPROVED);
}
private void assertThatTestFileIsNew() {
var persistedFileStatus = fileClient.getFileStatus(getDossierId(), file.getId());
assertThat(persistedFileStatus.getWorkflowStatus()).isEqualTo(WorkflowStatus.NEW);
}
}
}
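The new tests above pin down a single rule: `last_download` is set when an approved file is downloaded, stays set when the file is re-approved, and is cleared on any transition away from the approved state. A minimal sketch of that rule, using hypothetical names (the actual service-side implementation is not part of this diff):

```java
import java.time.OffsetDateTime;

// Hypothetical sketch of the last_download rule exercised by the tests above:
// the timestamp survives only while the file remains APPROVED and is cleared
// on any transition to a different workflow status.
public class LastDownloadRule {

    public static OffsetDateTime afterStatusChange(String newStatus, OffsetDateTime lastDownload) {
        // Re-approving an already approved file keeps the timestamp;
        // moving to UNDER_REVIEW (or any other status) resets it to null.
        return "APPROVED".equals(newStatus) ? lastDownload : null;
    }
}
```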

View File

@@ -1,7 +1,8 @@
package com.iqser.red.service.peristence.v1.server.integration.tests;
import static com.iqser.red.service.persistence.management.v1.processor.service.FileAttributesManagementService.ASCII_ENCODING;
import static com.iqser.red.service.persistence.management.v1.processor.service.FileAttributesManagementService.UTF_ENCODING;
import static com.iqser.red.service.persistence.service.v1.api.external.resource.FileAttributesResource.ASCII_ENCODING;
import static com.iqser.red.service.persistence.service.v1.api.external.resource.FileAttributesResource.ISO_ENCODING;
import static com.iqser.red.service.persistence.service.v1.api.external.resource.FileAttributesResource.UTF_ENCODING;
import static org.assertj.core.api.Assertions.assertThat;
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertThrows;
@@ -73,7 +74,7 @@ public class FileAttributeTest extends AbstractPersistenceServerServiceTest {
var generalConfig = new FileAttributesConfig();
generalConfig.setDelimiter(",");
generalConfig.setEncoding("UTF-8");
generalConfig.setEncoding(UTF_ENCODING);
generalConfig.setFilenameMappingColumnHeaderName("Name");
fileAttributeConfigClient.setFileAttributesConfig(dossier.getDossierTemplateId(), generalConfig);
@@ -87,9 +88,10 @@ public class FileAttributeTest extends AbstractPersistenceServerServiceTest {
assertThat(loadedConfig.getEncoding()).isEqualTo(ASCII_ENCODING);
List<FileAttributeConfig> configs = new ArrayList<>();
String myOwnPlaceholder = "{{file.attribute.MyOwnPlaceholder}}";
configs.add(FileAttributeConfig.builder().csvColumnHeader("Name").primaryAttribute(true).label("Name").build());
configs.add(FileAttributeConfig.builder().csvColumnHeader("Attribute A").primaryAttribute(true).label("Attribute A").build());
FileAttributeConfig attributeB = FileAttributeConfig.builder().csvColumnHeader("Attribute B").primaryAttribute(true).label("Attribute B").build();
FileAttributeConfig attributeB = FileAttributeConfig.builder().csvColumnHeader("Attribute B").primaryAttribute(true).label("Attribute B").placeholder(myOwnPlaceholder).includeInCsvExport(true).build();
configs.add(attributeB);
configs.add(FileAttributeConfig.builder().csvColumnHeader("Attribute C").primaryAttribute(false).label("Attribute C").build());
configs.add(FileAttributeConfig.builder().csvColumnHeader("Attribute D").primaryAttribute(false).label("Attribute D").build());
@@ -103,6 +105,8 @@ public class FileAttributeTest extends AbstractPersistenceServerServiceTest {
.findAny()
.get();
assertThat(primaryAttribute.getLabel()).isEqualTo(attributeB.getLabel());
assertThat(primaryAttribute.getPlaceholder()).isEqualTo(attributeB.getPlaceholder());
assertTrue(primaryAttribute.isIncludeInCsvExport());
fileAttributeConfigClient.deleteFileAttribute(dossier.getDossierTemplateId(),
loadedConfigs.stream()
@@ -121,6 +125,7 @@ public class FileAttributeTest extends AbstractPersistenceServerServiceTest {
loadedConfigs = fileAttributeConfigClient.getFileAttributesConfiguration(dossier.getDossierTemplateId()).getFileAttributeConfigs();
assertThat(loadedConfigs.size()).isEqualTo(3);
assertThat(loadedConfigs.get(0).getDossierTemplateId()).isEqualTo(dossier.getDossierTemplateId());
FileAttributeConfig newConfig = new FileAttributeConfig();
newConfig.setPrimaryAttribute(true);
@@ -128,15 +133,27 @@ public class FileAttributeTest extends AbstractPersistenceServerServiceTest {
var created = fileAttributeConfigClient.addOrUpdateFileAttribute(dossier.getDossierTemplateId(), newConfig);
loadedConfigs = fileAttributeConfigClient.getFileAttributesConfiguration(dossier.getDossierTemplateId()).getFileAttributeConfigs();
assertThat(loadedConfigs.size()).isEqualTo(4);
assertThat(created.getPlaceholder()).isEqualTo("{{file.attribute.TestAttribute}}");
assertThat(created.getCsvColumnHeader()).isNull();
newConfig.setId(created.getId());
newConfig.setLabel("Test Attribute Update");
newConfig.setIncludeInCsvExport(true);
var updated = fileAttributeConfigClient.addOrUpdateFileAttribute(dossier.getDossierTemplateId(), newConfig);
loadedConfigs = fileAttributeConfigClient.getFileAttributesConfiguration(dossier.getDossierTemplateId()).getFileAttributeConfigs();
assertThat(loadedConfigs.size()).isEqualTo(4);
assertThat(updated.getLabel()).isEqualTo("Test Attribute Update");
assertTrue(updated.isIncludeInCsvExport());
assertThat(updated.getPlaceholder()).isEqualTo("{{file.attribute.TestAttributeUpdate}}");
newConfig.setPlaceholder(myOwnPlaceholder);
var e = assertThrows(FeignException.class,
() -> fileAttributeConfigClient.addOrUpdateFileAttribute(dossier.getDossierTemplateId(), newConfig));
assertEquals(409, e.status());
newConfig.setPlaceholder("{{file.attribute.TestAttributeUpdate2}}");
updated = fileAttributeConfigClient.addOrUpdateFileAttribute(dossier.getDossierTemplateId(), newConfig);
assertThat(updated.getPlaceholder()).isEqualTo("{{file.attribute.TestAttributeUpdate2}}");
assertThat(fileClient.getFileStatus(dossier.getId(), file.getId()).getLastFileAttributeChange()).isNull();
fileAttributeClient.setFileAttributes(dossier.getId(), file.getId(), new FileAttributes(Map.of(updated.getId(), "Lorem Ipsum")));
@@ -196,7 +213,7 @@ public class FileAttributeTest extends AbstractPersistenceServerServiceTest {
var generalConfig = new FileAttributesConfig();
generalConfig.setDelimiter(",");
generalConfig.setEncoding("UTF-8");
generalConfig.setEncoding(UTF_ENCODING);
generalConfig.setFilenameMappingColumnHeaderName("Name");
fileAttributeConfigClient.setFileAttributesConfig(dossier.getDossierTemplateId(), generalConfig);
@@ -248,7 +265,7 @@ public class FileAttributeTest extends AbstractPersistenceServerServiceTest {
var generalConfig = new FileAttributesConfig();
generalConfig.setDelimiter(",");
generalConfig.setEncoding("UTF-8");
generalConfig.setEncoding(UTF_ENCODING);
generalConfig.setFilenameMappingColumnHeaderName("Name");
fileAttributeConfigClient.setFileAttributesConfig(dossier.getDossierTemplateId(), generalConfig);
@@ -324,7 +341,7 @@ public class FileAttributeTest extends AbstractPersistenceServerServiceTest {
var generalConfig = new FileAttributesConfig();
generalConfig.setDelimiter(",");
generalConfig.setEncoding("UTF-8");
generalConfig.setEncoding(UTF_ENCODING);
generalConfig.setFilenameMappingColumnHeaderName("Path");
fileAttributeConfigClient.setFileAttributesConfig(dossier.getDossierTemplateId(), generalConfig);
@@ -374,4 +391,104 @@ public class FileAttributeTest extends AbstractPersistenceServerServiceTest {
assertTrue(result.getMessage().contains("Invalid CSV file format: Unterminated quoted field at end of CSV line. Beginning of lost text: [4.636.0,4.363.0,4.363.0\\n]") || result.getMessage().contains("Invalid CSV file format: Unterminiertes Anführungszeichen am Ende einer CSV-Zeile. Anfang des verlorenen Textes: [4.636.0,4.363.0,4.363.0\\n]"));
}
@Test
public void testParsingEncoding() {
var dossier = dossierTesterAndProvider.provideTestDossier();
var generalConfig = new FileAttributesConfig();
generalConfig.setDelimiter(",");
generalConfig.setEncoding("ISO");
generalConfig.setFilenameMappingColumnHeaderName("Name");
var e = assertThrows(FeignException.class,
() -> fileAttributeConfigClient.setFileAttributesConfig(dossier.getDossierTemplateId(), generalConfig));
assertEquals(400, e.status());
generalConfig.setEncoding(ISO_ENCODING);
fileAttributeConfigClient.setFileAttributesConfig(dossier.getDossierTemplateId(), generalConfig);
var loadedConfig = fileAttributeConfigClient.getFileAttributesConfiguration(dossier.getDossierTemplateId());
assertThat(loadedConfig.getEncoding()).isEqualTo(ISO_ENCODING);
}
@SneakyThrows
@Test
public void testUpdatePrimaryAttributeAndVerify() {
var dossier = dossierTesterAndProvider.provideTestDossier();
var file = fileTesterAndProvider.testAndProvideFile(dossier);
List<FileAttributeConfig> initialConfigs = new ArrayList<>();
initialConfigs.add(FileAttributeConfig.builder()
.csvColumnHeader("Attribute A")
.primaryAttribute(true)
.label("Attribute A")
.build());
initialConfigs.add(FileAttributeConfig.builder()
.csvColumnHeader("Attribute B")
.primaryAttribute(false)
.label("Attribute B")
.build());
initialConfigs.add(FileAttributeConfig.builder()
.csvColumnHeader("Attribute C")
.primaryAttribute(false)
.label("Attribute C")
.build());
FileAttributesConfig initialConfig = new FileAttributesConfig();
initialConfig.setDelimiter(",");
initialConfig.setEncoding(UTF_ENCODING);
initialConfig.setFilenameMappingColumnHeaderName("Name");
initialConfig.setFileAttributeConfigs(initialConfigs);
fileAttributeConfigClient.setFileAttributesConfig(dossier.getDossierTemplateId(), initialConfig);
var loadedInitialConfig = fileAttributeConfigClient.getFileAttributesConfiguration(dossier.getDossierTemplateId());
assertThat(loadedInitialConfig.getFileAttributeConfigs()).hasSize(3);
FileAttributeConfig initialPrimary = loadedInitialConfig.getFileAttributeConfigs().stream()
.filter(FileAttributeConfig::isPrimaryAttribute)
.findFirst()
.orElseThrow(() -> new AssertionError("No primary attribute found initially"));
assertThat(initialPrimary.getLabel()).isEqualTo("Attribute A");
List<FileAttributeConfig> updatedConfigs = loadedInitialConfig.getFileAttributeConfigs().stream()
.map(config -> {
if (config.getLabel().equals("Attribute B")) {
config.setPrimaryAttribute(true);
} else if (config.getLabel().equals("Attribute A")) {
config.setPrimaryAttribute(false);
}
return config;
})
.collect(Collectors.toList());
FileAttributesConfig updatedConfig = new FileAttributesConfig();
updatedConfig.setDelimiter(loadedInitialConfig.getDelimiter());
updatedConfig.setEncoding(loadedInitialConfig.getEncoding());
updatedConfig.setFilenameMappingColumnHeaderName(loadedInitialConfig.getFilenameMappingColumnHeaderName());
updatedConfig.setFileAttributeConfigs(updatedConfigs);
fileAttributeConfigClient.setFileAttributesConfig(dossier.getDossierTemplateId(), updatedConfig);
var loadedUpdatedConfig = fileAttributeConfigClient.getFileAttributesConfiguration(dossier.getDossierTemplateId());
assertThat(loadedUpdatedConfig.getFileAttributeConfigs()).hasSize(3);
FileAttributeConfig newPrimary = loadedUpdatedConfig.getFileAttributeConfigs().stream()
.filter(FileAttributeConfig::isPrimaryAttribute)
.findFirst()
.orElseThrow(() -> new AssertionError("No primary attribute found after update"));
assertThat(newPrimary.getLabel()).isEqualTo("Attribute B");
fileAttributeConfigClient.setFileAttributesConfig(dossier.getDossierTemplateId(), loadedUpdatedConfig);
var reloadedConfig = fileAttributeConfigClient.getFileAttributesConfiguration(dossier.getDossierTemplateId());
assertThat(reloadedConfig.getFileAttributeConfigs()).hasSize(3);
FileAttributeConfig verifiedPrimary = reloadedConfig.getFileAttributeConfigs().stream()
.filter(FileAttributeConfig::isPrimaryAttribute)
.findFirst()
.orElseThrow(() -> new AssertionError("No primary attribute found after reapplying configuration"));
assertThat(verifiedPrimary.getLabel()).isEqualTo("Attribute B");
}
}
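The placeholder assertions above imply that, when no placeholder is supplied, the service derives one from the label by stripping spaces ("Test Attribute Update" becomes `{{file.attribute.TestAttributeUpdate}}`), and rejects a duplicate placeholder with a 409. A sketch of the derivation only, with a hypothetical class name:

```java
// Hypothetical sketch of the default-placeholder derivation implied by the
// FileAttributeTest assertions above: spaces are removed from the label and
// the result is wrapped in the {{file.attribute.*}} template syntax.
public class PlaceholderSketch {

    public static String defaultPlaceholder(String label) {
        return "{{file.attribute." + label.replace(" ", "") + "}}";
    }
}
```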

View File

@@ -1,5 +1,6 @@
package com.iqser.red.service.peristence.v1.server.integration.tests;
import static com.iqser.red.service.persistence.service.v1.api.external.resource.FileAttributesResource.UTF_ENCODING;
import static org.assertj.core.api.Assertions.assertThat;
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertNotNull;
@@ -96,10 +97,10 @@ import com.iqser.red.service.persistence.service.v1.api.shared.model.manual.Lega
import com.iqser.red.service.persistence.service.v1.api.shared.model.manual.RecategorizationRequestModel;
import com.iqser.red.service.persistence.service.v1.api.shared.model.manual.RemoveRedactionRequestModel;
import com.iqser.red.service.persistence.service.v1.api.shared.model.notification.NotificationType;
import com.knecon.fforesight.service.layoutparser.internal.api.data.redaction.DocumentPageProto;
import com.knecon.fforesight.service.layoutparser.internal.api.data.redaction.DocumentPositionDataProto;
import com.knecon.fforesight.service.layoutparser.internal.api.data.redaction.DocumentStructureProto;
import com.knecon.fforesight.service.layoutparser.internal.api.data.redaction.DocumentTextDataProto;
import com.iqser.red.service.redaction.v1.server.data.DocumentPageProto;
import com.iqser.red.service.redaction.v1.server.data.DocumentPositionDataProto;
import com.iqser.red.service.redaction.v1.server.data.DocumentStructureProto;
import com.iqser.red.service.redaction.v1.server.data.DocumentTextDataProto;
import com.knecon.fforesight.tenantcommons.TenantContext;
import com.knecon.fforesight.tenantcommons.model.TenantResponse;
@@ -765,7 +766,7 @@ public class FileTest extends AbstractPersistenceServerServiceTest {
fileManagementStorageService.storeObject(dossier.getId(), fileId, FileType.UNTOUCHED, new ByteArrayInputStream("test".getBytes(StandardCharsets.UTF_8)));
when(fileAttributeConfigPersistenceService.getFileAttributesGeneralConfiguration(anyString())).thenReturn(FileAttributesGeneralConfigurationEntity.builder()
.delimiter(",")
.encoding("UTF-8")
.encoding(UTF_ENCODING)
.build());
when(fileAttributeConfigPersistenceService.getFileAttributes(anyString())).thenReturn(Collections.emptyList());
assertThrows(FeignException.class, () -> uploadClient.upload(malformedCsvFile, dossier.getId(), false, false));

View File

@@ -56,7 +56,7 @@ public class ReanalysisTest extends AbstractPersistenceServerServiceTest {
assertThat(loadedFile.getProcessingStatus()).isEqualTo(ProcessingStatus.OCR_PROCESSING_QUEUED);
resetProcessingStatus(file);
reanalysisClient.ocrFile(dossier.getId(), file.getId(), true);
reanalysisClient.ocrFile(dossier.getId(), file.getId(), true, false);
loadedFile = fileClient.getFileStatus(dossier.getId(), file.getId());
assertThat(loadedFile.getProcessingStatus()).isEqualTo(ProcessingStatus.OCR_PROCESSING_QUEUED);
resetProcessingStatus(file);

Some files were not shown because too many files have changed in this diff.