* feat: utility to get file size
* feat: if the backup file is greater than 1 GB, use the latest site backup instead of taking a new one
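The two commits above amount to a small size check before deciding whether to take a fresh backup. A minimal sketch under assumed names (the actual helper in frappe may differ):

```python
import os

ONE_GB = 1024 ** 3  # 1 GB in bytes

def get_file_size(path):
    """Return the size of the file at `path` in bytes."""
    return os.path.getsize(path)

def should_use_latest_backup(path, limit=ONE_GB):
    """True when the backup at `path` exceeds `limit`, meaning the
    latest existing backup should be reused instead of taking a new one."""
    return get_file_size(path) > limit
```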
* fix: remove unwanted import
* chore: fix condition inside offsite backup utils
Signed-off-by: Chinmay D. Pai <chinmaydpai@gmail.com>
* chore: change email field to notify_email
Signed-off-by: Chinmay D. Pai <chinmaydpai@gmail.com>
* chore: fix deepsource issues
Signed-off-by: Chinmay D. Pai <chinmaydpai@gmail.com>
* feat: add offsite_backup support for google drive
Signed-off-by: Chinmay D. Pai <chinmaydpai@gmail.com>
* chore: get recipients from within send_email
pass email_field to send_email instead of calling two functions
Signed-off-by: Chinmay D. Pai <chinmaydpai@gmail.com>
* chore: set flag within validate_file_size
Signed-off-by: Chinmay D. Pai <chinmaydpai@gmail.com>
* feat: get latest file backup when specified
Signed-off-by: Chinmay D. Pai <chinmaydpai@gmail.com>
* chore: fix deepsource issues
Signed-off-by: Chinmay D. Pai <chinmaydpai@gmail.com>
* chore: fix incorrectly spelled dropbox settings
Signed-off-by: Chinmay D. Pai <chinmaydpai@gmail.com>
* chore: implement file backup logic for aws s3
Signed-off-by: Chinmay D. Pai <chinmaydpai@gmail.com>
* chore: fix deepsource issues
Signed-off-by: Chinmay D. Pai <chinmaydpai@gmail.com>
* chore: fix deepsource issues
Signed-off-by: Chinmay D. Pai <chinmaydpai@gmail.com>
Co-authored-by: Chinmay D. Pai <chinmaydpai@gmail.com>
Co-authored-by: Suraj Shetty <13928957+surajshetty3416@users.noreply.github.com>
Currently, there is no way to reset the password for users logging in
through LDAP. I understand that this shouldn't really be handled by
ERPNext, but people have requested resetting the LDAP password from
within the instance itself, and hence we'll allow that now.
Signed-off-by: Chinmay D. Pai <chinmaydpai@gmail.com>
* fix: missing bucket name check when the user makes changes in S3 Backup Settings
Problem: if the user changed the backup limit and then pressed Save, the validate function would raise an error saying the bucket name already exists. Forcing the user to pick a different bucket name just to save unrelated changes would be inconvenient.
So I implemented a flag 'bucket_name_exist' to indicate whether the bucket name exists; if it does not, fall through to creating the bucket.
* fix: extra line removed
* fix: use the head_bucket Boto3 API
1. head_bucket returns 200 OK if the bucket exists and you have permission to access it.
2. Otherwise it raises an error: 403 Forbidden or 404 Not Found.
3. Use bucket_name_exist to decide whether the bucket needs to be created.
Thanks @Mangesh-Khairnar for the suggestions.
Reference:
1. https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/s3.html#S3.Client.head_bucket
2. https://boto3.amazonaws.com/v1/documentation/api/latest/guide/migrations3.html#accessing-a-bucket
Further improvements:
1. Use a 'GET' request to check whether the Access Key ID and Secret Access Key are valid (the AWS free tier allows 20,000 GET requests but only 2,000 LIST requests).
2. Handle the edge case where head_bucket returns an error code other than '403' or '404' with a clearer error prompt.
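The head_bucket decision flow described above can be sketched as a small helper; the error-code mapping is the part worth pinning down. Names here are illustrative, not the actual frappe code:

```python
# The boto3 side would look roughly like:
#
#     try:
#         conn.head_bucket(Bucket=bucket_name)
#         code = None                      # 200 OK: bucket is usable
#     except botocore.exceptions.ClientError as e:
#         code = e.response["Error"]["Code"]

def action_for_head_bucket(error_code):
    """Map a head_bucket outcome to the next step.

    None  -> bucket exists and we can access it: use it
    '404' -> bucket not found: safe to create it
    '403' -> bucket exists but we lack permission: report the error
    '400' -> bad request (e.g. malformed name): report the error
    other -> report the error rather than guessing
    """
    if error_code is None:
        return "use_existing"
    if error_code == "404":
        return "create_bucket"
    return "report_error"
```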
* fix: add an extra line to align with original code style
* Update frappe/integrations/doctype/s3_backup_settings/s3_backup_settings.py
Co-Authored-By: Chinmay Pai <chinmaydpai@gmail.com>
* fix: remove the flag and redundant exception
When we get a 404 error, we can simply create the bucket, because: 1. the bucket name does not exist yet, and 2. we have permission to claim it.
* Update frappe/integrations/doctype/s3_backup_settings/s3_backup_settings.py
Co-Authored-By: Himanshu <himanshuwarekar@yahoo.com>
* Update frappe/integrations/doctype/s3_backup_settings/s3_backup_settings.py
Co-Authored-By: Chinmay Pai <chinmaydpai@gmail.com>
* fix: handle the missing error code '400' - 'Bad Request'
* fix: apply DeepSource analysis suggestions
1. Lines too long
2. Doc lines too long
3. Expected 2 blank lines between class and method
4. Removed an unused variable
Co-authored-by: Chinmay Pai <chinmaydpai@gmail.com>
Co-authored-by: Himanshu <himanshuwarekar@yahoo.com>
* fix: better logging for slack
* chore: cleanup code and get from dict
Signed-off-by: Chinmay D. Pai <chinmaydpai@gmail.com>
Co-authored-by: Chinmay Pai <chinmaydpai@gmail.com>
* fix: sender is not set to the current user
I am not sure if I did it right:
1. Import frappe.
2. If the sender is None, use frappe.session.user to fetch the email address of the current user.
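The fix described above amounts to a small fallback. A sketch with the session lookup factored out as a parameter so it stands alone (in frappe itself this would read `sender = frappe.session.user`):

```python
def resolve_sender(sender, session_user):
    """Fall back to the logged-in user when no explicit sender is given.

    In frappe this corresponds roughly to:
        if sender is None:
            sender = frappe.session.user
    """
    return session_user if sender is None else sender
```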
* fix: use the correct field name in S3 Backup Settings
The field name should be 'notify_email' instead of 'notification_email'.
* style: remove a trailing whitespace
* fix: Assign notify_email instead of calling frappe.db.get_value twice
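The fetch-once pattern from the commit above, sketched with a stand-in for frappe.db.get_value; the comma-separated recipient format is an assumption, not confirmed by the source:

```python
def get_recipients(get_value):
    """Fetch notify_email once, bind it to a local, and reuse it,
    instead of calling the lookup twice. `get_value` stands in for
    frappe.db.get_value (doctype, name, fieldname)."""
    notify_email = get_value("S3 Backup Settings", "S3 Backup Settings", "notify_email")
    return [email.strip() for email in notify_email.split(",")]
```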
* chore: remove default sender and notify_email check
* The default sender is already set in queue.py, so there's no need for a
check inside __init__.py.
* notify_email appears to be a mandatory field, so there's no need to
check whether it has a value (assuming it always will).
Signed-off-by: Chinmay D. Pai <chinmaydpai@gmail.com>
Co-authored-by: Chinmay Pai <chinmaydpai@gmail.com>