Merge branch 'develop' into re-fix

commit b61258fbc0
252 changed files with 12635 additions and 3439 deletions
8  .github/CONTRIBUTING.md (vendored)

@@ -1,12 +1,12 @@
 ### Introduction (first timers)

-Thank you for your interest in raising an Issue with ERPNext. An Issue could mean a bug report or a request for a missing feature. By raising a bug report, you are contributing to the development of ERPNext and this is the first step of participating in the community. Bug reports are very helpful for developers as they quickly fix the issue before other users start facing it.
+Thank you for your interest in raising an Issue with the Frappe Framework. An Issue could mean a bug report or a request for a missing feature. By raising a bug report, you are contributing to the development of the Frappe Framework, and this is the first step of participating in the community. Bug reports are very helpful for developers, as they can quickly fix the issue before other users start facing it.

 Feature requests are also a great way to take the product forward. New ideas can come in any user scenario, and the issue list also acts as a roadmap of future features.

 When you are raising an Issue, you should keep a few things in mind. Remember that the developer does not have access to your machine, so you must give all the information you can while raising an Issue. If you are suggesting a feature, you should be very clear about what you want.

-The Issue list is not the right place to ask a question or start a general discussion. If you want to do that , then the right place is the forum [https://discuss.erpnext.com](https://discuss.erpnext.com).
+The Issue list is not the right place to ask a question or start a general discussion. If you want to do that, then the right place is the forum [https://discuss.frappe.io](https://discuss.frappe.io).

 ### Reply and Closing Policy

@@ -15,8 +15,8 @@ If your issue is not clear or does not meet the guidelines, then it will be clos
 ### General Issue Guidelines

 1. **Search existing Issues:** Before raising an Issue, search whether it has been raised before. Maybe add a 👍 or give additional help by creating a mockup if it is not already created.
-1. **Report each issue separately:** Don't club multiple, unreleated issues in one note.
-1. **Brief:** Please don't include long explanations. Use screenshots and bullet points instead of descriptive paragraphs.
+2. **Report each issue separately:** Don't club multiple, unrelated issues in one note.
+3. **Brief:** Please don't include long explanations. Use screenshots and bullet points instead of descriptive paragraphs.

 ### Bug Report Guidelines
47  .github/ISSUE_TEMPLATE/bug_report.md (vendored, new file)

@@ -0,0 +1,47 @@
+---
+name: Bug report
+about: Report a bug encountered while using the Frappe Framework
+labels: bug
+---
+
+<!--
+Welcome to the Frappe Framework issue tracker! Before creating an issue, please heed the following:
+
+1. This tracker should only be used to report bugs and request features / enhancements to Frappe
+   - For questions and general support, use https://discuss.frappe.io
+   - For documentation issues, refer to https://frappe.io/docs/user/en or the developer cheatsheet https://github.com/frappe/frappe/wiki/Developer-Cheatsheet
+2. Use the search function before creating a new issue. Duplicates will be closed and directed to
+   the original discussion.
+3. When making a bug report, make sure you provide all required information. The easier it is for
+   maintainers to reproduce, the faster it'll be fixed.
+4. If you think you know what the reason for the bug is, share it with us. Maybe put in a PR 😉
+-->
+
+## Description of the issue
+
+## Context information (for bug reports)
+
+**Output of `bench version`**
+```
+(paste here)
+```
+
+## Steps to reproduce the issue
+
+1.
+2.
+3.
+
+### Observed result
+
+### Expected result
+
+### Stacktrace / full error message
+
+```
+(paste here)
+```
+
+## Additional information
+
+OS version / distribution, `Frappe` install method, etc.
28  .github/ISSUE_TEMPLATE/feature_request.md (vendored, new file)

@@ -0,0 +1,28 @@
+---
+name: Feature request
+about: Suggest an idea to improve Frappe
+labels: feature-request
+---
+
+<!--
+Welcome to the Frappe Framework issue tracker! Before creating an issue, please heed the following:
+
+1. This tracker should only be used to report bugs and request features / enhancements to Frappe
+   - For questions and general support, refer to https://discuss.frappe.io
+   - For documentation issues, use https://frappe.io/docs/user/en or the developer cheatsheet https://github.com/frappe/frappe/wiki/Developer-Cheatsheet
+2. Use the search function before creating a new issue. Duplicates will be closed and directed to
+   the original discussion.
+3. When making a feature request, make sure to be as verbose as possible. The better you convey your message, the greater the drive to make it happen.
+-->
+
+**Is your feature request related to a problem? Please describe.**
+A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
+
+**Describe the solution you'd like**
+A clear and concise description of what you want to happen.
+
+**Describe alternatives you've considered**
+A clear and concise description of any alternative solutions or features you've considered.
+
+**Additional context**
+Add any other context or screenshots about the feature request here.
19  .github/ISSUE_TEMPLATE/question-about-using-frappe.md (vendored, new file)

@@ -0,0 +1,19 @@
+---
+name: Question about using Frappe/Frappe Apps
+about: This is not the appropriate channel
+labels: invalid
+---
+
+Please post on our forums:
+
+For questions about using the `Frappe Framework`: https://discuss.frappe.io
+
+For questions about using `ERPNext`: https://discuss.erpnext.com
+
+For questions about using `bench`, probably the best place to start is the [bench repo](https://github.com/frappe/bench)
+
+For documentation issues, use the [Frappe Framework Documentation](https://frappe.io/docs/user/en) or the [developer cheatsheet](https://github.com/frappe/frappe/wiki/Developer-Cheatsheet)
+
+For a slightly outdated yet informative developer guide: https://www.youtube.com/playlist?list=PL3lFfCEoMxvzHtsZHFJ4T3n5yMM3nGJ1W
+
+> **Posts that are not bug reports or feature requests will not be addressed on this issue tracker.**
34  .github/PULL_REQUEST_TEMPLATE.md (vendored)

@@ -1 +1,33 @@
-Please read the [Pull Request Checklist](https://github.com/frappe/erpnext/wiki/Pull-Request-Checklist) to ensure you have everything that is needed to get your contribution merged.
+<!--
+
+Some key notes before you open a PR:
+
+1. Select which branch this PR should be merged into
+2. PR name follows [convention](http://karma-runner.github.io/4.0/dev/git-commit-msg.html)
+3. All tests pass locally, UI and Unit tests
+4. All business logic and validations must be on the server-side
+5. Update necessary Documentation
+6. Put `closes #XXXX` in your comment to auto-close the issue that your PR fixes
+
+Also, if you're new here:
+
+- Documentation Guidelines => https://github.com/frappe/erpnext/wiki/Updating-Documentation
+- Contribution Guide => https://github.com/frappe/frappe/blob/develop/.github/CONTRIBUTING.md
+- Pull Request Checklist => https://github.com/frappe/erpnext/wiki/Pull-Request-Checklist
+
+-->
+
+> Please provide enough information so that others can review your pull request:
+
+<!-- You can skip this if you're fixing a typo or updating existing documentation -->
+
+> Explain the **details** for making this change. What existing problem does the pull request solve?
+
+<!-- Example: When "Adding a function to do X", explain why it is necessary to have a way to do X. -->
+
+> Screenshots/GIFs
+
+<!-- Add images/recordings to better visualize the change: expected/current behaviour -->
7  SECURITY.md (new file)

@@ -0,0 +1,7 @@
+# Security Policy
+
+The Frappe team and community take security issues in the Frappe Framework seriously. To report a security issue, fill out the form at [https://erpnext.com/security/report](https://erpnext.com/security/report).
+
+You can help us make Frappe, and consequently all Frappe-dependent apps like [ERPNext](https://erpnext.com), more secure by following the [Reporting guidelines](https://erpnext.com/security).
+
+We appreciate your efforts to responsibly disclose your findings. We'll endeavor to respond quickly, and will keep you updated throughout the process.
38  cypress/integration/api.js (new file)

@@ -0,0 +1,38 @@
+context('API Resources', () => {
+	before(() => {
+		cy.visit('/login');
+		cy.login();
+		cy.visit('/desk');
+	});
+
+	it('Creates two Comments', () => {
+		cy.create_doc('Comment', {comment_type: 'Comment', content: "hello"});
+		cy.create_doc('Comment', {comment_type: 'Comment', content: "world"});
+	});
+
+	it('Lists the Comments', () => {
+		cy.get_list('Comment')
+			.its('data')
+			.then(data => expect(data.length).to.be.at.least(2));
+
+		cy.get_list('Comment', ['name', 'content'], [['content', '=', 'hello']])
+			.then(body => {
+				expect(body).to.have.property('data');
+				expect(body.data).to.have.lengthOf(1);
+				expect(body.data[0]).to.have.property('content');
+				expect(body.data[0]).to.have.property('name');
+			});
+	});
+
+	it('Gets each Comment', () => {
+		cy.get_list('Comment').then(body => body.data.forEach(comment => {
+			cy.get_doc('Comment', comment.name);
+		}));
+	});
+
+	it('Removes the Comments', () => {
+		cy.get_list('Comment').then(body => body.data.forEach(comment => {
+			cy.remove_doc('Comment', comment.name);
+		}));
+	});
+});
@@ -21,11 +21,11 @@ context('List View', () => {
 			url: 'api/method/frappe.model.workflow.bulk_workflow_approval'
 		}).as('bulk-approval');
 		cy.route({
-			method: 'GET',
-			url: 'api/method/frappe.desk.reportview.get*'
-		}).as('update-list');
+			method: 'POST',
+			url: 'api/method/frappe.desk.reportview.get'
+		}).as('real-time-update');
 		cy.wrap(elements).contains('Approve').click();
-		cy.wait(['@bulk-approval', '@update-list']);
+		cy.wait(['@bulk-approval', '@real-time-update']);
 		cy.get('.list-row-container:visible').should('contain', 'Approved');
 	});
 });
@@ -7,22 +7,27 @@ context('Form', () => {
 	it('add custom column in report', () => {
 		cy.visit('/desk#query-report/Permitted Documents For User');

-		cy.get('#page-query-report input[data-fieldname="user"]').as('input');
-		cy.get('@input').focus().type('test@erpnext.com', { delay: 100 });
-		cy.get('#page-query-report input[data-fieldname="doctype"]').as('input-test');
-		cy.get('@input-test').focus().type('Role', { delay: 100 }).blur();
-		cy.get('.datatable').should('exist');
-		cy.get('button').contains('Menu').click({force: true});
-		cy.get('.dropdown-menu li').contains('Add Column').click({force: true});
-		cy.get('.modal-dialog').should('contain', 'Add Column');
-		cy.get('select[data-fieldname="doctype"]').select("Role", {force: true});
-		cy.get('select[data-fieldname="field"]').select("Role Name", {force: true});
-		cy.get('select[data-fieldname="insert_after"]').select("Name", {force: true});
-		cy.get('button').contains('Submit').click({force: true});
-		cy.get('button').contains('Menu').click({force: true});
-		cy.get('.dropdown-menu li').contains('Save').click({force: true});
-		cy.get('.modal-dialog').should('contain', 'Save Report');
-		cy.get('input[data-fieldname="report_name"]').type("Test Report", {force: true});
-		cy.get('button').contains('Submit').click({force: true});
+		cy.get('div[class="page-form flex"]', {timeout: 60000}).should('have.length', 1).then(() => {
+			cy.get('#page-query-report input[data-fieldname="user"]').as('input');
+			cy.get('@input').focus().type('test@erpnext.com', { delay: 100 });
+
+			cy.get('#page-query-report input[data-fieldname="doctype"]').as('input-test');
+			cy.get('@input-test').focus().type('Role', { delay: 100 }).blur();
+
+			cy.get('.datatable').should('exist');
+			cy.get('button').contains('Menu').click({force: true});
+			cy.get('.dropdown-menu li').contains('Add Column').click({force: true});
+			cy.get('.modal-dialog').should('contain', 'Add Column');
+			cy.get('select[data-fieldname="doctype"]').select("Role", {force: true});
+			cy.get('select[data-fieldname="field"]').select("Role Name", {force: true});
+			cy.get('select[data-fieldname="insert_after"]').select("Name", {force: true});
+			cy.get('button').contains('Submit').click({force: true});
+			cy.get('button').contains('Menu').click({force: true});
+			cy.get('.dropdown-menu li').contains('Save').click({force: true});
+			cy.get('.modal-dialog').should('contain', 'Save Report');
+
+			cy.get('input[data-fieldname="report_name"]').type("Test Report", {delay: 100, force: true});
+			cy.get('button').contains('Submit').click({timeout: 1000, force: true});
+		});
 	});
 });
@@ -59,6 +59,72 @@ Cypress.Commands.add('call', (method, args) => {
 	});
 });

+Cypress.Commands.add('get_list', (doctype, fields=[], filters=[]) => {
+	return cy.window().its('frappe.csrf_token').then(csrf_token => {
+		return cy.request({
+			method: 'GET',
+			url: `/api/resource/${doctype}?fields=${JSON.stringify(fields)}&filters=${JSON.stringify(filters)}`,
+			headers: {
+				'Accept': 'application/json',
+				'X-Frappe-CSRF-Token': csrf_token
+			}
+		}).then(res => {
+			expect(res.status).eq(200);
+			return res.body;
+		});
+	});
+});
+
+Cypress.Commands.add('get_doc', (doctype, name) => {
+	return cy.window().its('frappe.csrf_token').then(csrf_token => {
+		return cy.request({
+			method: 'GET',
+			url: `/api/resource/${doctype}/${name}`,
+			headers: {
+				'Accept': 'application/json',
+				'X-Frappe-CSRF-Token': csrf_token
+			}
+		}).then(res => {
+			expect(res.status).eq(200);
+			return res.body;
+		});
+	});
+});
+
+Cypress.Commands.add('create_doc', (doctype, args) => {
+	return cy.window().its('frappe.csrf_token').then(csrf_token => {
+		return cy.request({
+			method: 'POST',
+			url: `/api/resource/${doctype}`,
+			body: args,
+			headers: {
+				'Accept': 'application/json',
+				'Content-Type': 'application/json',
+				'X-Frappe-CSRF-Token': csrf_token
+			}
+		}).then(res => {
+			expect(res.status).eq(200);
+			return res.body;
+		});
+	});
+});
+
+Cypress.Commands.add('remove_doc', (doctype, name) => {
+	return cy.window().its('frappe.csrf_token').then(csrf_token => {
+		return cy.request({
+			method: 'DELETE',
+			url: `/api/resource/${doctype}/${name}`,
+			headers: {
+				'Accept': 'application/json',
+				'X-Frappe-CSRF-Token': csrf_token
+			}
+		}).then(res => {
+			expect(res.status).eq(202);
+			return res.body;
+		});
+	});
+});
+
+Cypress.Commands.add('create_records', (doc) => {
+	return cy.call('frappe.tests.ui_test_helpers.create_if_not_exists', { doc })
+		.then(r => r.message);
+});
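The custom commands above are thin wrappers over Frappe's `/api/resource` REST endpoints, with `fields` and `filters` JSON-encoded into the query string. As a rough sketch of how that list URL is assembled (the endpoint shape is taken from the diff; the helper name here is illustrative, not a Frappe API):

```python
import json
from urllib.parse import urlencode

def build_list_url(doctype, fields=(), filters=()):
    # JSON-encode fields/filters exactly as the get_list command does,
    # then URL-encode them into the /api/resource query string
    query = urlencode({
        "fields": json.dumps(list(fields)),
        "filters": json.dumps(list(filters)),
    })
    return "/api/resource/{0}?{1}".format(doctype, query)

url = build_list_url("Comment", ["name", "content"], [["content", "=", "hello"]])
print(url)
```

A GET request to such a URL (with the `X-Frappe-CSRF-Token` header, as above) returns a JSON body with a `data` array.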
@@ -23,7 +23,7 @@ if sys.version[0] == '2':
 	reload(sys)
 	sys.setdefaultencoding("utf-8")

-__version__ = '12.0.14'
+__version__ = '12.0.16'
 __title__ = "Frappe Framework"

 local = Local()
@@ -1039,7 +1039,13 @@ def get_newargs(fn, kwargs):
 	if hasattr(fn, 'fnargs'):
 		fnargs = fn.fnargs
 	else:
-		fnargs, varargs, varkw, defaults = inspect.getargspec(fn)
+		try:
+			fnargs, varargs, varkw, defaults = inspect.getargspec(fn)
+		except ValueError:
+			fnargs = inspect.getfullargspec(fn).args
+			varargs = inspect.getfullargspec(fn).varargs
+			varkw = inspect.getfullargspec(fn).varkw
+			defaults = inspect.getfullargspec(fn).defaults

 	newargs = {}
 	for a in kwargs:
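On Python 3, `inspect.getargspec` raises `ValueError` for functions with keyword-only arguments or annotations, which is what the `try/except` above works around. A standalone sketch of the same pattern, calling `getfullargspec` once instead of once per attribute (the `get_newargs`/`greet` names here are illustrative, not Frappe's actual implementation):

```python
import inspect

def get_newargs(fn, kwargs):
    # keep only the kwargs that fn can actually accept, mirroring the shim above
    try:
        spec = inspect.getargspec(fn)      # deprecated; removed in Python 3.11
        fnargs, varkw = spec.args, spec.keywords
    except (AttributeError, ValueError):
        spec = inspect.getfullargspec(fn)  # a single call instead of four
        fnargs, varkw = spec.args + spec.kwonlyargs, spec.varkw

    return {k: v for k, v in kwargs.items() if k in fnargs or varkw}

def greet(name, greeting="hello"):
    return "{0}, {1}".format(greeting, name)

print(get_newargs(greet, {"name": "a", "greeting": "hi", "extra": 1}))
```

Catching `AttributeError` as well keeps the sketch working on interpreters where `getargspec` no longer exists at all.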
@@ -1409,8 +1415,9 @@ def publish_progress(*args, **kwargs):

 	:param percent: Percent progress
 	:param title: Title
-	:param doctype: Optional, for DocType
-	:param name: Optional, for Document name
+	:param doctype: Optional, for document type
+	:param docname: Optional, for document name
+	:param description: Optional description
 	"""
 	import frappe.realtime
 	return frappe.realtime.publish_progress(*args, **kwargs)
@@ -138,7 +138,7 @@ class LoginManager:

 	def post_login(self):
 		self.run_trigger('on_login')
-		self.validate_ip_address()
+		validate_ip_address(self.user)
 		self.validate_hour()
 		self.get_user_info()
 		self.make_session()
@@ -271,28 +271,6 @@ class LoginManager:
 		for method in frappe.get_hooks().get(event, []):
 			frappe.call(frappe.get_attr(method), login_manager=self)

-	def validate_ip_address(self):
-		"""check if IP Address is valid"""
-		user = frappe.get_doc("User", self.user)
-		ip_list = user.get_restricted_ip_list()
-		if not ip_list:
-			return
-
-		bypass_restrict_ip_check = 0
-		# check if two factor auth is enabled
-		enabled = int(frappe.get_system_settings('enable_two_factor_auth') or 0)
-		if enabled:
-			# check if bypass restrict ip is enabled for all users
-			bypass_restrict_ip_check = int(frappe.get_system_settings('bypass_restrict_ip_check_if_2fa_enabled') or 0)
-			if not bypass_restrict_ip_check:
-				# check if bypass restrict ip is enabled for login user
-				bypass_restrict_ip_check = int(frappe.db.get_value('User', self.user, 'bypass_restrict_ip_check_if_2fa_enabled') or 0)
-		for ip in ip_list:
-			if frappe.local.request_ip.startswith(ip) or bypass_restrict_ip_check:
-				return
-
-		frappe.throw(_("Not allowed from this IP Address"), frappe.AuthenticationError)
-
 	def validate_hour(self):
 		"""check if user is logging in during restricted hours"""
 		login_before = int(frappe.db.get_value('User', self.user, 'login_before', ignore=True) or 0)
@@ -416,3 +394,25 @@ def check_consecutive_login_attempts(user, doc):
 			.format(doc.allow_login_after_fail), frappe.SecurityException)
 	else:
 		delete_login_failed_cache(user)

+def validate_ip_address(user):
+	"""check if IP Address is valid"""
+	user = frappe.get_cached_doc("User", user) if not frappe.flags.in_test else frappe.get_doc("User", user)
+	ip_list = user.get_restricted_ip_list()
+	if not ip_list:
+		return
+
+	system_settings = frappe.get_cached_doc("System Settings") if not frappe.flags.in_test else frappe.get_single("System Settings")
+	# check if bypass restrict ip is enabled for all users
+	bypass_restrict_ip_check = system_settings.bypass_restrict_ip_check_if_2fa_enabled
+
+	# check if two factor auth is enabled
+	if system_settings.enable_two_factor_auth and not bypass_restrict_ip_check:
+		# check if bypass restrict ip is enabled for login user
+		bypass_restrict_ip_check = user.bypass_restrict_ip_check_if_2fa_enabled
+
+	for ip in ip_list:
+		if frappe.local.request_ip.startswith(ip) or bypass_restrict_ip_check:
+			return
+
+	frappe.throw(_("Access not allowed from this IP Address"), frappe.AuthenticationError)
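Stripped of the Frappe plumbing, the refactored `validate_ip_address` reduces to a prefix match against the user's restricted IP list, with an optional 2FA-based bypass. A minimal sketch of that decision (function and argument names are illustrative, not the Frappe API):

```python
def ip_allowed(request_ip, restricted_ips, bypass_for_2fa=False):
    # no restriction list configured means every IP is allowed
    if not restricted_ips:
        return True
    # a prefix match lets an entry like "192.168.1." cover a whole range
    return bypass_for_2fa or any(request_ip.startswith(ip) for ip in restricted_ips)

print(ip_allowed("192.168.1.7", ["192.168.1."]))
print(ip_allowed("10.0.0.5", ["192.168.1."]))
```

When the sketch returns `False`, the real function raises `frappe.AuthenticationError` instead.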
@@ -18,6 +18,8 @@
 		"unassign_condition",
 		"section_break_10",
 		"close_condition",
+		"sb",
+		"assignment_days",
 		"assign_to_users_section",
 		"rule",
 		"users",
@@ -115,9 +117,21 @@
 			"fieldname": "close_condition",
 			"fieldtype": "Code",
 			"label": "Close Condition"
 		},
+		{
+			"fieldname": "sb",
+			"fieldtype": "Section Break",
+			"label": "Assignment Days"
+		},
+		{
+			"fieldname": "assignment_days",
+			"fieldtype": "Table",
+			"label": "Assignment Days",
+			"options": "Assignment Rule Day",
+			"reqd": 1
+		}
 	],
-	"modified": "2019-09-10 14:45:53.657667",
+	"modified": "2019-09-25 14:52:12.214514",
 	"modified_by": "Administrator",
 	"module": "Automation",
 	"name": "Assignment Rule",
@@ -8,8 +8,16 @@ import frappe
 from frappe.model.document import Document
 from frappe.desk.form import assign_to
+import frappe.cache_manager
+from frappe import _

 class AssignmentRule(Document):
+	def validate(self):
+		assignment_days = self.get_assignment_days()
+		if not len(set(assignment_days)) == len(assignment_days):
+			repeated_days = get_repeated(assignment_days)
+			frappe.throw(_("Assignment Day {0} has been repeated.").format(frappe.bold(repeated_days)))
+
 	def on_update(self): # pylint: disable=no-self-use
 		frappe.cache_manager.clear_doctype_map('Assignment Rule', self.name)
@@ -118,6 +126,17 @@ class AssignmentRule(Document):

 		return False

+	def get_assignment_days(self):
+		return [d.day for d in self.get('assignment_days', [])]
+
+	def is_rule_not_applicable_today(self):
+		today = frappe.flags.assignment_day or frappe.utils.get_weekday()
+		assignment_days = self.get_assignment_days()
+		if assignment_days and today not in assignment_days:
+			return True
+
+		return False
+
 def get_assignments(doc):
 	return frappe.get_all('ToDo', fields = ['name', 'assignment_rule'], filters = dict(
 		reference_type = doc.get('doctype'),
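`is_rule_not_applicable_today` gates a rule on the current weekday name, and an empty day list means the rule applies every day. The same check in standalone form, inverted for readability (names are illustrative; Frappe's `get_weekday` returns names like `'Monday'`):

```python
import datetime

def rule_applies_today(assignment_days, today=None):
    # an empty assignment_days list means the rule runs every day
    today = today or datetime.date.today().strftime("%A")
    return not assignment_days or today in assignment_days

print(rule_applies_today(["Monday", "Friday"], today="Friday"))
print(rule_applies_today(["Monday"], today="Sunday"))
print(rule_applies_today([], today="Sunday"))
```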
@@ -181,6 +200,9 @@ def apply(doc, method=None, doctype=None, name=None):
 	# so when the value switches from L1 to L2, L1 team must be unassigned, then L2 can be assigned.
 	clear = False
 	for assignment_rule in assignment_rule_docs:
+		if assignment_rule.is_rule_not_applicable_today():
+			continue
+
 		clear = assignment_rule.apply_unassign(doc, assignments)
 		if clear:
 			break
@@ -188,6 +210,9 @@ def apply(doc, method=None, doctype=None, name=None):
 	# apply rule only if there are no existing assignments
 	if clear:
 		for assignment_rule in assignment_rule_docs:
+			if assignment_rule.is_rule_not_applicable_today():
+				continue
+
 			new_apply = assignment_rule.apply_assign(doc)
 			if new_apply:
 				break
@@ -196,6 +221,9 @@ def apply(doc, method=None, doctype=None, name=None):
 	assignments = get_assignments(doc)
 	if assignments:
 		for assignment_rule in assignment_rule_docs:
+			if assignment_rule.is_rule_not_applicable_today():
+				continue
+
 			if not new_apply:
 				reopen = reopen_closed_assignment(doc)
 				if reopen:
@@ -207,3 +235,14 @@ def apply(doc, method=None, doctype=None, name=None):

 def get_assignment_rules():
 	return [d.document_type for d in frappe.db.get_all('Assignment Rule', fields=['document_type'], filters=dict(disabled = 0))]

+def get_repeated(values):
+	unique_list = []
+	diff = []
+	for value in values:
+		if value not in unique_list:
+			unique_list.append(str(value))
+		else:
+			if value not in diff:
+				diff.append(str(value))
+	return " ".join(diff)
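`get_repeated` returns a space-joined string of the values that occur more than once. The same contract can be expressed a little more directly with `collections.Counter` (a sketch, not the committed implementation; it reports duplicates in first-seen rather than first-repeat order):

```python
from collections import Counter

def get_repeated(values):
    # report each value that appears more than once, exactly once
    counts = Counter(values)
    repeated = []
    for value in values:
        if counts[value] > 1 and str(value) not in repeated:
            repeated.append(str(value))
    return " ".join(repeated)

print(get_repeated(["Monday", "Tuesday", "Monday", "Friday", "Friday"]))
```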
@@ -6,10 +6,21 @@ from __future__ import unicode_literals
 import frappe
 import unittest
 from frappe.utils import random_string
 from frappe.test_runner import make_test_records

 class TestAutoAssign(unittest.TestCase):
 	def setUp(self):
-		self.assignment_rule = get_assignment_rule()
 		make_test_records("User")
+		days = [
+			dict(day = 'Sunday'),
+			dict(day = 'Monday'),
+			dict(day = 'Tuesday'),
+			dict(day = 'Wednesday'),
+			dict(day = 'Thursday'),
+			dict(day = 'Friday'),
+			dict(day = 'Saturday'),
+		]
+		self.assignment_rule = get_assignment_rule([days, days])
 		clear_assignments()

 	def test_round_robin(self):
@@ -142,21 +153,52 @@ class TestAutoAssign(unittest.TestCase):
 			status = 'Open'
 		), 'owner'), 'test@example.com')

+	def check_assignment_rule_scheduling(self):
+		frappe.db.sql("DELETE FROM `tabAssignment Rule`")
+
+		days_1 = [dict(day = 'Sunday'), dict(day = 'Monday'), dict(day = 'Tuesday')]
+		days_2 = [dict(day = 'Wednesday'), dict(day = 'Thursday'), dict(day = 'Friday'), dict(day = 'Saturday')]
+
+		get_assignment_rule([days_1, days_2], ['public == 1', 'public == 1'])
+
+		frappe.flags.assignment_day = "Monday"
+		note = make_note(dict(public=1))
+
+		self.assertIn(frappe.db.get_value('ToDo', dict(
+			reference_type = 'Note',
+			reference_name = note.name,
+			status = 'Open'
+		), 'owner'), ['test@example.com', 'test1@example.com', 'test2@example.com'])
+
+		frappe.flags.assignment_day = "Friday"
+		note = make_note(dict(public=1))
+
+		self.assertIn(frappe.db.get_value('ToDo', dict(
+			reference_type = 'Note',
+			reference_name = note.name,
+			status = 'Open'
+		), 'owner'), ['test3@example.com'])
+
 def clear_assignments():
 	frappe.db.sql("delete from tabToDo where reference_type = 'Note'")

-def get_assignment_rule():
+def get_assignment_rule(days, assign=None):
 	frappe.delete_doc_if_exists('Assignment Rule', 'For Note 1')

+	if not assign:
+		assign = ['public == 1', 'notify_on_login == 1']
+
 	assignment_rule = frappe.get_doc(dict(
 		name = 'For Note 1',
 		doctype = 'Assignment Rule',
 		priority = 0,
 		document_type = 'Note',
-		assign_condition = 'public == 1',
+		assign_condition = assign[0],
 		unassign_condition = 'public == 0 or notify_on_login == 1',
 		close_condition = '"Closed" in content',
 		rule = 'Round Robin',
+		assignment_days = days[0],
 		users = [
 			dict(user = 'test@example.com'),
 			dict(user = 'test1@example.com'),
@@ -172,15 +214,15 @@ def get_assignment_rule():
 		doctype = 'Assignment Rule',
 		priority = 1,
 		document_type = 'Note',
-		assign_condition = 'notify_on_login == 1',
+		assign_condition = assign[1],
 		unassign_condition = 'notify_on_login == 0',
 		rule = 'Round Robin',
+		assignment_days = days[1],
 		users = [
 			dict(user = 'test3@example.com')
 		]
 	)).insert()

 	return assignment_rule

 def make_note(values=None):
@@ -0,0 +1,28 @@
+{
+ "creation": "2019-09-21 16:52:01.705351",
+ "doctype": "DocType",
+ "editable_grid": 1,
+ "engine": "InnoDB",
+ "field_order": [
+  "day"
+ ],
+ "fields": [
+  {
+   "fieldname": "day",
+   "fieldtype": "Select",
+   "in_list_view": 1,
+   "label": "Day",
+   "options": "Monday\nTuesday\nWednesday\nThursday\nFriday\nSaturday\nSunday"
+  }
+ ],
+ "istable": 1,
+ "modified": "2019-09-21 16:55:09.376291",
+ "modified_by": "Administrator",
+ "module": "Automation",
+ "name": "Assignment Rule Day",
+ "owner": "Administrator",
+ "permissions": [],
+ "sort_field": "modified",
+ "sort_order": "DESC",
+ "track_changes": 1
+}
@@ -1,10 +1,10 @@
 # -*- coding: utf-8 -*-
-# Copyright (c) 2015, Frappe Technologies and contributors
+# Copyright (c) 2019, Frappe Technologies and contributors
 # For license information, please see license.txt

 from __future__ import unicode_literals
-import frappe
+# import frappe
 from frappe.model.document import Document

-class TagDocCategory(Document):
+class AssignmentRuleDay(Document):
 	pass
@@ -293,6 +293,33 @@ def import_csv(context, path, only_insert=False, submit_after_import=False, igno
 	frappe.destroy()

+@click.command('data-import')
+@click.option('--file', 'file_path', type=click.Path(), required=True, help="Path to import file (.csv, .xlsx)")
+@click.option('--doctype', type=str, required=True)
+@click.option('--type', 'import_type', type=click.Choice(['Insert', 'Update'], case_sensitive=False), default='Insert', help="Insert New Records or Update Existing Records")
+@click.option('--submit-after-import', default=False, is_flag=True, help='Submit document after importing it')
+@click.option('--mute-emails', default=True, is_flag=True, help='Mute emails during import')
+@pass_context
+def data_import(context, file_path, doctype, import_type=None, submit_after_import=False, mute_emails=True):
+	"Import documents in bulk from CSV or XLSX using data import"
+	from frappe.core.doctype.data_import.importer_new import Importer
+	site = get_site(context)
+
+	frappe.init(site=site)
+	frappe.connect()
+
+	data_import = frappe.new_doc('Data Import Beta')
+	data_import.submit_after_import = submit_after_import
+	data_import.mute_emails = mute_emails
+	data_import.import_type = 'Insert New Records' if import_type.lower() == 'insert' else 'Update Existing Records'
+
+	i = Importer(doctype=doctype, file_path=file_path, data_import=data_import, console=True)
+	i.import_data()
+
+	frappe.destroy()

 @click.command('bulk-rename')
 @click.argument('doctype')
 @click.argument('path')
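The new `data-import` command accepts `--type` case-insensitively (via `click.Choice(..., case_sensitive=False)`) and maps it onto the two `Data Import Beta` import-type values. That mapping, isolated from the CLI plumbing (the helper name is illustrative; the doctype values are taken from the diff):

```python
def resolve_import_type(import_type):
    # --type is case-insensitive, so normalise before mapping to the
    # Data Import Beta import_type values
    if import_type.lower() == 'insert':
        return 'Insert New Records'
    return 'Update Existing Records'

print(resolve_import_type('Insert'))
print(resolve_import_type('update'))
```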
@@ -715,6 +742,7 @@ commands = [
 	export_json,
 	get_version,
 	import_csv,
+	data_import,
 	import_doc,
 	make_app,
 	mysql,
@@ -47,7 +47,8 @@ def load_address_and_contact(doc, key=None):
 		contact["phone_nos"] = frappe.get_list("Contact Phone", filters={
 			"parenttype": "Contact",
 			"parent": contact.name,
-			"is_primary": 0
+			"is_primary_phone": 0,
+			"is_primary_mobile_no": 0
 		}, fields=["phone"])

 		if contact.address:
@@ -21,6 +21,7 @@
 		"designation",
 		"gender",
 		"phone",
+		"mobile_no",
 		"company_name",
 		"image",
 		"sb_00",
@@ -192,9 +193,15 @@
  {
   "fieldname": "phone_nos",
   "fieldtype": "Table",
-  "label": "Phone Nos",
+  "label": "Contact Numbers",
   "options": "Contact Phone"
  },
+ {
+  "fieldname": "mobile_no",
+  "fieldtype": "Data",
+  "label": "Mobile No",
+  "read_only": 1
+ },
  {
   "default": "0",
   "fieldname": "pulled_from_google_contacts",
@@ -238,8 +245,8 @@
 "icon": "fa fa-user",
 "idx": 1,
 "image_field": "image",
-"modified": "2019-09-13 15:50:38.999884",
-"modified_by": "himanshu@erpnext.com",
+"modified": "2019-10-10 22:04:41.070479",
+"modified_by": "Administrator",
 "module": "Contacts",
 "name": "Contact",
 "name_case": "Title Case",
@@ -29,11 +29,9 @@ class Contact(Document):
				break

	def validate(self):
-		self.set_primary("email_id", "email_ids")
-		self.set_primary("phone", "phone_nos")
-
-		if self.email_id:
-			self.email_id = self.email_id.strip()
+		self.set_primary_email()
+		self.set_primary("phone")
+		self.set_primary("mobile_no")

		self.set_user()
@@ -79,24 +77,46 @@ class Contact(Document):
		if autosave:
			self.save(ignore_permissions=True)

-	def add_phone(self, phone, is_primary=0, autosave=False):
+	def add_phone(self, phone, is_primary_phone=0, is_primary_mobile_no=0, autosave=False):
		self.append("phone_nos", {
			"phone": phone,
-			"is_primary": is_primary
+			"is_primary_phone": is_primary_phone,
+			"is_primary_mobile_no": is_primary_mobile_no
		})

		if autosave:
			self.save(ignore_permissions=True)

-	def set_primary(self, fieldname, child_table):
-		if len(self.get(child_table)) == 1:
-			self.get(child_table)[0].is_primary = 1
-			setattr(self, fieldname, self.get(child_table)[0].get(fieldname))
-		else:
-			for d in self.get(child_table):
-				if d.is_primary == 1:
-					setattr(self, fieldname, d.get(fieldname))
-					break
+	def set_primary_email(self):
+		if not self.email_ids:
+			self.email_id = ""
+			return
+
+		if len([email.email_id for email in self.email_ids if email.is_primary]) > 1:
+			frappe.throw(_("Only one {0} can be set as primary.").format(frappe.bold("Email ID")))
+
+		for d in self.email_ids:
+			if d.is_primary == 1:
+				self.email_id = d.email_id.strip()
+				break
+
+	def set_primary(self, fieldname):
+		# Used to set primary mobile and phone no.
+		if len(self.phone_nos) == 0:
+			setattr(self, fieldname, "")
+			return
+
+		field_name = "is_primary_" + fieldname
+
+		is_primary = [phone.phone for phone in self.phone_nos if phone.get(field_name)]
+
+		if len(is_primary) > 1:
+			frappe.throw(_("Only one {0} can be set as primary.").format(frappe.bold(frappe.unscrub(fieldname))))
+
+		for d in self.phone_nos:
+			if d.get(field_name) == 1:
+				setattr(self, fieldname, d.phone)
+				break

def get_default_contact(doctype, name):
	'''Returns default contact for the given doctype, name'''
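The new `Contact.set_primary` picks the row flagged `is_primary_<fieldname>` out of the `phone_nos` child table and rejects more than one primary per field. A standalone sketch of that selection logic, using plain dicts in place of child-table rows (the helper name is illustrative):

```python
def pick_primary(phone_nos, fieldname):
    # Sketch of the primary-number selection in Contact.set_primary:
    # empty table clears the value, more than one flagged row is an error,
    # otherwise the flagged row's phone wins.
    if not phone_nos:
        return ""
    flag = "is_primary_" + fieldname
    primaries = [d["phone"] for d in phone_nos if d.get(flag)]
    if len(primaries) > 1:
        raise ValueError("Only one {0} can be set as primary.".format(fieldname))
    return primaries[0] if primaries else None

rows = [
    {"phone": "+91 0000000001", "is_primary_phone": 0, "is_primary_mobile_no": 1},
    {"phone": "+91 0000000002", "is_primary_phone": 1, "is_primary_mobile_no": 0},
]
print(pick_primary(rows, "phone"))      # +91 0000000002
print(pick_primary(rows, "mobile_no"))  # +91 0000000001
```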
@@ -5,8 +5,51 @@ from __future__ import unicode_literals

import frappe
import unittest

-test_records = frappe.get_test_records('Contact')
+from frappe.exceptions import ValidationError

class TestContact(unittest.TestCase):
-	pass
+	def test_check_default_email(self):
+		emails = [
+			{"email": "test1@example.com", "is_primary": 0},
+			{"email": "test2@example.com", "is_primary": 0},
+			{"email": "test3@example.com", "is_primary": 0},
+			{"email": "test4@example.com", "is_primary": 1},
+			{"email": "test5@example.com", "is_primary": 0},
+		]
+		contact = create_contact("Email", "Mr", emails=emails)
+
+		self.assertEqual(contact.email_id, "test4@example.com")
+
+	def test_check_default_phone_and_mobile(self):
+		phones = [
+			{"phone": "+91 0000000000", "is_primary_phone": 0, "is_primary_mobile_no": 0},
+			{"phone": "+91 0000000001", "is_primary_phone": 0, "is_primary_mobile_no": 0},
+			{"phone": "+91 0000000002", "is_primary_phone": 1, "is_primary_mobile_no": 0},
+			{"phone": "+91 0000000003", "is_primary_phone": 0, "is_primary_mobile_no": 1},
+		]
+		contact = create_contact("Phone", "Mr", phones=phones)
+
+		self.assertEqual(contact.phone, "+91 0000000002")
+		self.assertEqual(contact.mobile_no, "+91 0000000003")
+
+def create_contact(name, salutation, emails=None, phones=None, save=True):
+	doc = frappe.get_doc({
+		"doctype": "Contact",
+		"first_name": name,
+		"status": "Open",
+		"salutation": salutation
+	})
+
+	if emails:
+		for d in emails:
+			doc.add_email(d.get("email"), d.get("is_primary"))
+
+	if phones:
+		for d in phones:
+			doc.add_phone(d.get("phone"), d.get("is_primary_phone"), d.get("is_primary_mobile_no"))
+
+	if save:
+		doc.insert()
+
+	return doc
@@ -1,19 +1,39 @@
[
-	{
-		"doctype": "Contact",
-		"salutation": "Mr",
-		"email_id": "test_contact@example.com",
-		"first_name": "_Test Contact For _Test Customer",
-		"is_primary_contact": 1,
-		"phone": "+91 0000000000",
-		"status": "Open"
-	},
-	{
-		"doctype": "Contact",
-		"email_id": "test_contact@example.com",
-		"first_name": "_Test Contact For _Test Supplier",
-		"is_primary_contact": 1,
-		"phone": "+91 0000000000",
-		"status": "Open"
-	}
+	{
+		"doctype": "Contact",
+		"salutation": "Mr",
+		"first_name": "_Test Contact For _Test Customer",
+		"is_primary_contact": 1,
+		"status": "Open",
+		"email_ids": [
+			{
+				"email_id": "test_contact@example.com",
+				"is_primary": 1
+			}
+		],
+		"phone_nos": [
+			{
+				"phone": "+91 0000000000",
+				"is_primary_phone": 1
+			}
+		]
+	},
+	{
+		"doctype": "Contact",
+		"first_name": "_Test Contact For _Test Supplier",
+		"is_primary_contact": 1,
+		"status": "Open",
+		"email_ids": [
+			{
+				"email_id": "test_contact@example.com",
+				"is_primary": 1
+			}
+		],
+		"phone_nos": [
+			{
+				"phone": "+91 0000000000",
+				"is_primary_phone": 1
+			}
+		]
+	}
]
@@ -13,9 +13,11 @@
   "fieldtype": "Data",
   "in_list_view": 1,
   "label": "Email ID",
-  "options": "Email"
+  "options": "Email",
+  "reqd": 1
  },
  {
+  "columns": 2,
   "default": "0",
   "fieldname": "is_primary",
   "fieldtype": "Check",

@@ -24,7 +26,7 @@
  }
 ],
 "istable": 1,
-"modified": "2019-08-02 13:14:22.193463",
+"modified": "2019-09-24 17:47:30.565805",
 "modified_by": "Administrator",
 "module": "Contacts",
 "name": "Contact Email",
@@ -5,25 +5,36 @@
 "engine": "InnoDB",
 "field_order": [
  "phone",
-  "is_primary"
+  "is_primary_phone",
+  "is_primary_mobile_no"
 ],
 "fields": [
-  {
-   "default": "0",
-   "fieldname": "is_primary",
-   "fieldtype": "Check",
-   "in_list_view": 1,
-   "label": "Is Primary"
-  },
  {
   "fieldname": "phone",
   "fieldtype": "Data",
   "in_list_view": 1,
-   "label": "Phone"
+   "label": "Number",
+   "reqd": 1
  },
+ {
+  "columns": 2,
+  "default": "0",
+  "fieldname": "is_primary_phone",
+  "fieldtype": "Check",
+  "in_list_view": 1,
+  "label": "Is Primary Phone"
+ },
+ {
+  "columns": 2,
+  "default": "0",
+  "fieldname": "is_primary_mobile_no",
+  "fieldtype": "Check",
+  "in_list_view": 1,
+  "label": "Is Primary Mobile"
+ }
 ],
 "istable": 1,
-"modified": "2019-08-05 11:40:59.104224",
+"modified": "2019-09-24 17:47:50.375326",
 "modified_by": "Administrator",
 "module": "Contacts",
 "name": "Contact Phone",
@@ -7,7 +7,7 @@ import frappe
from frappe import _

field_map = {
-	"Contact": ["first_name", "last_name", "address", "phone", "email_id", "is_primary_contact"],
+	"Contact": ["first_name", "last_name", "address", "phone", "mobile_no", "email_id", "is_primary_contact"],
	"Address": ["address_line1", "address_line2", "city", "state", "pincode", "country", "is_primary_address"]
}
@@ -82,8 +82,8 @@ def create_linked_contact(link_list, address):
		"address": address,
		"status": "Open"
	})
-	contact.add_email("test_contact@example.com")
-	contact.add_phone("+91 0000000000")
+	contact.add_email("test_contact@example.com", is_primary=True)
+	contact.add_phone("+91 0000000000", is_primary_phone=True)

	for name in link_list:
		contact.append("links",{

@@ -103,7 +103,7 @@ class TestAddressesAndContacts(unittest.TestCase):
			create_linked_contact(links_list, d)
		report_data = get_data({"reference_doctype": "Test Custom Doctype"})
		for idx, link in enumerate(links_list):
-			test_item = [link, 'test address line 1', 'test address line 2', 'Milan', None, None, 'Italy', 0, '_Test First Name', '_Test Last Name', '_Test Address-Billing', '+91 0000000000', 'test_contact@example.com', 1]
+			test_item = [link, 'test address line 1', 'test address line 2', 'Milan', None, None, 'Italy', 0, '_Test First Name', '_Test Last Name', '_Test Address-Billing', '+91 0000000000', None, 'test_contact@example.com', 1]
			self.assertListEqual(test_item, report_data[idx])

	def tearDown(self):
@@ -383,7 +383,7 @@
 ],
 "icon": "fa fa-comment",
 "idx": 1,
-"modified": "2019-09-05 14:22:27.664645",
+"modified": "2019-10-09 14:22:27.664645",
 "modified_by": "Administrator",
 "module": "Core",
 "name": "Communication",
@@ -381,6 +381,9 @@ def parse_email(communication, email_strings):
	a doctype and docname ie in the format `admin+doctype+docname@example.com`,
	the email is parsed and doctype and docname is extracted and timeline link is added.
	"""
+	if not frappe.get_list("Email Account", filters={"enable_automatic_linking": 1}):
+		return
+
	delimiter = "+"

	for email_string in email_strings:

@@ -388,9 +391,12 @@ def parse_email(communication, email_strings):
		for email in email_string.split(","):
			if delimiter in email:
				email = email.split("@")[0]
-				doctype = unquote(email.split(delimiter)[1])
-				docname = unquote(email.split(delimiter)[2])
+				email_local_parts = email.split(delimiter)
+				if not len(email_local_parts) == 3:
+					continue
+
+				doctype = unquote(email_local_parts[1])
+				docname = unquote(email_local_parts[2])

				if doctype and docname and frappe.db.exists(doctype, docname):
					communication.add_link(doctype, docname)
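The hardened `parse_email` above only links a document when the local part splits into exactly three `+`-separated segments. A self-contained sketch of that parsing (the function name is illustrative):

```python
from urllib.parse import unquote

def extract_doctype_docname(email):
    # Mirrors the parsing in parse_email: an address like
    # admin+doctype+docname@example.com carries a URL-quoted doctype and
    # docname in its local part; anything else is ignored.
    local_part = email.split("@")[0]
    parts = local_part.split("+")
    if len(parts) != 3:
        return None
    return unquote(parts[1]), unquote(parts[2])

print(extract_doctype_docname("admin+Note+ToDo-0001@example.com"))  # ('Note', 'ToDo-0001')
print(extract_doctype_docname("admin@example.com"))                 # None
```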
@@ -400,6 +406,9 @@ def get_email_without_link(email):
	returns email address without doctype links
	returns admin@example.com for email admin+doctype+docname@example.com
	"""
+	if not frappe.get_list("Email Account", filters={"enable_automatic_linking": 1}):
+		return email
+
	email_id = email.split("@")[0].split("+")[0]
	email_host = email.split("@")[1]
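The inverse operation, `get_email_without_link`, drops the `+doctype+docname` suffix and keeps the host. A minimal sketch of that string handling (helper name illustrative):

```python
def email_without_link(email):
    # Same split as get_email_without_link above: keep only the part of the
    # local part before the first '+', then reattach the host.
    email_id = email.split("@")[0].split("+")[0]
    email_host = email.split("@")[1]
    return "{0}@{1}".format(email_id, email_host)

print(email_without_link("admin+Note+ToDo-0001@example.com"))  # admin@example.com
```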
@@ -179,6 +179,8 @@ class TestCommunication(unittest.TestCase):
	def test_link_in_email(self):
		frappe.delete_doc_if_exists("Note", "test document link in email")

+		create_email_account()
+
		note = frappe.get_doc({
			"doctype": "Note",
			"title": "test document link in email",

@@ -197,4 +199,34 @@ class TestCommunication(unittest.TestCase):
		for timeline_link in comm.timeline_links:
			doc_links.append((timeline_link.link_doctype, timeline_link.link_name))

		self.assertIn(("Note", note.name), doc_links)
+
+def create_email_account():
+	frappe.flags.mute_emails = False
+	frappe.flags.sent_mail = None
+
+	email_account = frappe.get_doc({
+		"is_default": 1,
+		"is_global": 1,
+		"doctype": "Email Account",
+		"domain": "example.com",
+		"append_to": "ToDo",
+		"email_account_name": "_Test Comm Account 1",
+		"enable_outgoing": 1,
+		"smtp_server": "test.example.com",
+		"email_id": "test_comm@example.com",
+		"password": "password",
+		"add_signature": 1,
+		"signature": "\nBest Wishes\nTest Signature",
+		"enable_auto_reply": 1,
+		"auto_reply_message": "",
+		"enable_incoming": 1,
+		"notify_if_unreplied": 1,
+		"unreplied_for_mins": 20,
+		"send_notification_to": "test_comm@example.com",
+		"pop3_server": "pop.test.example.com",
+		"no_remaining": "0",
+		"enable_automatic_linking": 1
+	}).insert(ignore_permissions=True)
+
+	return email_account
@@ -7,15 +7,20 @@ frappe.ui.form.on('Data Import', {
			frm.set_value("action", "");
		}

-		frm.set_query("reference_doctype", function() {
-			return {
-				"filters": {
-					"issingle": 0,
-					"istable": 0,
-					"name": ['in', frappe.boot.user.can_import]
-				}
-			};
-		});
+		frappe.call({
+			method: "frappe.core.doctype.data_import.data_import.get_importable_doc",
+			callback: function (r) {
+				frm.set_query("reference_doctype", function () {
+					return {
+						"filters": {
+							"issingle": 0,
+							"istable": 0,
+							"name": ['in', r.message]
+						}
+					};
+				});
+			}
+		}),

		// should never check public
		frm.fields_dict["import_file"].df.is_private = 1;
@@ -29,6 +29,11 @@ class DataImport(Document):
		upload(data_import_doc=self, from_data_import="Yes", validate_template=True)


+@frappe.whitelist()
+def get_importable_doc():
+	import_lst = frappe.cache().hget("can_import", frappe.session.user)
+	return import_lst
+
@frappe.whitelist()
def import_data(data_import):
	frappe.db.set_value("Data Import", data_import, "import_status", "In Progress", update_modified=False)
267 frappe/core/doctype/data_import/exporter_new.py (new file)

@@ -0,0 +1,267 @@
# -*- coding: utf-8 -*-
# Copyright (c) 2019, Frappe Technologies Pvt. Ltd. and Contributors
# MIT License. See license.txt

import frappe
from frappe.model import display_fieldtypes, no_value_fields, table_fields
from frappe.utils.csvutils import build_csv_response
from frappe.utils.xlsxutils import build_xlsx_response
from .importer_new import INVALID_VALUES


class Exporter:
	def __init__(
		self,
		doctype,
		export_fields=None,
		export_data=False,
		export_filters=None,
		export_page_length=None,
		file_type="CSV",
	):
		"""
		Exports records of a DocType for use with Importer
		:param doctype: Document Type to export
		:param export_fields=None: One of 'All', 'Mandatory' or {'DocType': ['field1', 'field2'], 'Child DocType': ['childfield1']}
		:param export_data=False: Whether to export data as well
		:param export_filters=None: The filters (dict or list) which is used to query the records
		:param file_type: One of 'Excel' or 'CSV'
		"""
		self.doctype = doctype
		self.meta = frappe.get_meta(doctype)
		self.export_fields = export_fields
		self.export_filters = export_filters
		self.export_page_length = export_page_length
		self.file_type = file_type

		# this will contain the csv content
		self.csv_array = []

		# fields that get exported
		# can be All, Mandatory or User Selected Fields
		self.fields = self.get_all_exportable_fields()
		self.add_header()

		if export_data:
			self.data = self.get_data_to_export()
		else:
			self.data = []
		self.add_data()

	def get_all_exportable_fields(self):
		return self.get_exportable_parent_fields() + self.get_exportable_children_fields()

	def get_exportable_parent_fields(self):
		parent_fields = self.get_exportable_fields(self.doctype)

		# if autoname is based on field
		# then merge ID and the field column title as "ID (Autoname Field)"
		autoname = self.meta.autoname
		if autoname and autoname.startswith("field:"):
			fieldname = autoname[len("field:"):]
			autoname_field = self.meta.get_field(fieldname)
			if autoname_field:
				name_field = parent_fields[0]
				name_field.label = "ID ({})".format(autoname_field.label)
				# remove the autoname field as it is a duplicate of ID field
				parent_fields = [
					df for df in parent_fields if df.fieldname != autoname_field.fieldname
				]

		return parent_fields

	def get_exportable_children_fields(self):
		children = [df.options for df in self.meta.fields if df.fieldtype in table_fields]
		children_fields = []
		for child in children:
			children_fields += self.get_exportable_fields(child)

		return children_fields

	def get_exportable_fields(self, doctype):
		fields = []

		def is_exportable(df):
			return (
				df.fieldtype not in display_fieldtypes
				and df.fieldtype not in no_value_fields
			)

		meta = frappe.get_meta(doctype)

		# filter out layout fields
		fields = [df for df in meta.fields if is_exportable(df)]

		if self.export_fields == "Mandatory":
			fields = [df for df in fields if df.reqd]

		if self.export_fields == "All":
			fields = list(fields)

		elif isinstance(self.export_fields, dict):
			whitelist = self.export_fields.get(doctype, [])
			fields = [df for df in fields if df.fieldname in whitelist]

		name_field = frappe._dict(
			{
				"fieldtype": "Data",
				"fieldname": "name",
				"label": "ID",
				"reqd": 1,
				"parent": doctype,
			}
		)

		if fields:
			return [name_field] + fields
		else:
			return []

	def get_data_to_export(self):
		frappe.permissions.can_export(self.doctype, raise_exception=True)

		def get_column_name(df):
			return "`tab{0}`.`{1}`".format(df.parent, df.fieldname)

		fields = [get_column_name(df) for df in self.fields]
		filters = self.export_filters

		if self.meta.is_nested_set():
			order_by = "`tab{0}`.`lft` ASC".format(self.doctype)
		else:
			order_by = "`tab{0}`.`creation` DESC".format(self.doctype)

		data = frappe.db.get_list(
			self.doctype,
			filters=filters,
			fields=fields,
			limit_page_length=self.export_page_length,
			order_by=order_by,
			as_list=1,
		)

		data = self.remove_duplicate_values(data)
		data = self.remove_row_gaps(data)
		data = self.remove_empty_rows(data)
		# data = self.remove_values_from_name_column(data)

		return data

	def remove_duplicate_values(self, data):
		out = []

		doctypes = set([df.parent for df in self.fields])

		def name_exists_in_column_before_row(name, column_index, row_index):
			column_values = [row[column_index] for i, row in enumerate(data) if i < row_index]
			return name in column_values

		for i, row in enumerate(data):
			# first row is fine
			if i == 0:
				out.append(row)
				continue

			row = list(row)
			for doctype in doctypes:
				name_index = self.get_name_column_index(doctype)
				name = row[name_index]
				column_indexes = self.get_column_indexes(doctype)

				if name_exists_in_column_before_row(name, name_index, i):
					# remove the values from the row
					row = [None if i in column_indexes else d for i, d in enumerate(row)]

			out.append(row)

		return out

	def remove_row_gaps(self, data):
		doctypes = set([df.parent for df in self.fields if df.parent != self.doctype])

		def get_nearest_empty_row_index(col_index, row_index):
			col_values = [row[col_index] for row in data]
			i = row_index - 1
			while not col_values[i]:
				i = i - 1
			out = i + 1
			if row_index != out:
				return out

		for i, row in enumerate(data):
			# if this is the row that contains parent values then skip
			if row[0]:
				continue

			for doctype in doctypes:
				name_index = self.get_name_column_index(doctype)
				name = row[name_index]
				column_indexes = self.get_column_indexes(doctype)

				if not name:
					continue

				row_index = get_nearest_empty_row_index(name_index, i)
				if row_index:
					for col_index in column_indexes:
						data[row_index][col_index] = row[col_index]
						row[col_index] = None

		return data

	# pylint: disable=R0201
	def remove_empty_rows(self, data):
		return [row for row in data if any(v not in INVALID_VALUES for v in row)]

	def remove_values_from_name_column(self, data):
		out = []
		name_columns = [i for i, df in enumerate(self.fields) if df.fieldname == "name"]
		for row in data:
			out.append(["" if i in name_columns else value for i, value in enumerate(row)])
		return out

	def get_name_column_index(self, doctype):
		for i, df in enumerate(self.fields):
			if df.parent == doctype and df.fieldname == "name":
				return i
		return -1

	def get_column_indexes(self, doctype):
		return [i for i, df in enumerate(self.fields) if df.parent == doctype]

	def add_header(self):
		def get_label(df):
			if df.parent == self.doctype:
				return df.label
			else:
				return "{0} ({1})".format(df.label, df.parent)

		header = [get_label(df) for df in self.fields]
		self.csv_array.append(header)

	def add_data(self):
		self.csv_array += self.data

	def get_csv_array(self):
		return self.csv_array

	def get_csv_array_for_export(self):
		csv_array = self.csv_array

		if not self.data:
			# add 2 empty rows
			csv_array += [[]] * 2

		return csv_array

	def build_response(self):
		if self.file_type == 'CSV':
			self.build_csv_response()
		elif self.file_type == 'Excel':
			self.build_xlsx_response()

	def build_csv_response(self):
		build_csv_response(self.get_csv_array_for_export(), self.doctype)

	def build_xlsx_response(self):
		build_xlsx_response(self.get_csv_array_for_export(), self.doctype)
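`Exporter.add_header` labels parent-doctype columns with their plain label and suffixes child-table columns with the child doctype, e.g. `Item Code (Sales Invoice Item)`. A standalone sketch of that header building, using plain dicts instead of docfields (helper name and sample doctypes are illustrative):

```python
def get_header_labels(fields, parent_doctype):
    # Sketch of Exporter.add_header: fields belonging to the exported doctype
    # keep their label; child-table fields get a "(Parent Doctype)" suffix.
    return [
        df["label"] if df["parent"] == parent_doctype
        else "{0} ({1})".format(df["label"], df["parent"])
        for df in fields
    ]

fields = [
    {"label": "ID", "parent": "Sales Invoice"},
    {"label": "Item Code", "parent": "Sales Invoice Item"},
]
print(get_header_labels(fields, "Sales Invoice"))
# ['ID', 'Item Code (Sales Invoice Item)']
```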
951 frappe/core/doctype/data_import/importer_new.py (new file)

@@ -0,0 +1,951 @@
# -*- coding: utf-8 -*-
# Copyright (c) 2019, Frappe Technologies Pvt. Ltd. and Contributors
# MIT License. See license.txt

import io
import os
import json
import timeit
import frappe
from datetime import datetime
from frappe import _
from frappe.utils import cint, flt, update_progress_bar
from frappe.utils.csvutils import read_csv_content
from frappe.utils.xlsxutils import (
	read_xlsx_file_from_attached_file,
	read_xls_file_from_attached_file,
)
from frappe.model import no_value_fields, table_fields

INVALID_VALUES = ["", None]
MAX_ROWS_IN_PREVIEW = 10

# pylint: disable=R0201
class Importer:
	def __init__(
		self, doctype, data_import=None, file_path=None, content=None, console=False
	):
		self.doctype = doctype
		self.template_options = frappe._dict({"remap_column": {}})
		self.console = console

		if data_import:
			self.data_import = data_import
			if self.data_import.template_options:
				template_options = frappe.parse_json(self.data_import.template_options)
				self.template_options.update(template_options)
		else:
			self.data_import = None

		self.header_row = None
		self.data = None
		# used to store date formats guessed from data rows per column
		self._guessed_date_formats = {}
		# used to store eta during import
		self.last_eta = 0
		# used to collect warnings during template parsing
		# and show them to user
		self.warnings = []
		self.meta = frappe.get_meta(doctype)
		self.prepare_content(file_path, content)
		self.parse_data_from_template()

	def prepare_content(self, file_path, content):
		extension = None
		if self.data_import and self.data_import.import_file:
			file_doc = frappe.get_doc("File", {"file_url": self.data_import.import_file})
			content = file_doc.get_content()
			extension = file_doc.file_name.split(".")[1]

		if file_path:
			content, extension = self.read_file(file_path)

		if not extension:
			extension = "csv"

		if content:
			self.read_content(content, extension)

		self.validate_template_content()
		self.remove_empty_rows_and_columns()

	def read_file(self, file_path):
		extn = file_path.split(".")[1]

		file_content = None
		with io.open(file_path, mode="rb") as f:
			file_content = f.read()

		return file_content, extn

	def read_content(self, content, extension):
		if extension == "csv":
			data = read_csv_content(content)
		elif extension == "xlsx":
			data = read_xlsx_file_from_attached_file(fcontent=content)
		elif extension == "xls":
			data = read_xls_file_from_attached_file(content)

		self.header_row = data[0]
		self.data = data[1:]

	def validate_template_content(self):
		column_count = len(self.header_row)
		if any([len(row) != column_count and len(row) != 0 for row in self.data]):
			frappe.throw(
				_("Number of columns does not match with data"), title=_("Invalid Template")
			)

	def remove_empty_rows_and_columns(self):
		self.row_index_map = []
		removed_rows = []
		removed_columns = []

		# remove empty rows
		data = []
		for i, row in enumerate(self.data):
			if all(v in INVALID_VALUES for v in row):
				# empty row
				removed_rows.append(i)
			else:
				data.append(row)
				self.row_index_map.append(i)

		# remove empty columns
		# a column with a header and no data is a valid column
		# a column with no header and no data will be removed
		header_row = []
		for i, column in enumerate(self.header_row):
			column_values = [row[i] for row in data]
			values = [column] + column_values
			if all(v in INVALID_VALUES for v in values):
				# empty column
				removed_columns.append(i)
			else:
				header_row.append(column)

		data_without_empty_columns = []
		# remove empty columns from data
		for i, row in enumerate(data):
			new_row = [v for j, v in enumerate(row) if j not in removed_columns]
			data_without_empty_columns.append(new_row)

		self.data = data_without_empty_columns
		self.header_row = header_row
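`remove_empty_rows_and_columns` first drops rows whose cells are all empty, then drops columns that have neither a header nor any data. A self-contained sketch of that two-pass filter on plain lists (the free-function form is illustrative; the real method also maintains `row_index_map`):

```python
INVALID_VALUES = ["", None]

def remove_empty_rows_and_columns(header_row, data):
    # Pass 1: drop rows where every cell is empty.
    rows = [row for row in data if not all(v in INVALID_VALUES for v in row)]
    # Pass 2: keep a column if its header or any of its values is non-empty;
    # a column with a header but no data is still a valid column.
    keep = [
        i for i, col in enumerate(header_row)
        if not all(v in INVALID_VALUES for v in [col] + [row[i] for row in rows])
    ]
    header = [header_row[i] for i in keep]
    rows = [[row[i] for i in keep] for row in rows]
    return header, rows

header, rows = remove_empty_rows_and_columns(
    ["Name", "", "Phone"],
    [["Jane", None, "123"], ["", None, ""], ["John", None, ""]],
)
print(header)  # ['Name', 'Phone']
print(rows)    # [['Jane', '123'], ['John', '']]
```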
def get_data_for_import_preview(self):
|
||||
out = frappe._dict()
|
||||
out.data = list(self.rows)
|
||||
out.columns = self.columns
|
||||
out.warnings = self.warnings
|
||||
if len(out.data) > MAX_ROWS_IN_PREVIEW:
|
||||
out.data = out.data[:MAX_ROWS_IN_PREVIEW]
|
||||
out.max_rows_exceeded = True
|
||||
out.max_rows_in_preview = MAX_ROWS_IN_PREVIEW
|
||||
return out
|
||||
|
||||
def parse_data_from_template(self):
|
||||
columns = self.parse_columns_from_header_row()
|
||||
columns, data = self.add_serial_no_column(columns, self.data)
|
||||
|
||||
self.columns = columns
|
||||
self.rows = data
|
||||
|
||||
def parse_columns_from_header_row(self):
|
||||
remap_column = self.template_options.remap_column
|
||||
columns = []
|
||||
|
||||
df_by_labels_and_fieldnames = self.build_fields_dict_for_column_matching()
|
||||
|
||||
for i, header_title in enumerate(self.header_row):
|
||||
header_row_index = str(i)
|
||||
column_number = str(i + 1)
|
||||
skip_import = False
|
||||
fieldname = remap_column.get(header_row_index)
|
||||
|
||||
if fieldname and fieldname != "Don't Import":
|
||||
df = df_by_labels_and_fieldnames.get(fieldname)
|
||||
self.warnings.append(
|
||||
{
|
||||
"col": column_number,
|
||||
"message": _("Mapping column {0} to field {1}").format(
|
||||
frappe.bold(header_title or "<i>Untitled Column</i>"), frappe.bold(df.label)
|
||||
),
|
||||
"type": "info",
|
||||
}
|
||||
)
|
||||
else:
|
||||
df = df_by_labels_and_fieldnames.get(header_title)
|
||||
|
||||
if not df:
|
||||
skip_import = True
|
||||
else:
|
||||
skip_import = False
|
||||
|
||||
if fieldname == "Don't Import":
|
||||
skip_import = True
|
||||
self.warnings.append(
|
||||
{
|
||||
"col": column_number,
|
||||
"message": _("Skipping column {0}").format(frappe.bold(header_title)),
|
||||
"type": "info",
|
||||
}
|
||||
)
|
||||
elif header_title and not df:
|
||||
self.warnings.append(
|
||||
{
|
||||
"col": column_number,
|
||||
"message": _("Cannot match column {0} with any field").format(
|
||||
frappe.bold(header_title)
|
||||
),
|
||||
"type": "info",
|
||||
}
|
||||
)
|
||||
elif not header_title and not df:
|
||||
self.warnings.append(
|
||||
{"col": column_number, "message": _("Skipping Untitled Column"), "type": "info"}
|
||||
)
|
||||
|
||||
columns.append(
|
||||
frappe._dict(
|
||||
df=df,
|
||||
skip_import=skip_import,
|
||||
header_title=header_title,
|
||||
column_number=column_number,
|
||||
index=i,
|
||||
)
|
||||
)
|
||||
|
||||
return columns
|
||||
|
||||
	def build_fields_dict_for_column_matching(self):
		"""
		Build a dict with various keys to match with column headers and value as docfield.
		The keys can be label or fieldname:
		{
			'Customer': df1,
			'customer': df1,
			'Due Date': df2,
			'due_date': df2,
			'Item Code (Sales Invoice Item)': df3,
			'Sales Invoice Item:item_code': df3,
		}
		"""
		out = {}

		table_doctypes = [df.options for df in self.meta.get_table_fields()]
		doctypes = table_doctypes + [self.doctype]
		for doctype in doctypes:
			# name field
			name_key = "ID" if self.doctype == doctype else "ID ({})".format(doctype)
			name_df = frappe._dict(
				{
					"fieldtype": "Data",
					"fieldname": "name",
					"label": "ID",
					"reqd": self.data_import.import_type == "Update Existing Records",
					"parent": doctype,
				}
			)
			out[name_key] = name_df
			out["name"] = name_df

			# other fields
			meta = frappe.get_meta(doctype)
			fields = self.get_standard_fields(doctype) + meta.fields
			for df in fields:
				fieldtype = df.fieldtype or "Data"
				parent = df.parent or self.doctype
				if fieldtype not in no_value_fields:
					# label as key
					label = (
						df.label if self.doctype == doctype else "{0} ({1})".format(df.label, parent)
					)
					out[label] = df
					# fieldname as key
					if self.doctype == doctype:
						out[df.fieldname] = df
					else:
						key = "{0}:{1}".format(doctype, df.fieldname)
						out[key] = df

		# if autoname is based on a field,
		# add an entry for "ID (Autoname Field)"
		autoname_field = self.get_autoname_field(self.doctype)
		if autoname_field:
			out["ID ({})".format(autoname_field.label)] = autoname_field
			# the ID field should also map to the autoname field
			out["ID"] = autoname_field
			out["name"] = autoname_field

		return out

	def get_standard_fields(self, doctype):
		meta = frappe.get_meta(doctype)
		if meta.istable:
			standard_fields = [
				{"label": "Parent", "fieldname": "parent"},
				{"label": "Parent Type", "fieldname": "parenttype"},
				{"label": "Parent Field", "fieldname": "parentfield"},
				{"label": "Row Index", "fieldname": "idx"},
			]
		else:
			standard_fields = [
				{"label": "Owner", "fieldname": "owner"},
				{"label": "Document Status", "fieldname": "docstatus", "fieldtype": "Int"},
			]

		out = []
		for df in standard_fields:
			df = frappe._dict(df)
			df.parent = doctype
			out.append(df)
		return out

	def add_serial_no_column(self, columns, data):
		columns_with_serial_no = [
			frappe._dict({"header_title": "Sr. No", "skip_import": True})
		] + columns

		# update the index for each column
		for i, col in enumerate(columns_with_serial_no):
			col.index = i

		data_with_serial_no = []
		for i, row in enumerate(data):
			data_with_serial_no.append([self.row_index_map[i] + 1] + row)

		return columns_with_serial_no, data_with_serial_no

	def parse_value(self, value, df):
		# convert boolean values to 0 or 1
		if df.fieldtype == "Check" and value.lower().strip() in ["t", "f", "true", "false"]:
			value = value.lower().strip()
			value = 1 if value in ["t", "true"] else 0

		if df.fieldtype in ["Int", "Check"]:
			value = cint(value)
		elif df.fieldtype in ["Float", "Percent", "Currency"]:
			value = flt(value)
		elif df.fieldtype in ["Date", "Datetime"]:
			value = self.parse_date_format(value, df)

		return value

	def parse_date_format(self, value, df):
		date_format = self.guess_date_format_for_column(df.fieldname)
		if date_format:
			return datetime.strptime(value, date_format)
		return value

	def guess_date_format_for_column(self, fieldname):
		"""Guess the date format for a column by parsing the first 10 values
		in the column and returning the format that occurs most frequently.
		"""
		PARSE_ROW_COUNT = 10

		if not self._guessed_date_formats.get(fieldname):
			column_index = -1

			for i, field in enumerate(self.header_row):
				if self.meta.has_field(field) and field == fieldname:
					column_index = i
					break

			if column_index == -1:
				# column not found, nothing to guess from
				self._guessed_date_formats[fieldname] = None
				return None

			date_values = [
				row[column_index] for row in self.data[:PARSE_ROW_COUNT] if row[column_index]
			]
			date_formats = [guess_date_format(d) for d in date_values]
			if not date_formats:
				return None
			max_occurred_date_format = max(set(date_formats), key=date_formats.count)
			self._guessed_date_formats[fieldname] = max_occurred_date_format

		return self._guessed_date_formats[fieldname]
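As an aside, the frequency-based selection above reduces to a one-liner over the guessed formats. A minimal standalone sketch (the sample list is hypothetical, not tied to the `Importer` class):

```python
# Hypothetical formats guessed from the first few values of a column
date_formats = ["%m/%d/%Y", "%d/%m/%Y", "%m/%d/%Y"]

# pick the format that occurs most often among the distinct guesses,
# mirroring max(set(date_formats), key=date_formats.count) above
most_common = max(set(date_formats), key=date_formats.count)
print(most_common)  # %m/%d/%Y
```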
	def import_data(self):
		# set user lang for translations
		frappe.cache().hdel("lang", frappe.session.user)
		frappe.set_user_lang(frappe.session.user)

		if not self.console:
			self.data_import.db_set("template_warnings", "")

		# set flags
		frappe.flags.in_import = True
		frappe.flags.mute_emails = self.data_import.mute_emails

		# prepare a map for missing link field values
		self.prepare_missing_link_field_values()

		# parse docs from rows
		payloads = self.get_payloads_for_import()

		# don't import if there are non-ignorable warnings
		warnings = [w for w in self.warnings if w.get("type") != "info"]
		if warnings:
			if self.console:
				self.print_grouped_warnings(warnings)
			else:
				self.data_import.db_set("template_warnings", json.dumps(warnings))
				frappe.publish_realtime(
					"data_import_refresh", {"data_import": self.data_import.name}
				)
			return

		# set up the import log
		if self.data_import.import_log:
			import_log = frappe.parse_json(self.data_import.import_log)
		else:
			import_log = []

		# remove previous failures from the import log
		import_log = [l for l in import_log if l.get("success") == True]

		# get successfully imported rows
		imported_rows = []
		for log in import_log:
			log = frappe._dict(log)
			if log.success:
				imported_rows += log.row_indexes

		# start import
		total_payload_count = len(payloads)
		batch_size = frappe.conf.data_import_batch_size or 1000

		for batch_index, batched_payloads in enumerate(
			frappe.utils.create_batch(payloads, batch_size)
		):
			for i, payload in enumerate(batched_payloads):
				doc = payload.doc
				row_indexes = [row[0] for row in payload.rows]
				current_index = (i + 1) + (batch_index * batch_size)

				if set(row_indexes).intersection(set(imported_rows)):
					print("Skipping imported rows", row_indexes)
					if total_payload_count > 5:
						frappe.publish_realtime(
							"data_import_progress",
							{
								"current": current_index,
								"total": total_payload_count,
								"skipping": True,
								"data_import": self.data_import.name,
							},
						)
					continue

				try:
					start = timeit.default_timer()
					doc = self.process_doc(doc)
					processing_time = timeit.default_timer() - start
					eta = self.get_eta(current_index, total_payload_count, processing_time)

					if total_payload_count > 5:
						frappe.publish_realtime(
							"data_import_progress",
							{
								"current": current_index,
								"total": total_payload_count,
								"docname": doc.name,
								"data_import": self.data_import.name,
								"success": True,
								"row_indexes": row_indexes,
								"eta": eta,
							},
						)
					if self.console:
						update_progress_bar(
							"Importing {0} records".format(total_payload_count),
							current_index,
							total_payload_count,
						)
					import_log.append(
						frappe._dict(success=True, docname=doc.name, row_indexes=row_indexes)
					)
					# commit after every successful import
					frappe.db.commit()

				except Exception:
					import_log.append(
						frappe._dict(
							success=False,
							exception=frappe.get_traceback(),
							messages=frappe.local.message_log,
							row_indexes=row_indexes,
						)
					)
					frappe.clear_messages()
					# roll back the failed document
					frappe.db.rollback()

		# set status
		failures = [l for l in import_log if l.get("success") == False]
		if len(failures) == total_payload_count:
			status = "Pending"
		elif len(failures) > 0:
			status = "Partial Success"
		else:
			status = "Success"

		if self.console:
			self.print_import_log(import_log)
		else:
			self.data_import.db_set("status", status)
			self.data_import.db_set("import_log", json.dumps(import_log))

		frappe.flags.in_import = False
		frappe.flags.mute_emails = False
		frappe.publish_realtime("data_import_refresh", {"data_import": self.data_import.name})

		return import_log

	def get_payloads_for_import(self):
		payloads = []
		# work on a copy
		data = list(self.rows)
		while data:
			doc, rows, data = self.parse_next_row_for_import(data)
			payloads.append(frappe._dict(doc=doc, rows=rows))
		return payloads

	def parse_next_row_for_import(self, data):
		"""
		Parses rows that make up a doc. A doc may be built from a single row or multiple rows.
		Returns the doc, its rows, and the remaining data without those rows.
		"""
		doctypes = set([col.df.parent for col in self.columns if col.df and col.df.parent])

		# the first row is included by default
		first_row = data[0]
		rows = [first_row]

		# if there are child doctypes, find the subsequent rows
		if len(doctypes) > 1:
			# subsequent rows either don't have any parent value set,
			# or have the same values as the parent row;
			# we include a row if either condition matches
			parent_column_indexes = [
				col.index
				for col in self.columns
				if not col.skip_import and col.df and col.df.parent == self.doctype
			]
			parent_row_values = [first_row[i] for i in parent_column_indexes]

			data_without_first_row = data[1:]
			for row in data_without_first_row:
				row_values = [row[i] for i in parent_column_indexes]
				# if the row is blank, it's a child row doc
				if all([v in INVALID_VALUES for v in row_values]):
					rows.append(row)
					continue
				# if the row has the same values as the parent row, it's a child row doc
				if row_values == parent_row_values:
					rows.append(row)
					continue
				# if neither condition matches, it's the next doc
				break

		def get_column_indexes(doctype):
			return [
				col.index
				for col in self.columns
				if not col.skip_import and col.df and col.df.parent == doctype
			]

		def validate_value(value, df):
			if df.fieldtype == "Select":
				select_options = df.get_select_options()
				if select_options and value not in select_options:
					options_string = ", ".join([frappe.bold(d) for d in select_options])
					msg = _("Value must be one of {0}").format(options_string)
					self.warnings.append(
						{
							"row": row_number,
							"field": df.as_dict(convert_dates_to_str=True),
							"message": msg,
						}
					)
					return False

			elif df.fieldtype == "Link":
				d = self.get_missing_link_field_values(df.options)
				if value in d.missing_values and not d.one_mandatory:
					msg = _("Value {0} missing for {1}").format(
						frappe.bold(value), frappe.bold(df.options)
					)
					self.warnings.append(
						{
							"row": row_number,
							"field": df.as_dict(convert_dates_to_str=True),
							"message": msg,
						}
					)
				return value

			return value

		def parse_doc(doctype, docfields, values, row_number):
			# new_doc returns a dict with default values set
			doc = frappe.new_doc(doctype, as_dict=True)
			# remove standard fields and __islocal
			for key in frappe.model.default_fields + ("__islocal",):
				doc.pop(key, None)

			for df, value in zip(docfields, values):
				if value in INVALID_VALUES:
					value = None

				value = validate_value(value, df)
				if value:
					doc[df.fieldname] = self.parse_value(value, df)

			check_mandatory_fields(doctype, doc, row_number)
			return doc

		def check_mandatory_fields(doctype, doc, row_number):
			# check if mandatory fields are set (except table fields)
			meta = frappe.get_meta(doctype)
			fields = [
				df
				for df in meta.fields
				if df.fieldtype not in table_fields
				and df.reqd
				and doc.get(df.fieldname) in INVALID_VALUES
			]

			if not fields:
				return

			if len(fields) == 1:
				self.warnings.append(
					{
						"row": row_number,
						"message": _("{0} is a mandatory field").format(fields[0].label),
					}
				)
			else:
				fields_string = ", ".join([df.label for df in fields])
				self.warnings.append(
					{"row": row_number, "message": _("{0} are mandatory fields").format(fields_string)}
				)

		parsed_docs = {}
		for row in rows:
			for doctype in doctypes:
				if doctype == self.doctype and parsed_docs.get(doctype):
					# the parent doc is already parsed from the first row, skip it
					continue

				row_number = row[0]
				column_indexes = get_column_indexes(doctype)
				values = [row[i] for i in column_indexes]

				if all(v in INVALID_VALUES for v in values):
					# skip if all values are empty
					continue

				columns = [self.columns[i] for i in column_indexes]
				docfields = [col.df for col in columns]
				doc = parse_doc(doctype, docfields, values, row_number)
				parsed_docs.setdefault(doctype, []).append(doc)

		# build the doc with children
		doc = {}
		for doctype, docs in parsed_docs.items():
			if doctype == self.doctype:
				doc.update(docs[0])
			else:
				table_dfs = self.meta.get(
					"fields", {"options": doctype, "fieldtype": ["in", table_fields]}
				)
				if table_dfs:
					table_field = table_dfs[0]
					doc[table_field.fieldname] = docs

		# check that there is at least one row for each mandatory table field
		mandatory_table_fields = [
			df
			for df in self.meta.fields
			if df.fieldtype in table_fields and df.reqd and len(doc.get(df.fieldname, [])) == 0
		]
		if len(mandatory_table_fields) == 1:
			self.warnings.append(
				{
					"row": first_row[0],
					"message": _("There should be at least one row for the {0} table").format(
						mandatory_table_fields[0].label
					),
				}
			)
		elif mandatory_table_fields:
			fields_string = ", ".join([df.label for df in mandatory_table_fields])
			self.warnings.append(
				{
					"row": first_row[0],
					"message": _("There should be at least one row for the following tables: {0}").format(fields_string),
				}
			)

		return doc, rows, data[len(rows):]

	def process_doc(self, doc):
		import_type = self.data_import.import_type

		if import_type == "Insert New Records":
			return self.insert_record(doc)
		elif import_type == "Update Existing Records":
			return self.update_record(doc)

	def insert_record(self, doc):
		self.create_missing_linked_records(doc)

		new_doc = frappe.new_doc(self.doctype)
		new_doc.update(doc)
		# name shouldn't be set when inserting a new record
		new_doc.set("name", None)
		new_doc.insert()
		if self.meta.is_submittable and self.data_import.submit_after_import:
			new_doc.submit()
		return new_doc

	def create_missing_linked_records(self, doc):
		"""
		Finds fields of type Link and automatically creates the linked
		document if that document's doctype has only one mandatory field.
		"""
		link_values = []

		def get_link_fields(doc, doctype):
			for fieldname, value in doc.items():
				meta = frappe.get_meta(doctype)
				df = meta.get_field(fieldname)
				if not df:
					continue
				if df.fieldtype == "Link" and value not in INVALID_VALUES:
					link_values.append([df.options, value])
				elif df.fieldtype in table_fields:
					for row in value:
						get_link_fields(row, df.options)

		get_link_fields(doc, self.doctype)

		for link_doctype, link_value in link_values:
			d = self.missing_link_values.get(link_doctype)
			if d and d.one_mandatory and link_value in d.missing_values:
				# find the autoname field
				autoname_field = self.get_autoname_field(link_doctype)
				name_field = autoname_field.fieldname if autoname_field else "name"
				new_doc = frappe.new_doc(link_doctype)
				new_doc.set(name_field, link_value)
				new_doc.insert()
				d.missing_values.remove(link_value)

	def update_record(self, doc):
		id_fieldname = self.get_id_fieldname()
		id_value = doc[id_fieldname]
		existing_doc = frappe.get_doc(self.doctype, id_value)
		existing_doc.flags.via_data_import = self.data_import.name
		existing_doc.update(doc)
		existing_doc.save()
		return existing_doc

	def export_errored_rows(self):
		from frappe.utils.csvutils import build_csv_response

		if not self.data_import:
			return

		import_log = frappe.parse_json(self.data_import.import_log or "[]")
		failures = [l for l in import_log if l.get("success") == False]
		row_indexes = []
		for f in failures:
			row_indexes.extend(f.get("row_indexes", []))

		# de-duplicate and sort
		row_indexes = sorted(set(row_indexes))

		header_row = [col.header_title for col in self.columns[1:]]
		rows = [header_row]
		rows += [row[1:] for row in self.rows if row[0] in row_indexes]

		build_csv_response(rows, self.doctype)

	def get_missing_link_field_values(self, doctype):
		return self.missing_link_values.get(doctype, {})

	def prepare_missing_link_field_values(self):
		columns = self.columns
		rows = self.rows
		link_column_indexes = [
			col.index for col in columns if col.df and col.df.fieldtype == "Link"
		]

		self.missing_link_values = {}
		for index in link_column_indexes:
			col = columns[index]
			column_values = [row[index] for row in rows]
			values = set([v for v in column_values if v not in INVALID_VALUES])
			doctype = col.df.options

			missing_values = [value for value in values if not frappe.db.exists(doctype, value)]
			if self.missing_link_values.get(doctype):
				self.missing_link_values[doctype].missing_values += missing_values
			else:
				self.missing_link_values[doctype] = frappe._dict(
					missing_values=missing_values,
					one_mandatory=self.has_one_mandatory_field(doctype),
					df=col.df,
				)

	def get_id_fieldname(self):
		autoname_field = self.get_autoname_field(self.doctype)
		if autoname_field:
			return autoname_field.fieldname
		return "name"

	def get_eta(self, current, total, processing_time):
		remaining = total - current
		eta = processing_time * remaining
		if not self.last_eta or eta < self.last_eta:
			self.last_eta = eta
		return self.last_eta

	def has_one_mandatory_field(self, doctype):
		meta = frappe.get_meta(doctype)
		# mandatory fields with no default value set
		mandatory_fields = [df for df in meta.fields if df.reqd and not df.default]
		mandatory_fields_count = len(mandatory_fields)
		if meta.autoname and meta.autoname.lower() == "prompt":
			# the name itself counts as a mandatory field
			mandatory_fields_count += 1
		return mandatory_fields_count == 1

	def get_autoname_field(self, doctype):
		meta = frappe.get_meta(doctype)
		if meta.autoname and meta.autoname.startswith("field:"):
			fieldname = meta.autoname[len("field:"):]
			return meta.get_field(fieldname)

	def print_grouped_warnings(self, warnings):
		warnings_by_row = {}
		other_warnings = []
		for w in warnings:
			if w.get("row"):
				warnings_by_row.setdefault(w.get("row"), []).append(w)
			else:
				other_warnings.append(w)

		for row_number, row_warnings in warnings_by_row.items():
			print("Row {0}".format(row_number))
			for w in row_warnings:
				print(w.get("message"))

		for w in other_warnings:
			print(w.get("message"))

	def print_import_log(self, import_log):
		failed_records = [l for l in import_log if not l.success]
		successful_records = [l for l in import_log if l.success]

		if successful_records:
			print(
				"Successfully imported {0} records out of {1}".format(
					len(successful_records), len(import_log)
				)
			)

		if failed_records:
			print("Failed to import {0} records".format(len(failed_records)))
			file_name = '{0}_import_on_{1}.txt'.format(self.doctype, frappe.utils.now())
			print('Check {0} for errors'.format(os.path.join('sites', file_name)))
			text = ""
			for w in failed_records:
				text += "Row Indexes: {0}\n".format(str(w.get('row_indexes', [])))
				text += "Messages:\n{0}\n".format('\n'.join(w.get('messages', [])))
				text += "Traceback:\n{0}\n\n".format(w.get('exception'))

			with open(file_name, 'w') as f:
				f.write(text)


DATE_FORMATS = [
	r"%d-%m-%Y",
	r"%m-%d-%Y",
	r"%Y-%m-%d",
	r"%d-%m-%y",
	r"%m-%d-%y",
	r"%y-%m-%d",
	r"%d/%m/%Y",
	r"%m/%d/%Y",
	r"%Y/%m/%d",
	r"%d/%m/%y",
	r"%m/%d/%y",
	r"%y/%m/%d",
	r"%d.%m.%Y",
	r"%m.%d.%Y",
	r"%Y.%m.%d",
	r"%d.%m.%y",
	r"%m.%d.%y",
	r"%y.%m.%d",
]

TIME_FORMATS = [
	r"%H:%M:%S.%f",
	r"%H:%M:%S",
	r"%H:%M",
	r"%I:%M:%S.%f %p",
	r"%I:%M:%S %p",
	r"%I:%M %p",
]


def guess_date_format(date_string):
	date_string = date_string.strip()

	_date = None
	_time = None

	if " " in date_string:
		_date, _time = date_string.split(" ", 1)
	else:
		_date = date_string

	date_format = None
	time_format = None

	for f in DATE_FORMATS:
		try:
			# if the date parses without an exception,
			# capture the date format
			datetime.strptime(_date, f)
			date_format = f
			break
		except ValueError:
			pass

	if _time:
		for f in TIME_FORMATS:
			try:
				# if the time parses without an exception,
				# capture the time format
				datetime.strptime(_time, f)
				time_format = f
				break
			except ValueError:
				pass

	full_format = date_format
	# only append the time format if a date format was actually found
	if full_format and time_format:
		full_format += " " + time_format
	return full_format
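`guess_date_format` above probes each candidate with `strptime` and keeps the first format that parses cleanly. A minimal standalone sketch of that probing loop (`CANDIDATES` is an illustrative subset, not the full `DATE_FORMATS` list):

```python
from datetime import datetime

# illustrative subset of candidate formats
CANDIDATES = ["%d-%m-%Y", "%m/%d/%Y", "%Y-%m-%d"]

def first_matching_format(value):
	# try each candidate; strptime raises ValueError on a mismatch
	for fmt in CANDIDATES:
		try:
			datetime.strptime(value, fmt)
			return fmt
		except ValueError:
			continue
	return None

print(first_matching_format("5/20/2019"))   # %m/%d/%Y
print(first_matching_format("2019-05-20"))  # %Y-%m-%d
```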


def import_data(doctype, file_path):
	i = Importer(doctype, file_path)
	i.import_data()
frappe/core/doctype/data_import/test_exporter_new.py (new file, 40 lines)
# -*- coding: utf-8 -*-
# Copyright (c) 2019, Frappe Technologies and Contributors
# See license.txt
from __future__ import unicode_literals

import unittest
import frappe
from frappe.core.doctype.data_import.exporter_new import Exporter


class TestExporter(unittest.TestCase):
	def test_exports_mandatory_fields(self):
		e = Exporter('Web Page', export_fields='Mandatory')
		csv_array = e.get_csv_array()
		header_row = csv_array[0]
		self.assertEqual(header_row, ['ID', 'Title'])

	def test_exports_all_fields(self):
		e = Exporter('Web Page', export_fields='All')
		csv_array = e.get_csv_array()
		header = csv_array[0]
		self.assertEqual(len(header), 23)

	def test_exports_selected_fields(self):
		export_fields = {'Web Page': ['title', 'route', 'published']}
		e = Exporter('Web Page', export_fields=export_fields)
		csv_array = e.get_csv_array()
		header = csv_array[0]
		self.assertEqual(header, ['ID', 'Title', 'Route', 'Published'])

	def test_exports_data(self):
		e = Exporter('ToDo', export_fields='All', export_data=True)
		todo_records = frappe.db.count('ToDo')
		csv_array = e.get_csv_array()
		self.assertEqual(len(csv_array), todo_records + 1)
frappe/core/doctype/data_import/test_importer_new.py (new file, 78 lines)
# -*- coding: utf-8 -*-
# Copyright (c) 2019, Frappe Technologies and Contributors
# See license.txt
from __future__ import unicode_literals

import datetime
import unittest
import frappe
from frappe.core.doctype.data_import.importer_new import Importer

content_empty_rows = '''title,start_date,idx,show_title
,,,
est phasellus sit amet,5/20/2019,52,1
nibh in,7/29/2019,77,1
'''

content_mandatory_missing = '''title,start_date,idx,show_title
,5/20/2019,52,1
'''

content_convert_value = '''title,start_date,idx,show_title
est phasellus sit amet,5/20/2019,52,True
'''

content_invalid_column = '''title,start_date,idx,show_title,invalid_column
est phasellus sit amet,5/20/2019,52,True,invalid value
'''


class TestImporter(unittest.TestCase):
	def test_should_skip_empty_rows(self):
		i = self.get_importer('Web Page', content=content_empty_rows)
		payloads = i.get_payloads_for_import()
		rows_to_be_imported = []
		for p in payloads:
			rows_to_be_imported += [row[0] for row in p.rows]
		self.assertEqual(len(rows_to_be_imported), 2)

	def test_should_warn_if_mandatory_is_missing(self):
		i = self.get_importer('Web Page', content=content_mandatory_missing)
		i.import_data()
		warning = i.warnings[0]
		self.assertTrue('Title is a mandatory field' in warning['message'])

	def test_should_convert_value_based_on_fieldtype(self):
		i = self.get_importer('Web Page', content=content_convert_value)
		payloads = i.get_payloads_for_import()
		doc = payloads[0].doc

		self.assertEqual(type(doc['show_title']), int)
		self.assertEqual(type(doc['idx']), int)
		self.assertEqual(type(doc['start_date']), datetime.datetime)

	def test_should_ignore_invalid_columns(self):
		i = self.get_importer('Web Page', content=content_invalid_column)
		payloads = i.get_payloads_for_import()
		doc = payloads[0].doc

		self.assertTrue('invalid_column' not in doc)
		self.assertTrue('title' in doc)

	def test_should_import_valid_template(self):
		title = 'est phasellus sit amet {0}'.format(frappe.utils.random_string(8))
		content_valid_content = '''title,start_date,idx,show_title
{0},5/20/2019,52,1'''.format(title)
		i = self.get_importer('Web Page', content=content_valid_content)
		import_log = i.import_data()
		log = import_log[0]
		self.assertTrue(log.success)
		doc = frappe.get_doc('Web Page', {'title': title})
		self.assertEqual(
			frappe.utils.get_datetime_str(doc.start_date),
			frappe.utils.get_datetime_str('2019-05-20'))

	def get_importer(self, doctype, content):
		data_import = frappe.new_doc('Data Import Beta')
		data_import.import_type = 'Insert New Records'
		return Importer(doctype, content=content, data_import=data_import)
frappe/core/doctype/data_import_beta/data_import_beta.js (new file, 403 lines)
// Copyright (c) 2019, Frappe Technologies and contributors
|
||||
// For license information, please see license.txt
|
||||
|
||||
frappe.ui.form.on('Data Import Beta', {
|
||||
setup(frm) {
|
||||
frappe.realtime.on('data_import_refresh', ({ data_import }) => {
|
||||
if (data_import !== frm.doc.name) return;
|
||||
frappe.model.clear_doc('Data Import Beta', frm.doc.name);
|
||||
frappe.model.with_doc('Data Import Beta', frm.doc.name).then(() => {
|
||||
frm.refresh();
|
||||
});
|
||||
});
|
||||
frappe.realtime.on('data_import_progress', data => {
|
||||
if (data.data_import !== frm.doc.name) {
|
||||
return;
|
||||
}
|
||||
let percent = Math.floor((data.current * 100) / data.total);
|
||||
let seconds = Math.floor(data.eta);
|
||||
let minutes = Math.floor(data.eta / 60);
|
||||
let eta_message =
|
||||
seconds < 60
|
||||
? __('About {0} seconds remaining', [seconds])
|
||||
: minutes === 1
|
||||
? __('About {0} minute remaining', [minutes])
|
||||
: __('About {0} minutes remaining', [minutes]);
|
||||
|
||||
let message;
|
||||
if (data.success) {
|
||||
let message_args = [data.current, data.total, eta_message];
|
||||
message =
|
||||
frm.doc.import_type === 'Insert New Records'
|
||||
? __('Importing {0} of {1}, {2}', message_args)
|
||||
: __('Updating {0} of {1}, {2}', message_args);
|
||||
}
|
||||
if (data.skipping) {
|
||||
message = __('Skipping {0} of {1}, {2}', [data.current, data.total, eta_message]);
|
||||
}
|
||||
frm.dashboard.show_progress(__('Import Progress'), percent, message);
|
||||
frm.page.set_indicator(__('In Progress'), 'orange');
|
||||
|
||||
// hide progress when complete
|
||||
if (data.current === data.total) {
|
||||
setTimeout(() => {
|
||||
frm.dashboard.hide();
|
||||
frm.refresh();
|
||||
}, 2000);
|
||||
}
|
||||
});
|
||||
|
||||
frm.set_query('reference_doctype', () => {
|
||||
return {
|
||||
filters: {
|
||||
allow_import: 1
|
||||
}
|
||||
};
|
||||
});
|
||||
|
||||
frm.get_field('import_file').df.options = {
|
||||
restrictions: {
|
||||
allowed_file_types: ['.csv', '.xls', '.xlsx']
|
||||
}
|
||||
};
|
||||
},
|
||||
|
||||
refresh(frm) {
|
||||
frm.page.hide_icon_group();
|
||||
frm.trigger('import_file');
|
||||
frm.trigger('show_import_log');
|
||||
frm.trigger('show_import_warnings');
|
||||
frm.trigger('toggle_submit_after_import');
|
||||
frm.trigger('show_import_status');
|
||||
|
||||
if (frm.doc.status === 'Partial Success') {
|
||||
frm.add_custom_button(__('Export Errored Rows'),
|
||||
() => frm.trigger('export_errored_rows'));
|
||||
}
|
||||
|
||||
if (frm.doc.status.includes('Success')) {
|
||||
frm.add_custom_button(__('Go to {0} List', [frm.doc.reference_doctype]),
|
||||
() => frappe.set_route('List', frm.doc.reference_doctype));
|
||||
}
|
||||
|
||||
if (frm.doc.status !== 'Success') {
|
||||
if (!frm.is_new() && frm.doc.import_file) {
|
||||
let label = frm.doc.status === 'Pending' ? __('Start Import') : __('Retry');
|
||||
frm.page.set_primary_action(label, () => frm.events.start_import(frm));
|
||||
} else {
|
||||
frm.page.set_primary_action(__('Save'), () => frm.save());
|
||||
}
|
||||
}
|
||||
},
	show_import_status(frm) {
		let import_log = JSON.parse(frm.doc.import_log || '[]');
		let successful_records = import_log.filter(log => log.success);
		let failed_records = import_log.filter(log => !log.success);
		if (successful_records.length === 0) return;

		let message;
		if (failed_records.length === 0) {
			let message_args = [successful_records.length];
			if (frm.doc.import_type === 'Insert New Records') {
				message =
					successful_records.length > 1
						? __('Successfully imported {0} records.', message_args)
						: __('Successfully imported {0} record.', message_args);
			} else {
				message =
					successful_records.length > 1
						? __('Successfully updated {0} records.', message_args)
						: __('Successfully updated {0} record.', message_args);
			}
		} else {
			let message_args = [successful_records.length, import_log.length];
			if (frm.doc.import_type === 'Insert New Records') {
				message =
					successful_records.length > 1
						? __('Successfully imported {0} records out of {1}.', message_args)
						: __('Successfully imported {0} record out of {1}.', message_args);
			} else {
				message =
					successful_records.length > 1
						? __('Successfully updated {0} records out of {1}.', message_args)
						: __('Successfully updated {0} record out of {1}.', message_args);
			}
		}
		frm.dashboard.set_headline(message);
	},

	start_import(frm) {
		frm.call({
			doc: frm.doc,
			method: 'start_import',
			btn: frm.page.btn_primary
		});
	},
	download_template(frm) {
		if (frm.data_exporter && frm.data_exporter.doctype === frm.doc.reference_doctype) {
			frm.data_exporter.dialog.show();
			set_export_records();
		} else {
			frappe.require('/assets/js/data_import_tools.min.js', () => {
				frm.data_exporter = new frappe.data_import.DataExporter(frm.doc.reference_doctype);
				set_export_records();
			});
		}

		function set_export_records() {
			if (frm.doc.import_type === 'Insert New Records') {
				frm.data_exporter.dialog.set_value('export_records', 'blank_template');
			} else {
				frm.data_exporter.dialog.set_value('export_records', 'all');
			}
		}
	},
	reference_doctype(frm) {
		frm.trigger('toggle_submit_after_import');
	},

	toggle_submit_after_import(frm) {
		frm.toggle_display('submit_after_import', false);
		let doctype = frm.doc.reference_doctype;
		if (doctype) {
			frappe.model.with_doctype(doctype, () => {
				let meta = frappe.get_meta(doctype);
				frm.toggle_display('submit_after_import', meta.is_submittable);
			});
		}
	},

	import_file(frm) {
		frm.toggle_display('section_import_preview', frm.doc.import_file);
		if (!frm.doc.import_file) {
			frm.get_field('import_preview').$wrapper.empty();
			return;
		}

		// load import preview
		frm.get_field('import_preview').$wrapper.empty();
		$('<span class="text-muted">')
			.html(__('Loading import file...'))
			.appendTo(frm.get_field('import_preview').$wrapper);

		frm.call({
			doc: frm.doc,
			method: 'get_preview_from_template',
			error_handlers: {
				TimestampMismatchError() {
					// ignore this error
				}
			}
		}).then(r => {
			let preview_data = r.message;
			frm.events.show_import_preview(frm, preview_data);
			frm.events.show_import_warnings(frm, preview_data);
		});
	},
	show_import_preview(frm, preview_data) {
		let import_log = JSON.parse(frm.doc.import_log || '[]');

		if (frm.import_preview && frm.import_preview.doctype === frm.doc.reference_doctype) {
			frm.import_preview.preview_data = preview_data;
			frm.import_preview.import_log = import_log;
			frm.import_preview.refresh();
			return;
		}

		frappe.require('/assets/js/data_import_tools.min.js', () => {
			frm.import_preview = new frappe.data_import.ImportPreview({
				wrapper: frm.get_field('import_preview').$wrapper,
				doctype: frm.doc.reference_doctype,
				preview_data,
				import_log,
				frm,
				events: {
					remap_column(changed_map) {
						let template_options = JSON.parse(frm.doc.template_options || '{}');
						template_options.remap_column = template_options.remap_column || {};
						Object.assign(template_options.remap_column, changed_map);
						frm.set_value('template_options', JSON.stringify(template_options));
						frm.save().then(() => frm.trigger('import_file'));
					}
				}
			});
		});
	},
	export_errored_rows(frm) {
		open_url_post('/api/method/frappe.core.doctype.data_import_beta.data_import_beta.download_errored_template', {
			data_import_name: frm.doc.name
		});
	},
	show_import_warnings(frm, preview_data) {
		let warnings = JSON.parse(frm.doc.template_warnings || '[]');
		warnings = warnings.concat(preview_data.warnings || []);

		frm.toggle_display('import_warnings_section', warnings.length > 0);
		if (warnings.length === 0) {
			frm.get_field('import_warnings').$wrapper.html('');
			return;
		}

		// group warnings by row
		let warnings_by_row = {};
		let other_warnings = [];
		for (let warning of warnings) {
			if (warning.row) {
				warnings_by_row[warning.row] = warnings_by_row[warning.row] || [];
				warnings_by_row[warning.row].push(warning);
			} else {
				other_warnings.push(warning);
			}
		}

		let html = '';
		html += Object.keys(warnings_by_row).map(row_number => {
			let message = warnings_by_row[row_number]
				.map(w => {
					if (w.field) {
						return `<li>${w.field.label}: ${w.message}</li>`;
					}
					return `<li>${w.message}</li>`;
				})
				.join('');
			return `
				<div class="alert border" data-row="${row_number}">
					<div class="uppercase">${__('Row {0}', [row_number])}</div>
					<div class="body"><ul>${message}</ul></div>
				</div>
			`;
		}).join('');

		html += other_warnings
			.map(warning => {
				let header = '';
				if (warning.col) {
					header = __('Column {0}', [warning.col]);
				}
				return `
					<div class="alert border" data-col="${warning.col}">
						<div class="uppercase">${header}</div>
						<div class="body">${warning.message}</div>
					</div>
				`;
			})
			.join('');
		frm.get_field('import_warnings').$wrapper.html(`
			<div class="row">
				<div class="col-sm-6 warnings text-muted">${html}</div>
			</div>
		`);
	},
	show_import_log(frm) {
		let import_log = JSON.parse(frm.doc.import_log || '[]');
		let logs = import_log;
		frm.toggle_display('import_log', false);
		frm.toggle_display('import_log_section', logs.length > 0);

		if (logs.length === 0) {
			frm.get_field('import_log_preview').$wrapper.empty();
			return;
		}

		let rows = logs
			.map(log => {
				let html;
				if (log.success) {
					html = __('Successfully imported {0}', [
						`<span class="underline">${frappe.utils.get_form_link(
							frm.doc.reference_doctype,
							log.docname,
							true
						)}</span>`
					]);
				} else {
					let messages = log.messages
						.map(JSON.parse)
						.map(m => {
							let title = m.title ? `<strong>${m.title}</strong>` : '';
							let message = m.message ? `<div>${m.message}</div>` : '';
							return title + message;
						})
						.join('');
					let id = frappe.dom.get_unique_id();
					html = `${messages}
						<button class="btn btn-default btn-xs margin-top" type="button" data-toggle="collapse" data-target="#${id}" aria-expanded="false" aria-controls="${id}">
							${__('Show Traceback')}
						</button>
						<div class="collapse margin-top" id="${id}">
							<div class="well">
								<pre>${log.exception}</pre>
							</div>
						</div>`;
				}
				let indicator_color = log.success ? 'green' : 'red';
				let title = log.success ? __('Success') : __('Failure');
				return `<tr>
					<td>${log.row_indexes.join(', ')}</td>
					<td>
						<div class="indicator ${indicator_color}">${title}</div>
					</td>
					<td>
						${html}
					</td>
				</tr>`;
			})
			.join('');

		frm.get_field('import_log_preview').$wrapper.html(`
			<table class="table table-bordered">
				<tr class="text-muted">
					<th width="10%">${__('Row Number')}</th>
					<th width="10%">${__('Status')}</th>
					<th width="80%">${__('Message')}</th>
				</tr>
				${rows}
			</table>
		`);
	},
	show_missing_link_values(frm, missing_link_values) {
		let can_be_created_automatically = missing_link_values.every(
			d => d.has_one_mandatory_field
		);

		let html = missing_link_values
			.map(d => {
				let doctype = d.doctype;
				let values = d.missing_values;
				return `
					<h5>${doctype}</h5>
					<ul>${values.map(v => `<li>${v}</li>`).join('')}</ul>
				`;
			})
			.join('');

		if (can_be_created_automatically) {
			let message = __('There are some linked records which need to be created before we can import your file. Do you want to create the following missing records automatically?');
			frappe.confirm(message + html, () => {
				frm.call('create_missing_link_values', {
					missing_link_values
				}).then(r => {
					let records = r.message;
					frappe.msgprint(
						__('Created {0} records successfully.', [records.length])
					);
				});
			});
		} else {
			frappe.msgprint(
				__('The following records need to be created before we can import your file.') + html
			);
		}
	}
});
167	frappe/core/doctype/data_import_beta/data_import_beta.json	Normal file
@@ -0,0 +1,167 @@
{
 "autoname": "format:{reference_doctype} Import on {creation}",
 "beta": 1,
 "creation": "2019-08-04 14:16:08.318714",
 "doctype": "DocType",
 "editable_grid": 1,
 "engine": "InnoDB",
 "field_order": [
  "reference_doctype",
  "import_type",
  "download_template",
  "import_file",
  "column_break_5",
  "status",
  "section_break_7",
  "submit_after_import",
  "mute_emails",
  "template_options",
  "section_import_preview",
  "import_preview",
  "import_warnings_section",
  "template_warnings",
  "import_warnings",
  "import_log_section",
  "import_log",
  "import_log_preview"
 ],
 "fields": [
  {
   "fieldname": "reference_doctype",
   "fieldtype": "Link",
   "in_list_view": 1,
   "label": "Document Type",
   "options": "DocType",
   "reqd": 1,
   "set_only_once": 1
  },
  {
   "fieldname": "import_type",
   "fieldtype": "Select",
   "in_list_view": 1,
   "label": "Import Type",
   "options": "\nInsert New Records\nUpdate Existing Records",
   "reqd": 1,
   "set_only_once": 1
  },
  {
   "depends_on": "eval:!doc.__islocal",
   "fieldname": "import_file",
   "fieldtype": "Attach",
   "in_list_view": 1,
   "label": "Import File"
  },
  {
   "fieldname": "import_preview",
   "fieldtype": "HTML",
   "label": "Import Preview"
  },
  {
   "fieldname": "section_import_preview",
   "fieldtype": "Section Break",
   "label": "Preview"
  },
  {
   "fieldname": "column_break_5",
   "fieldtype": "Column Break"
  },
  {
   "collapsible": 1,
   "depends_on": "eval:!doc.__islocal",
   "fieldname": "section_break_7",
   "fieldtype": "Section Break",
   "label": "Import Options"
  },
  {
   "fieldname": "template_options",
   "fieldtype": "Code",
   "hidden": 1,
   "label": "Template Options",
   "options": "JSON",
   "read_only": 1
  },
  {
   "fieldname": "import_log",
   "fieldtype": "Code",
   "label": "Import Log",
   "options": "JSON"
  },
  {
   "fieldname": "import_log_section",
   "fieldtype": "Section Break",
   "label": "Import Log"
  },
  {
   "fieldname": "import_log_preview",
   "fieldtype": "HTML",
   "label": "Import Log Preview"
  },
  {
   "default": "Pending",
   "fieldname": "status",
   "fieldtype": "Select",
   "hidden": 1,
   "label": "Status",
   "options": "Pending\nSuccess\nPartial Success",
   "read_only": 1
  },
  {
   "fieldname": "template_warnings",
   "fieldtype": "Code",
   "hidden": 1,
   "label": "Template Warnings",
   "options": "JSON"
  },
  {
   "default": "0",
   "fieldname": "submit_after_import",
   "fieldtype": "Check",
   "label": "Submit After Import"
  },
  {
   "fieldname": "import_warnings_section",
   "fieldtype": "Section Break",
   "label": "Warnings"
  },
  {
   "fieldname": "import_warnings",
   "fieldtype": "HTML",
   "label": "Import Warnings"
  },
  {
   "depends_on": "reference_doctype",
   "fieldname": "download_template",
   "fieldtype": "Button",
   "label": "Download Template"
  },
  {
   "default": "0",
   "fieldname": "mute_emails",
   "fieldtype": "Check",
   "label": "Don't Send Emails"
  }
 ],
 "hide_toolbar": 1,
 "modified": "2019-09-28 13:54:35.061730",
 "modified_by": "Administrator",
 "module": "Core",
 "name": "Data Import Beta",
 "owner": "Administrator",
 "permissions": [
  {
   "create": 1,
   "delete": 1,
   "email": 1,
   "export": 1,
   "print": 1,
   "read": 1,
   "report": 1,
   "role": "System Manager",
   "share": 1,
   "write": 1
  }
 ],
 "sort_field": "modified",
 "sort_order": "DESC",
 "track_changes": 1
}
99	frappe/core/doctype/data_import_beta/data_import_beta.py	Normal file
@@ -0,0 +1,99 @@
# -*- coding: utf-8 -*-
# Copyright (c) 2019, Frappe Technologies and contributors
# For license information, please see license.txt

from __future__ import unicode_literals
import frappe
from frappe.model.document import Document
from frappe.core.doctype.data_import.importer_new import Importer
from frappe.core.doctype.data_import.exporter_new import Exporter
from frappe.core.page.background_jobs.background_jobs import get_info
from frappe.utils.background_jobs import enqueue
from frappe import _


class DataImportBeta(Document):
	def validate(self):
		doc_before_save = self.get_doc_before_save()
		if not self.import_file or (
			doc_before_save and doc_before_save.import_file != self.import_file
		):
			self.template_options = ""
			self.template_warnings = ""

		if self.import_file:
			# validate template
			self.get_importer()

	def get_preview_from_template(self):
		if not self.import_file:
			return

		i = self.get_importer()
		return i.get_data_for_import_preview()

	def start_import(self):
		if frappe.utils.scheduler.is_scheduler_inactive():
			frappe.throw(
				_("Scheduler is inactive. Cannot import data."), title=_("Scheduler Inactive")
			)

		enqueued_jobs = [d.get("job_name") for d in get_info()]

		if self.name not in enqueued_jobs:
			enqueue(
				start_import,
				queue="default",
				timeout=6000,
				event="data_import",
				job_name=self.name,
				data_import=self.name,
				now=frappe.conf.developer_mode or frappe.flags.in_test,
			)

	def export_errored_rows(self):
		return self.get_importer().export_errored_rows()

	def get_importer(self):
		return Importer(self.reference_doctype, data_import=self)


def start_import(data_import):
	"""This method runs in a background job"""
	data_import = frappe.get_doc("Data Import Beta", data_import)
	i = Importer(data_import.reference_doctype, data_import=data_import)
	return i.import_data()


@frappe.whitelist()
def download_template(
	doctype, export_fields=None, export_records=None, export_filters=None, file_type="CSV"
):
	"""
	Download template from Exporter
		:param doctype: Document Type
		:param export_fields: Fields to export as dict {'Sales Invoice': ['name', 'customer'], 'Sales Invoice Item': ['item_code']}
		:param export_records: One of 'all', 'by_filter', 'blank_template'
		:param export_filters: Filter dict
		:param file_type: File type to export into
	"""

	export_fields = frappe.parse_json(export_fields)
	export_filters = frappe.parse_json(export_filters)
	export_data = export_records != "blank_template"

	e = Exporter(
		doctype,
		export_fields=export_fields,
		export_data=export_data,
		export_filters=export_filters,
		file_type=file_type,
		export_page_length=5 if export_records == "5_records" else None,
	)
	e.build_response()


@frappe.whitelist()
def download_errored_template(data_import_name):
	data_import = frappe.get_doc("Data Import Beta", data_import_name)
	data_import.export_errored_rows()
@@ -0,0 +1,19 @@
frappe.listview_settings['Data Import Beta'] = {
	get_indicator: function(doc) {
		var colors = {
			"Pending": "orange",
			"Partial Success": "orange",
			"Success": "green",
		};
		return [__(doc.status), colors[doc.status], "status,=," + doc.status];
	},
	formatters: {
		import_type(value) {
			return {
				'Insert New Records': __('Insert'),
				'Update Existing Records': __('Update')
			}[value];
		}
	},
	hide_name_column: true
};
@@ -26,3 +26,8 @@ class DocField(Document):
 		}, 'options')

 		return link_doctype
+
+	def get_select_options(self):
+		if self.fieldtype == 'Select':
+			options = self.options or ''
+			return [d for d in options.split('\n') if d]
@@ -312,7 +312,6 @@ class DocType(Document):
 		clear_linked_doctype_cache()

-
 	def delete_duplicate_custom_fields(self):
 		if not (frappe.db.table_exists(self.name) and frappe.db.table_exists("Custom Field")):
 			return
@@ -13,7 +13,7 @@
    "fieldname": "link_doctype",
    "fieldtype": "Link",
    "in_list_view": 1,
-   "label": "Link DocType",
+   "label": "Link Document Type",
    "options": "DocType",
    "reqd": 1
   },
@@ -34,7 +34,7 @@
   }
  ],
  "istable": 1,
- "modified": "2019-05-16 19:54:31.400026",
+ "modified": "2019-10-10 22:05:54.736093",
  "modified_by": "Administrator",
  "module": "Core",
  "name": "Dynamic Link",
@@ -87,7 +87,7 @@ def create_json_gz_file(data, dt, dn):
 		"attached_to_name": dn,
 		"content": compressed_content
 	})
-	_file.save()
+	_file.save(ignore_permissions=True)


 @frappe.whitelist()
@@ -1,58 +0,0 @@
{
 "allow_copy": 0,
 "allow_import": 0,
 "allow_rename": 0,
 "autoname": "",
 "creation": "2016-05-25 09:43:44.767581",
 "custom": 0,
 "docstatus": 0,
 "doctype": "DocType",
 "document_type": "",
 "fields": [
  {
   "allow_on_submit": 0,
   "bold": 0,
   "collapsible": 0,
   "fieldname": "tag_name",
   "fieldtype": "Data",
   "hidden": 0,
   "ignore_user_permissions": 0,
   "ignore_xss_filter": 0,
   "in_filter": 0,
   "in_list_view": 1,
   "label": "Tags",
   "length": 0,
   "no_copy": 0,
   "permlevel": 0,
   "precision": "",
   "print_hide": 0,
   "print_hide_if_no_value": 0,
   "read_only": 0,
   "report_hide": 0,
   "reqd": 0,
   "search_index": 0,
   "set_only_once": 0,
   "unique": 0
  }
 ],
 "hide_heading": 0,
 "hide_toolbar": 0,
 "idx": 0,
 "in_create": 0,
 "is_submittable": 0,
 "issingle": 0,
 "istable": 1,
 "max_attachments": 0,
 "modified": "2016-05-31 08:29:01.773065",
 "modified_by": "Administrator",
 "module": "Core",
 "name": "Tag",
 "name_case": "",
 "owner": "Administrator",
 "permissions": [],
 "read_only": 0,
 "read_only_onload": 0,
 "sort_field": "modified",
 "sort_order": "DESC"
}
@@ -1,9 +0,0 @@
// Copyright (c) 2016, Frappe Technologies and contributors
// For license information, please see license.txt
frappe.ui.form.on('Tag', {
	tag_name: function(frm) {
		for (var i = 0; i < frm.doc.tags.length; i++) {
			frm.doc.tags[i].tag_name = toTitle(frm.doc.tags[i].tag_name);
		}
	}
});
@@ -1,147 +0,0 @@
{
 "allow_copy": 0,
 "allow_import": 1,
 "allow_rename": 0,
 "autoname": "field:category_name",
 "beta": 0,
 "creation": "2016-05-25 09:49:07.125394",
 "custom": 0,
 "docstatus": 0,
 "doctype": "DocType",
 "document_type": "",
 "editable_grid": 0,
 "fields": [
  {
   "allow_on_submit": 0,
   "bold": 0,
   "collapsible": 0,
   "columns": 0,
   "fieldname": "category_name",
   "fieldtype": "Data",
   "hidden": 0,
   "ignore_user_permissions": 0,
   "ignore_xss_filter": 0,
   "in_filter": 0,
   "in_list_view": 0,
   "in_standard_filter": 0,
   "label": "Category Name",
   "length": 0,
   "no_copy": 0,
   "permlevel": 0,
   "precision": "",
   "print_hide": 0,
   "print_hide_if_no_value": 0,
   "read_only": 0,
   "remember_last_selected_value": 0,
   "report_hide": 0,
   "reqd": 0,
   "search_index": 0,
   "set_only_once": 0,
   "unique": 0
  },
  {
   "allow_on_submit": 0,
   "bold": 0,
   "collapsible": 0,
   "columns": 0,
   "fieldname": "tags",
   "fieldtype": "Table",
   "hidden": 0,
   "ignore_user_permissions": 0,
   "ignore_xss_filter": 0,
   "in_filter": 0,
   "in_list_view": 0,
   "in_standard_filter": 0,
   "label": "Tags",
   "length": 0,
   "no_copy": 0,
   "options": "Tag",
   "permlevel": 0,
   "precision": "",
   "print_hide": 0,
   "print_hide_if_no_value": 0,
   "read_only": 0,
   "remember_last_selected_value": 0,
   "report_hide": 0,
   "reqd": 0,
   "search_index": 0,
   "set_only_once": 0,
   "unique": 0
  },
  {
   "allow_on_submit": 0,
   "bold": 0,
   "collapsible": 0,
   "columns": 0,
   "fieldname": "tagdocs",
   "fieldtype": "Table",
   "hidden": 0,
   "ignore_user_permissions": 0,
   "ignore_xss_filter": 0,
   "in_filter": 0,
   "in_list_view": 0,
   "in_standard_filter": 0,
   "label": "Doctypes",
   "length": 0,
   "no_copy": 0,
   "options": "Tag Doc Category",
   "permlevel": 0,
   "precision": "",
   "print_hide": 0,
   "print_hide_if_no_value": 0,
   "read_only": 0,
   "remember_last_selected_value": 0,
   "report_hide": 0,
   "reqd": 0,
   "search_index": 0,
   "set_only_once": 0,
   "unique": 0
  }
 ],
 "hide_heading": 0,
 "hide_toolbar": 0,
 "idx": 0,
 "image_view": 0,
 "in_create": 0,
 "is_submittable": 0,
 "issingle": 0,
 "istable": 0,
 "max_attachments": 0,
 "modified": "2016-12-29 14:40:37.489085",
 "modified_by": "Administrator",
 "module": "Core",
 "name": "Tag Category",
 "name_case": "Title Case",
 "owner": "Administrator",
 "permissions": [
  {
   "amend": 0,
   "apply_user_permissions": 0,
   "cancel": 0,
   "create": 1,
   "delete": 1,
   "email": 0,
   "export": 1,
   "if_owner": 0,
   "import": 1,
   "is_custom": 0,
   "permlevel": 0,
   "print": 0,
   "read": 1,
   "report": 0,
   "role": "System Manager",
   "set_user_permissions": 0,
   "share": 0,
   "submit": 0,
   "write": 1
  }
 ],
 "quick_entry": 0,
 "read_only": 0,
 "read_only_onload": 0,
 "sort_field": "modified",
 "sort_order": "DESC",
 "track_changes": 1,
 "track_seen": 0
}
@@ -1,12 +0,0 @@
# -*- coding: utf-8 -*-
# Copyright (c) 2015, Frappe Technologies and Contributors
# See license.txt
from __future__ import unicode_literals

import frappe
import unittest

# test_records = frappe.get_test_records('Tag Categories')

class TestTagCategories(unittest.TestCase):
	pass
@@ -1,58 +0,0 @@
{
 "allow_copy": 0,
 "allow_import": 0,
 "allow_rename": 0,
 "creation": "2016-05-25 13:09:20.996154",
 "custom": 0,
 "docstatus": 0,
 "doctype": "DocType",
 "document_type": "",
 "fields": [
  {
   "allow_on_submit": 0,
   "bold": 0,
   "collapsible": 0,
   "fieldname": "tagdoc",
   "fieldtype": "Link",
   "hidden": 0,
   "ignore_user_permissions": 0,
   "ignore_xss_filter": 0,
   "in_filter": 0,
   "in_list_view": 1,
   "label": "Doctype to Assign Tags",
   "length": 0,
   "no_copy": 0,
   "options": "DocType",
   "permlevel": 0,
   "precision": "",
   "print_hide": 0,
   "print_hide_if_no_value": 0,
   "read_only": 0,
   "report_hide": 0,
   "reqd": 0,
   "search_index": 0,
   "set_only_once": 0,
   "unique": 0
  }
 ],
 "hide_heading": 0,
 "hide_toolbar": 0,
 "idx": 0,
 "in_create": 0,
 "is_submittable": 0,
 "issingle": 0,
 "istable": 1,
 "max_attachments": 0,
 "modified": "2016-05-30 15:04:45.454688",
 "modified_by": "Administrator",
 "module": "Core",
 "name": "Tag Doc Category",
 "name_case": "",
 "owner": "Administrator",
 "permissions": [],
 "read_only": 0,
 "read_only_onload": 0,
 "sort_field": "modified",
 "sort_order": "DESC"
}
@@ -38,7 +38,6 @@
   "mute_sounds",
   "change_password",
   "new_password",
-  "send_password_update_notification",
   "logout_all_sessions",
   "reset_password_key",
   "last_password_reset_date",
@@ -299,13 +298,6 @@
    "label": "Set New Password",
    "no_copy": 1
   },
-  {
-   "default": "0",
-   "depends_on": "eval:!doc.__islocal",
-   "fieldname": "send_password_update_notification",
-   "fieldtype": "Check",
-   "label": "Send Password Update Notification"
-  },
   {
    "default": "0",
    "fieldname": "logout_all_sessions",
@@ -593,7 +585,7 @@
  "idx": 413,
  "image_field": "user_image",
  "max_attachments": 5,
- "modified": "2019-08-09 10:34:56.912283",
+ "modified": "2019-09-18 14:14:01.233124",
  "modified_by": "Administrator",
  "module": "Core",
  "name": "User",
@@ -153,10 +153,6 @@ class User(Document):
 		if new_password and not self.flags.in_insert:
 			_update_password(user=self.name, pwd=new_password, logout_all_sessions=self.logout_all_sessions)

-			if self.send_password_update_notification and self.enabled:
-				self.password_update_mail(new_password)
-				frappe.msgprint(_("New password emailed"))
-
 	def set_system_user(self):
 		'''Set as System User if any of the given roles has desk_access'''
 		if self.has_desk_access() or self.name == 'Administrator':
@@ -250,10 +246,6 @@ class User(Document):
 		self.send_login_mail(_("Password Reset"),
 			"password_reset", {"link": link}, now=True)

-	def password_update_mail(self, password):
-		self.send_login_mail(_("Password Update"),
-			"password_update", {"new_password": password}, now=True)
-
 	def send_welcome_mail_to_user(self):
 		from frappe.utils import get_url
 		link = self.reset_password()
@@ -502,10 +494,7 @@ class User(Document):
 		if not self.restrict_ip:
 			return

-		ip_list = self.restrict_ip.replace(",", "\n").split('\n')
-		ip_list = [i.strip() for i in ip_list]
-
-		return ip_list
+		return [i.strip() for i in self.restrict_ip.split(",")]

 @frappe.whitelist()
 def get_timezones():
@@ -1035,9 +1024,10 @@ def update_roles(role_profile):
 	user.add_roles(*roles)

 def create_contact(user, ignore_links=False, ignore_mandatory=False):
+	from frappe.contacts.doctype.contact.contact import get_contact_name
 	if user.name in ["Administrator", "Guest"]: return

-	if not frappe.db.get_value("Contact", {"email_id": user.email}):
+	if not get_contact_name(user.email):
 		contact = frappe.get_doc({
 			"doctype": "Contact",
 			"first_name": user.first_name,
@@ -1047,16 +1037,15 @@ def create_contact(user, ignore_links=False, ignore_mandatory=False):
 		})

 		if user.email:
-			contact.add_email(user.email)
+			contact.add_email(user.email, is_primary=True)

 		if user.phone:
-			contact.add_phone(user.phone)
+			contact.add_phone(user.phone, is_primary_phone=True)

 		if user.mobile_no:
-			contact.add_phone(user.mobile_no)
+			contact.add_phone(user.mobile_no, is_primary_mobile_no=True)
 	contact.insert(ignore_permissions=True, ignore_links=ignore_links, ignore_mandatory=ignore_mandatory)


 @frappe.whitelist()
 def generate_keys(user):
 	"""
@@ -66,7 +66,7 @@ def get_user_permissions(user=None):
 	if not user:
 		user = frappe.session.user

-	if user == "Administrator":
+	if not user or user == "Administrator":
 		return {}

 	cached_user_permissions = frappe.cache().hget("user_permissions", user)
@@ -143,7 +143,11 @@ def get_applicable_for_doctype_list(doctype, txt, searchfield, start, page_len,
 	return return_list

 def get_permitted_documents(doctype):
-	return [d.get('doc') for d in get_user_permissions().get(doctype, []) \
+	''' Returns permitted documents from the given doctype for the session user '''
+	# sort permissions in a way to make the first permission in the list to be default
+	user_perm_list = sorted(get_user_permissions().get(doctype, []), key=lambda x: x.get('is_default'), reverse=True)
+
+	return [d.get('doc') for d in user_perm_list \
 		if d.get('doc')]

 @frappe.whitelist()
@@ -43,7 +43,9 @@ def get_diff(old, new, for_child=False):
 	if not new:
 		return None

-	out = frappe._dict(changed = [], added = [], removed = [], row_changed = [])
+	# capture data import if set
+	data_import = new.flags.via_data_import
+	out = frappe._dict(changed = [], added = [], removed = [], row_changed = [], data_import=data_import)
 	for df in new.meta.fields:
 		if df.fieldtype in no_value_fields and df.fieldtype not in table_fields:
 			continue
@@ -91,4 +93,4 @@ def get_diff(old, new, for_child=False):
 	return None

 def on_doctype_update():
-	frappe.db.add_index("Version", ["ref_doctype", "docname"])
+	frappe.db.add_index("Version", ["ref_doctype", "docname"])
@@ -230,6 +230,7 @@ class DashboardChart {
 			title: this.chart_doc.chart_name,
 			data: this.data,
 			type: chart_type_map[this.chart_doc.type],
+			truncateLegends: 1,
 			colors: [this.chart_doc.color || "light-blue"],
 			axisOptions: {
 				xIsSeries: this.chart_doc.timeseries,
|
||||
|
|
|
|||
|
|
@@ -18,6 +18,7 @@
  "track_changes",
  "track_views",
  "allow_auto_repeat",
  "allow_import",
  "image_view",
  "column_break_5",
  "title_field",
@@ -167,13 +168,19 @@
   "fieldname": "allow_auto_repeat",
   "fieldtype": "Check",
   "label": "Allow Auto Repeat"
  },
  {
   "default": "0",
   "fieldname": "allow_import",
   "fieldtype": "Check",
   "label": "Allow Import (via Data Import Tool)"
  }
 ],
 "hide_toolbar": 1,
 "icon": "fa fa-glass",
 "idx": 1,
 "issingle": 1,
 "modified": "2019-07-01 22:50:50.372465",
 "modified": "2019-10-08 11:16:36.698006",
 "modified_by": "Administrator",
 "module": "Custom",
 "name": "Customize Form",
@@ -30,7 +30,8 @@ doctype_properties = {
    'max_attachments': 'Int',
    'track_changes': 'Check',
    'track_views': 'Check',
    'allow_auto_repeat': 'Check'
    'allow_auto_repeat': 'Check',
    'allow_import': 'Check'
}

docfield_properties = {
@@ -607,7 +607,7 @@ class Database(object):
        """Update multiple values. Alias for `set_value`."""
        return self.set_value(*args, **kwargs)

    def set_value(self, dt, dn, field, val, modified=None, modified_by=None,
    def set_value(self, dt, dn, field, val=None, modified=None, modified_by=None,
        update_modified=True, debug=False):
        """Set a single value in the database, do not call the ORM triggers
        but update the modified timestamp (unless specified not to).
@@ -845,16 +845,23 @@ class Database(object):

    def get_db_table_columns(self, table):
        """Returns list of column names from given table."""
        return [r[0] for r in self.sql('''
            select column_name
            from information_schema.columns
            where table_name = %s ''', table)]
        columns = frappe.cache().hget('table_columns', table)
        if columns is None:
            columns = [r[0] for r in self.sql('''
                select column_name
                from information_schema.columns
                where table_name = %s ''', table)]

            if columns:
                frappe.cache().hset('table_columns', table, columns)

        return columns

    def get_table_columns(self, doctype):
        """Returns list of column names from given doctype."""
        columns = self.get_db_table_columns('tab' + doctype)
        if not columns:
            raise self.TableMissingError
            raise self.TableMissingError('DocType', doctype)
        return columns

    def has_column(self, doctype, column):
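The `get_db_table_columns` change above is a cache-aside pattern: look in the cache first, fall back to the expensive `information_schema` query, and populate the cache only when the lookup actually returned columns (so a missing table is not cached). A minimal standalone sketch, where a plain dict stands in for `frappe.cache()` and `fetch_columns_from_db` is a hypothetical stand-in for the SQL query:

```python
cache = {}

def fetch_columns_from_db(table):
    # stand-in for the information_schema.columns query
    return {"tabNote": ["name", "title", "content"]}.get(table, [])

def get_db_table_columns(table):
    columns = cache.get(table)              # frappe.cache().hget('table_columns', table)
    if columns is None:
        columns = fetch_columns_from_db(table)
        if columns:                         # don't cache the empty result for absent tables
            cache[table] = columns          # frappe.cache().hset('table_columns', table, columns)
    return columns
```

Skipping the `hset` on an empty result is what lets `get_table_columns` still raise `TableMissingError` for tables created later, without a stale negative cache entry.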
@@ -961,6 +968,26 @@ class Database(object):
            frappe.flags.touched_tables = set()
        frappe.flags.touched_tables.update(tables)

    def bulk_insert(self, doctype, fields, values):
        """
        Insert multiple records at a time

        :param doctype: Doctype name
        :param fields: list of fields
        :params values: list of list of values
        """
        insert_list = []
        fields = ", ".join(["`"+field+"`" for field in fields])

        for idx, value in enumerate(values):
            insert_list.append(tuple(value))
            if idx and (idx%10000 == 0 or idx < len(values)-1):
                self.sql("""INSERT INTO `tab{doctype}` ({fields}) VALUES {values}""".format(
                    doctype=doctype,
                    fields=fields,
                    values=", ".join(['%s'] * len(insert_list))
                ), tuple(insert_list))
                insert_list = []

def enqueue_jobs_after_commit():
    if frappe.flags.enqueue_after_commit and len(frappe.flags.enqueue_after_commit) > 0:
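The idea behind `bulk_insert` above is to accumulate rows and issue one multi-row `INSERT ... VALUES` per batch instead of one statement per row. A simplified chunking sketch (the batching condition in the diff is more involved than this; here the executed list stands in for `self.sql()`, and the small chunk size is only for illustration):

```python
def bulk_insert(values, chunk_size=3):
    """Group rows into fixed-size batches; each batch would become one INSERT."""
    executed = []   # stand-in for statements sent via self.sql()
    batch = []
    for row in values:
        batch.append(tuple(row))
        if len(batch) == chunk_size:
            executed.append(list(batch))  # one INSERT ... VALUES (%s), (%s), ... per batch
            batch = []
    if batch:
        executed.append(list(batch))      # flush the final partial batch
    return executed
```

Batching keeps the statement size bounded (10000 rows in the real method) while still amortizing per-statement overhead across many rows.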
@@ -49,6 +49,12 @@ CREATE TABLE `tabDocField` (
    `default` text,
    `description` text,
    `in_list_view` int(1) NOT NULL DEFAULT 0,
    `fetch_if_empty` int(1) NOT NULL DEFAULT 0,
    `in_filter` int(1) NOT NULL DEFAULT 0,
    `remember_last_selected_value` int(1) NOT NULL DEFAULT 0,
    `ignore_xss_filter` int(1) NOT NULL DEFAULT 0,
    `print_hide_if_no_value` int(1) NOT NULL DEFAULT 0,
    `allow_bulk_edit` int(1) NOT NULL DEFAULT 0,
    `in_standard_filter` int(1) NOT NULL DEFAULT 0,
    `in_preview` int(1) NOT NULL DEFAULT 0,
    `read_only` int(1) NOT NULL DEFAULT 0,
@@ -49,6 +49,12 @@ CREATE TABLE "tabDocField" (
    "default" text,
    "description" text,
    "in_list_view" smallint NOT NULL DEFAULT 0,
    "fetch_if_empty" smallint NOT NULL DEFAULT 0,
    "in_filter" smallint NOT NULL DEFAULT 0,
    "remember_last_selected_value" smallint NOT NULL DEFAULT 0,
    "ignore_xss_filter" smallint NOT NULL DEFAULT 0,
    "print_hide_if_no_value" smallint NOT NULL DEFAULT 0,
    "allow_bulk_edit" smallint NOT NULL DEFAULT 0,
    "in_standard_filter" smallint NOT NULL DEFAULT 0,
    "in_preview" smallint NOT NULL DEFAULT 0,
    "read_only" smallint NOT NULL DEFAULT 0,
@@ -33,6 +33,7 @@ class DBTable:
        if self.is_new():
            self.create()
        else:
            frappe.cache().hdel('table_columns', self.table_name)
            self.alter()

    def create(self):
@@ -52,7 +52,7 @@ class Event(Document):
            ["Communication Link", "link_doctype", "=", participant.reference_doctype],
            ["Communication Link", "link_name", "=", participant.reference_docname]
        ]
        comms = frappe.get_list("Communication", filters=filters, fields=["name"])
        comms = frappe.get_all("Communication", filters=filters, fields=["name"])

        if comms:
            for comm in comms:
@@ -0,0 +1,29 @@
{
 "creation": "2019-09-13 21:33:55.551941",
 "doctype": "DocType",
 "editable_grid": 1,
 "engine": "InnoDB",
 "field_order": [
  "document_type"
 ],
 "fields": [
  {
   "fieldname": "document_type",
   "fieldtype": "Link",
   "in_list_view": 1,
   "label": "Document Type",
   "options": "DocType"
  }
 ],
 "istable": 1,
 "modified": "2019-09-18 17:59:44.354052",
 "modified_by": "Administrator",
 "module": "Desk",
 "name": "Global Search DocType",
 "owner": "Administrator",
 "permissions": [],
 "quick_entry": 1,
 "sort_field": "modified",
 "sort_order": "DESC",
 "track_changes": 1
}
@@ -1,11 +1,10 @@
# -*- coding: utf-8 -*-
# Copyright (c) 2015, Frappe Technologies and contributors
# Copyright (c) 2019, Frappe Technologies and contributors
# For license information, please see license.txt

from __future__ import unicode_literals
import frappe
# import frappe
from frappe.model.document import Document

class Tag(Document):
    def validate(self):
        self.tag_name = self.tag_name.title()
class GlobalSearchDocType(Document):
    pass

0  frappe/desk/doctype/global_search_settings/__init__.py  Normal file
@@ -0,0 +1,29 @@
// Copyright (c) 2019, Frappe Technologies and contributors
// For license information, please see license.txt

frappe.ui.form.on('Global Search Settings', {
    refresh: function(frm) {

        frappe.realtime.on('global_search_settings', (data) => {
            if (data.progress) {
                frm.dashboard.show_progress('Setting up Global Search', data.progress / data.total * 100, data.msg);
                if (data.progress === data.total) {
                    frm.dashboard.hide_progress('Setting up Global Search');
                }
            }
        });

        frm.add_custom_button(__("Reset"), function () {
            frappe.call({
                method: "frappe.desk.doctype.global_search_settings.global_search_settings.reset_global_search_settings_doctypes",
                callback: function() {
                    frappe.show_alert({
                        message: __("Global Search Document Types Reset."),
                        indicator: "green"
                    });
                    frm.refresh();
                }
            });
        });
    }
});
@@ -0,0 +1,39 @@
{
 "creation": "2019-09-03 16:08:21.333698",
 "doctype": "DocType",
 "editable_grid": 1,
 "engine": "InnoDB",
 "field_order": [
  "allowed_in_global_search"
 ],
 "fields": [
  {
   "fieldname": "allowed_in_global_search",
   "fieldtype": "Table",
   "label": "Search Priorities",
   "options": "Global Search DocType"
  }
 ],
 "issingle": 1,
 "modified": "2019-10-10 22:05:02.692689",
 "modified_by": "Administrator",
 "module": "Desk",
 "name": "Global Search Settings",
 "owner": "Administrator",
 "permissions": [
  {
   "create": 1,
   "delete": 1,
   "email": 1,
   "print": 1,
   "read": 1,
   "role": "System Manager",
   "share": 1,
   "write": 1
  }
 ],
 "quick_entry": 1,
 "sort_field": "modified",
 "sort_order": "DESC",
 "track_changes": 1
}
@@ -0,0 +1,84 @@
# -*- coding: utf-8 -*-
# Copyright (c) 2019, Frappe Technologies and contributors
# For license information, please see license.txt

from __future__ import unicode_literals
import frappe
from frappe.model.document import Document
from frappe import _

class GlobalSearchSettings(Document):

    def validate(self):
        dts, core_dts, repeated_dts = [], [], []

        for dt in self.allowed_in_global_search:
            if dt.document_type in dts:
                repeated_dts.append(dt.document_type)

            if frappe.get_meta(dt.document_type).module == "Core":
                core_dts.append(dt.document_type)

            dts.append(dt.document_type)

        if core_dts:
            core_dts = (", ".join([frappe.bold(dt) for dt in core_dts]))
            frappe.throw(_("Core Modules {0} cannot be searched in Global Search.").format(core_dts))

        if repeated_dts:
            repeated_dts = (", ".join([frappe.bold(dt) for dt in repeated_dts]))
            frappe.throw(_("Document Type {0} has been repeated.").format(repeated_dts))

def get_doctypes_for_global_search():
    doctypes = frappe.get_list("Global Search DocType", fields=["document_type"], order_by="idx ASC")
    if not doctypes:
        return []

    return [d.document_type for d in doctypes]

@frappe.whitelist()
def reset_global_search_settings_doctypes():
    update_global_search_doctypes()

def update_global_search_doctypes():
    global_search_doctypes = []
    show_message(1, _("Fetching default Global Search documents."))

    installed_apps = [app for app in frappe.get_installed_apps() if app]
    active_domains = [domain for domain in frappe.get_active_domains() if domain]
    active_domains.append("Default")

    for app in installed_apps:
        search_doctypes = frappe.get_hooks(hook="global_search_doctypes", app_name=app)
        if not search_doctypes:
            continue

        for domain in active_domains:
            if search_doctypes.get(domain):
                global_search_doctypes.extend(search_doctypes.get(domain))

    doctype_list = set([dt.name for dt in frappe.get_list("DocType")])
    allowed_in_global_search = []

    for dt in global_search_doctypes:
        if dt.get("index") is not None:
            allowed_in_global_search.insert(dt.get("index"), dt.get("doctype"))
            continue

        allowed_in_global_search.append(dt.get("doctype"))

    show_message(2, _("Setting up Global Search documents."))
    global_search_settings = frappe.get_single("Global Search Settings")
    global_search_settings.allowed_in_global_search = []
    for dt in allowed_in_global_search:
        if dt not in doctype_list:
            continue

        global_search_settings.append("allowed_in_global_search", {
            "document_type": dt
        })
    global_search_settings.save(ignore_permissions=True)
    show_message(3, "Global Search Documents have been reset.")

def show_message(progress, msg):
    frappe.publish_realtime('global_search_settings', {"progress":progress, "total":3, "msg": msg}, user=frappe.session.user)

0  frappe/desk/doctype/tag/__init__.py  Normal file
8  frappe/desk/doctype/tag/tag.js  Normal file
@@ -0,0 +1,8 @@
// Copyright (c) 2019, Frappe Technologies and contributors
// For license information, please see license.txt

frappe.ui.form.on('Tag', {
    // refresh: function(frm) {

    // }
});

49  frappe/desk/doctype/tag/tag.json  Normal file
@@ -0,0 +1,49 @@
{
 "autoname": "Prompt",
 "creation": "2016-05-25 09:43:44.767581",
 "doctype": "DocType",
 "engine": "InnoDB",
 "field_order": [
  "description"
 ],
 "fields": [
  {
   "fieldname": "description",
   "fieldtype": "Small Text",
   "in_list_view": 1,
   "label": "Description"
  }
 ],
 "modified": "2019-09-25 17:47:41.712237",
 "modified_by": "Administrator",
 "module": "Desk",
 "name": "Tag",
 "owner": "Administrator",
 "permissions": [
  {
   "create": 1,
   "delete": 1,
   "email": 1,
   "export": 1,
   "print": 1,
   "read": 1,
   "report": 1,
   "role": "System Manager",
   "share": 1,
   "write": 1
  },
  {
   "create": 1,
   "email": 1,
   "export": 1,
   "print": 1,
   "read": 1,
   "report": 1,
   "role": "All",
   "share": 1,
   "write": 1
  }
 ],
 "sort_field": "modified",
 "sort_order": "DESC"
}

179  frappe/desk/doctype/tag/tag.py  Normal file
@@ -0,0 +1,179 @@
# -*- coding: utf-8 -*-
# Copyright (c) 2019, Frappe Technologies and contributors
# For license information, please see license.txt

from __future__ import unicode_literals
import frappe
from frappe.model.document import Document

class Tag(Document):
    pass

def check_user_tags(dt):
    "if the user does not have a tags column, then it creates one"
    try:
        frappe.db.sql("select `_user_tags` from `tab%s` limit 1" % dt)
    except Exception as e:
        if frappe.db.is_column_missing(e):
            DocTags(dt).setup()

@frappe.whitelist()
def add_tag(tag, dt, dn, color=None):
    "adds a new tag to a record, and creates the Tag master"
    DocTags(dt).add(dn, tag)

    return tag

@frappe.whitelist()
def remove_tag(tag, dt, dn):
    "removes tag from the record"
    DocTags(dt).remove(dn, tag)

@frappe.whitelist()
def get_tagged_docs(doctype, tag):
    frappe.has_permission(doctype, throw=True)

    return frappe.db.sql("""SELECT name
        FROM `tab{0}`
        WHERE _user_tags LIKE '%{1}%'""".format(doctype, tag))

@frappe.whitelist()
def get_tags(doctype, txt):
    tag = frappe.get_list("Tag", filters=[["name", "like", "%{}%".format(txt)]])
    tags = [t.name for t in tag]

    return sorted(filter(lambda t: t and txt.lower() in t.lower(), list(set(tags))))

class DocTags:
    """Tags for a particular doctype"""
    def __init__(self, dt):
        self.dt = dt

    def get_tag_fields(self):
        """returns tag_fields property"""
        return frappe.db.get_value('DocType', self.dt, 'tag_fields')

    def get_tags(self, dn):
        """returns tag for a particular item"""
        return (frappe.db.get_value(self.dt, dn, '_user_tags', ignore=1) or '').strip()

    def add(self, dn, tag):
        """add a new user tag"""
        tl = self.get_tags(dn).split(',')
        if not tag in tl:
            tl.append(tag)
            if not frappe.db.exists("Tag", tag):
                frappe.get_doc({"doctype": "Tag", "name": tag}).insert(ignore_permissions=True)
            self.update(dn, tl)

    def remove(self, dn, tag):
        """remove a user tag"""
        tl = self.get_tags(dn).split(',')
        self.update(dn, filter(lambda x:x.lower()!=tag.lower(), tl))

    def remove_all(self, dn):
        """remove all user tags (call before delete)"""
        self.update(dn, [])

    def update(self, dn, tl):
        """updates the _user_tag column in the table"""

        if not tl:
            tags = ''
        else:
            tl = list(set(filter(lambda x: x, tl)))
            tags = ',' + ','.join(tl)
        try:
            frappe.db.sql("update `tab%s` set _user_tags=%s where name=%s" % \
                (self.dt,'%s','%s'), (tags , dn))
            doc= frappe.get_doc(self.dt, dn)
            update_tags(doc, tags)
        except Exception as e:
            if frappe.db.is_column_missing(e):
                if not tags:
                    # no tags, nothing to do
                    return

                self.setup()
                self.update(dn, tl)
            else: raise

    def setup(self):
        """adds the _user_tags column if not exists"""
        from frappe.database.schema import add_column
        add_column(self.dt, "_user_tags", "Data")

def delete_tags_for_document(doc):
    """
    Delete the Tag Link entry of a document that has
    been deleted
    :param doc: Deleted document
    """
    if not frappe.db.table_exists("Tag Link"):
        return

    frappe.db.sql("""DELETE FROM `tabTag Link` WHERE `document_type`=%s AND `document_name`=%s""", (doc.doctype, doc.name))

def update_tags(doc, tags):
    """
    Adds tags for documents
    :param doc: Document to be added to global tags
    """

    new_tags = list(set([tag.strip() for tag in tags.split(",") if tag]))

    for tag in new_tags:
        if not frappe.db.exists("Tag Link", {"parenttype": doc.doctype, "parent": doc.name, "tag": tag}):
            frappe.get_doc({
                "doctype": "Tag Link",
                "document_type": doc.doctype,
                "document_name": doc.name,
                "parenttype": doc.doctype,
                "parent": doc.name,
                "title": doc.get_title() or '',
                "tag": tag
            }).insert(ignore_permissions=True)

    existing_tags = [tag.tag for tag in frappe.get_list("Tag Link", filters={
        "document_type": doc.doctype,
        "document_name": doc.name
    }, fields=["tag"])]

    deleted_tags = get_deleted_tags(new_tags, existing_tags)

    if deleted_tags:
        for tag in deleted_tags:
            delete_tag_for_document(doc.doctype, doc.name, tag)

def get_deleted_tags(new_tags, existing_tags):

    return list(set(existing_tags) - set(new_tags))

def delete_tag_for_document(dt, dn, tag):
    frappe.db.sql("""DELETE FROM `tabTag Link` WHERE `document_type`=%s AND `document_name`=%s AND tag=%s""", (dt, dn, tag))

@frappe.whitelist()
def get_documents_for_tag(tag):
    """
    Search for given text in Tag Link
    :param tag: tag to be searched
    """
    # remove hastag `#` from tag
    tag = tag[1:]
    results = []

    result = frappe.get_list("Tag Link", filters={"tag": tag}, fields=["document_type", "document_name", "title", "tag"])

    for res in result:
        results.append({
            "doctype": res.document_type,
            "name": res.document_name,
            "content": res.title
        })

    print(results)
    return results

@frappe.whitelist()
def get_tags_list_for_awesomebar():
    return [t.name for t in frappe.get_list("Tag")]
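`DocTags` above stores a record's tags in a single `_user_tags` column as a comma-prefixed string (e.g. `",Urgent,Client"`), which is what makes the `LIKE '%tag%'` query work. A sketch of that serialization round-trip; unlike the original, which deduplicates via `set` (losing order), this version keeps list order for readability:

```python
def serialize_tags(tag_list):
    """Render a tag list in the _user_tags format: ',' + comma-joined tags."""
    tag_list = list(filter(None, tag_list))   # drop empty strings, as the original does
    return "," + ",".join(tag_list) if tag_list else ""

def parse_tags(value):
    """Split a _user_tags value back into a list, ignoring the leading comma."""
    return [t for t in (value or "").split(",") if t]
```

The leading comma means every stored tag is preceded by a delimiter, so a substring match like `LIKE '%,Urgent%'` can anchor on tag boundaries.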
10  frappe/desk/doctype/tag/test_tag.py  Normal file
@@ -0,0 +1,10 @@
# -*- coding: utf-8 -*-
# Copyright (c) 2019, Frappe Technologies and Contributors
# See license.txt
from __future__ import unicode_literals

# import frappe
import unittest

class TestTag(unittest.TestCase):
    pass

0  frappe/desk/doctype/tag_link/__init__.py  Normal file
8  frappe/desk/doctype/tag_link/tag_link.js  Normal file
@@ -0,0 +1,8 @@
// Copyright (c) 2019, Frappe Technologies and contributors
// For license information, please see license.txt

frappe.ui.form.on('Tag Link', {
    // refresh: function(frm) {

    // }
});

70  frappe/desk/doctype/tag_link/tag_link.json  Normal file
@@ -0,0 +1,70 @@
{
 "creation": "2019-09-24 13:25:36.435685",
 "doctype": "DocType",
 "editable_grid": 1,
 "engine": "InnoDB",
 "field_order": [
  "document_type",
  "document_name",
  "tag",
  "title"
 ],
 "fields": [
  {
   "fieldname": "title",
   "fieldtype": "Data",
   "label": "Document Title",
   "read_only": 1
  },
  {
   "fieldname": "tag",
   "fieldtype": "Link",
   "in_list_view": 1,
   "in_standard_filter": 1,
   "label": "Document Tag",
   "options": "Tag",
   "read_only": 1
  },
  {
   "fieldname": "document_type",
   "fieldtype": "Link",
   "in_list_view": 1,
   "in_standard_filter": 1,
   "label": "Document Type",
   "options": "DocType",
   "read_only": 1
  },
  {
   "fieldname": "document_name",
   "fieldtype": "Dynamic Link",
   "in_list_view": 1,
   "in_standard_filter": 1,
   "label": "Document Name",
   "options": "document_type",
   "read_only": 1
  }
 ],
 "modified": "2019-10-03 16:42:35.932409",
 "modified_by": "Administrator",
 "module": "Desk",
 "name": "Tag Link",
 "owner": "Administrator",
 "permissions": [
  {
   "create": 1,
   "delete": 1,
   "email": 1,
   "export": 1,
   "print": 1,
   "read": 1,
   "report": 1,
   "role": "System Manager",
   "share": 1,
   "write": 1
  }
 ],
 "read_only": 1,
 "sort_field": "modified",
 "sort_order": "DESC",
 "track_changes": 1
}
@@ -1,10 +1,10 @@
# -*- coding: utf-8 -*-
# Copyright (c) 2015, Frappe Technologies and contributors
# Copyright (c) 2019, Frappe Technologies and contributors
# For license information, please see license.txt

from __future__ import unicode_literals
import frappe
# import frappe
from frappe.model.document import Document

class TagCategory(Document):
class TagLink(Document):
    pass

10  frappe/desk/doctype/tag_link/test_tag_link.py  Normal file
@@ -0,0 +1,10 @@
# -*- coding: utf-8 -*-
# Copyright (c) 2019, Frappe Technologies and Contributors
# See license.txt
from __future__ import unicode_literals

# import frappe
import unittest

class TestTagLink(unittest.TestCase):
    pass
@@ -100,7 +100,8 @@ def get_docinfo(doc=None, doctype=None, name=None):
        "views": get_view_logs(doc.doctype, doc.name),
        "energy_point_logs": get_point_logs(doc.doctype, doc.name),
        "milestones": get_milestones(doc.doctype, doc.name),
        "is_document_followed": is_document_followed(doc.doctype, doc.name, frappe.session.user)
        "is_document_followed": is_document_followed(doc.doctype, doc.name, frappe.session.user),
        "tags": get_tags(doc.doctype, doc.name)
    }

def get_milestones(doctype, name):

@@ -255,3 +256,11 @@ def get_view_logs(doctype, docname):
    if view_logs:
        logs = view_logs
    return logs

def get_tags(doctype, name):
    tags = [tag.tag for tag in frappe.get_all("Tag Link", filters={
        "document_type": doctype,
        "document_name": name
    }, fields=["tag"])]

    return ",".join([tag for tag in tags])
@@ -31,7 +31,14 @@ def runserverobj(method, docs=None, dt=None, dn=None, arg=None, args=None):
    except ValueError:
        args = args

    fnargs, varargs, varkw, defaults = inspect.getargspec(getattr(doc, method))
    try:
        fnargs, varargs, varkw, defaults = inspect.getargspec(getattr(doc, method))
    except ValueError:
        fnargs = inspect.getfullargspec(getattr(doc, method)).args
        varargs = inspect.getfullargspec(getattr(doc, method)).varargs
        varkw = inspect.getfullargspec(getattr(doc, method)).varkw
        defaults = inspect.getfullargspec(getattr(doc, method)).defaults

    if not fnargs or (len(fnargs)==1 and fnargs[0]=="self"):
        r = doc.run_method(method)
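The try/except added above works around `inspect.getargspec` raising `ValueError` on Python 3 for functions with keyword-only arguments; `inspect.getfullargspec` handles them. A compact sketch of the same fallback (this version also catches `AttributeError`, an addition for Python 3.11+ where `getargspec` was removed entirely):

```python
import inspect

def get_signature_parts(func):
    # getargspec raises ValueError for keyword-only args on Python 3,
    # and is gone altogether (AttributeError) from Python 3.11 onward.
    try:
        spec = inspect.getargspec(func)
    except (ValueError, AttributeError):
        spec = inspect.getfullargspec(func)
    # both spec types expose .args, .varargs and .defaults
    return spec.args, spec.varargs, spec.defaults

def example(self, a, b=1, *rest, flag=False):
    # `flag` is keyword-only, which is what trips up getargspec
    pass
```

`get_signature_parts(example)` yields `(['self', 'a', 'b'], 'rest', (1,))` either way, which is all `runserverobj` needs to decide how to invoke the method.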
42  frappe/desk/leaderboard.py  Normal file
@@ -0,0 +1,42 @@

from __future__ import unicode_literals, print_function
import frappe
from frappe.utils import get_fullname

def get_leaderboards():
    leaderboards = {
        'User': {
            'fields': ['points'],
            'method': 'frappe.desk.leaderboard.get_energy_point_leaderboard',
            'company_disabled': 1
        }
    }
    return leaderboards

@frappe.whitelist()
def get_energy_point_leaderboard(from_date, company = None, field = None, limit = None):
    energy_point_users = frappe.db.get_all('Energy Point Log',
        fields = ['user as name', 'sum(points) as value'],
        filters = [
            ['type', '!=', 'Review'],
            ['creation', '>', from_date]
        ],
        group_by = 'user',
        order_by = 'value desc'
    )
    all_users = frappe.db.get_all('User',
        filters = {'name': ['not in', ['Administrator', 'Guest']]},
        order_by = 'name ASC')

    all_users_list = list(map(lambda x: x['name'], all_users))
    energy_point_users_list = list(map(lambda x: x['name'], energy_point_users))
    for user in all_users_list:
        if user not in energy_point_users_list:
            energy_point_users.append({'name': user, 'value': 0})

    for user in energy_point_users:
        user_id = user['name']
        user['name'] = get_fullname(user['name'])
        user['formatted_name'] = '<a href="#user-profile/{}">{}</a>'.format(user_id, get_fullname(user_id))

    return energy_point_users
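The leaderboard above backfills a zero-score row for every user who has no Energy Point Log entry, so the board always lists all users. The core of that merge can be sketched with plain dicts (sample names are made up):

```python
def fill_missing(scores, all_names):
    """Append {'name': n, 'value': 0} for every name not already on the board."""
    have = {s["name"] for s in scores}          # names that already scored
    return scores + [{"name": n, "value": 0} for n in all_names if n not in have]

board = fill_missing([{"name": "a@x.com", "value": 5}], ["a@x.com", "b@x.com"])
```

Using a set for the membership check keeps this O(n + m); the original's `user not in energy_point_users_list` does the same check against a list, which is O(n * m) but equivalent in result.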
@@ -49,7 +49,7 @@ def get_group_by_count(doctype, current_filters, field):
    return frappe.db.get_list(doctype,
        filters=current_filters,
        group_by=field,
        fields=['count(*) as count', field + ' as name'],
        fields=['count(*) as count', '`{}` as name'.format(field)],
        order_by='count desc',
        limit=50,
    )
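The one-line change above wraps the group-by field in backticks so that column names which collide with SQL keywords (or contain unusual characters) still parse when interpolated into the field list. The string formatting itself, isolated:

```python
field = "order"  # a column name that is also a SQL keyword

# unquoted: `order` would be parsed as the keyword and break the query
old_style = 'count(*) as count, ' + field + ' as name'

# backtick-quoted: treated as an identifier by MySQL/MariaDB
new_style = 'count(*) as count, `{}` as name'.format(field)
```

Backtick quoting is MySQL/MariaDB identifier syntax; it protects against keyword collisions, though it is not by itself an injection defense if `field` comes from user input.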
0  frappe/desk/page/leaderboard/__init__.py  Normal file
49  frappe/desk/page/leaderboard/leaderboard.css  Normal file
@@ -0,0 +1,49 @@
.list-filters {
    overflow-y: hidden;
    padding: 5px
}

.list-filter-item {
    min-width: 150px;
    float: left;
    margin: 5px;
}

.list-item_content {
    flex: 1;
    padding-right: 15px;
    align-items: center;
}

.select-time, .select-doctype, .select-filter, .select-sort {
    background: #f0f4f7;
}

.select-time:focus, .select-doctype:focus, .select-filter:focus, .select-sort:focus {
    background: #f0f4f7;
}

.header-btn-base {
    border: none;
    outline: 0;
    vertical-align: middle;
    overflow: hidden;
    text-decoration: none;
    color: inherit;
    background-color: inherit;
    cursor: pointer;
    white-space: nowrap;
}

.header-btn-round {
    border-radius: 4px;
}

.item-title-bold {
    font-weight: bold;
}

.rank {
    max-width: 75px;
}

376  frappe/desk/page/leaderboard/leaderboard.js  Normal file
@@ -0,0 +1,376 @@
frappe.pages["leaderboard"].on_page_load = (wrapper) => {
    frappe.leaderboard = new Leaderboard(wrapper);

    $(wrapper).bind('show', ()=> {
        // Get which leaderboard to show
        let doctype = frappe.get_route()[1];
        frappe.leaderboard.show_leaderboard(doctype);
    });
};

class Leaderboard {

    constructor(parent) {
        frappe.ui.make_app_page({
            parent: parent,
            title: "Leaderboard",
            single_column: false
        });
        this.parent = parent;
        this.page = this.parent.page;
        this.page.sidebar.html(`<ul class="module-sidebar-nav overlay-sidebar nav nav-pills nav-stacked"></ul>`);
        this.$sidebar_list = this.page.sidebar.find('ul');

        this.get_leaderboard_config();

    }

    get_leaderboard_config() {
        this.doctypes = [];
        this.filters = {};
        this.leaderboard_limit = 20;

        frappe.xcall("frappe.desk.page.leaderboard.leaderboard.get_leaderboard_config").then(config => {
            this.leaderboard_config = config;
            for (let doctype in this.leaderboard_config) {
                this.doctypes.push(doctype);
                this.filters[doctype] = this.leaderboard_config[doctype].fields.map(field => {
                    if (typeof field ==='object') {
                        return field.label || field.fieldname;
                    }
                    return field;
                });
            }
            this.timespans = ["Week", "Month", "Quarter", "Year", "All Time"];

            // for saving current selected filters
            const _initial_doctype = frappe.get_route()[1] || this.doctypes[0];
            const _initial_timespan = this.timespans[0];
            const _initial_filter = this.filters[_initial_doctype];

            this.options = {
                selected_doctype: _initial_doctype,
                selected_filter: _initial_filter,
                selected_filter_item: _initial_filter[0],
                selected_timespan: _initial_timespan,
            };

            this.message = null;
            this.make();
        });
    }

    make() {

        this.$container = $(`<div class="leaderboard page-main-content">
            <div class="leaderboard-graph"></div>
            <div class="leaderboard-list"></div>
        </div>`).appendTo(this.page.main);

        this.$graph_area = this.$container.find(".leaderboard-graph");

        this.doctypes.map(doctype => {
            this.get_sidebar_item(doctype).appendTo(this.$sidebar_list);
        });

        this.setup_leaderboard_fields();

        this.render_selected_doctype();

        this.render_search_box();

        // Get which leaderboard to show
        let doctype = frappe.get_route()[1];
        this.show_leaderboard(doctype);

    }

    setup_leaderboard_fields() {
        this.company_select = this.page.add_field({
            fieldname: "company",
            label: __("Company"),
            fieldtype: "Link",
            options: "Company",
            default: frappe.defaults.get_default("company"),
            reqd: 1,
            change: (e) => {
                this.options.selected_company = e.currentTarget.value;
                this.make_request();
            }
        });

        this.timespan_select = this.page.add_select(__("Timespan"),
            this.timespans.map(d => {
                return {"label": __(d), value: d };
            })
        );

        this.type_select = this.page.add_select(__("Field"),
            this.options.selected_filter.map(d => {
                return {"label": __(frappe.model.unscrub(d)), value: d };
            })
        );

        this.timespan_select.on("change", (e) => {
            this.options.selected_timespan = e.currentTarget.value;
            this.make_request();
        });

        this.type_select.on("change", (e) => {
            this.options.selected_filter_item = e.currentTarget.value;
            this.make_request();
        });
    }

    render_selected_doctype() {

        this.$sidebar_list.on("click", "li", (e)=> {
            let $li = $(e.currentTarget);
            let doctype = $li.find("span").attr("doctype-value");

            this.options.selected_company = frappe.defaults.get_default("company");
            this.options.selected_doctype = doctype;
            this.options.selected_filter = this.filters[doctype];
            this.options.selected_filter_item = this.filters[doctype][0];

            this.type_select.empty().add_options(
                this.options.selected_filter.map(d => {
                    return {"label": __(frappe.model.unscrub(d)), value: d };
                })
            );
            if (this.leaderboard_config[this.options.selected_doctype].company_disabled) {
                $(this.parent).find("[data-original-title=Company]").hide();
            } else {
                $(this.parent).find("[data-original-title=Company]").show();
            }

            this.$sidebar_list.find("li").removeClass("active");
            $li.addClass("active");

            frappe.set_route("leaderboard", this.options.selected_doctype);
            this.make_request();
        });
    }

    render_search_box() {

        this.$search_box =
            $(`<div class="leaderboard-search col-md-3">
                <input type="text" placeholder="Search" class="form-control leaderboard-search-input input-sm">
|
||||
</div>`);
|
||||
|
||||
$(this.parent).find(".page-form").append(this.$search_box);
|
||||
}
|
||||
|
||||
setup_search(list_items) {
|
||||
let $search_input = this.$search_box.find(".leaderboard-search-input");
|
||||
|
||||
this.$search_box.on("keyup", ()=> {
|
||||
let text_filter = $search_input.val().toLowerCase();
|
||||
text_filter = text_filter.replace(/^\s+|\s+$/g, '');
|
||||
for (var i = 0; i < list_items.length; i++) {
|
||||
let text = list_items.eq(i).find(".list-id").text().trim().toLowerCase();
|
||||
|
||||
if (text.includes(text_filter)) {
|
||||
list_items.eq(i).css("display", "");
|
||||
} else {
|
||||
list_items.eq(i).css("display", "none");
|
||||
}
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
show_leaderboard(doctype) {
|
||||
if (this.doctypes.length) {
|
||||
if (this.doctypes.includes(doctype)) {
|
||||
this.options.selected_doctype = doctype;
|
||||
this.$sidebar_list.find(`[doctype-value = "${this.options.selected_doctype}"]`).trigger("click");
|
||||
}
|
||||
|
||||
this.$search_box.find(".leaderboard-search-input").val("");
|
||||
frappe.set_route("leaderboard", this.options.selected_doctype);
|
||||
}
|
||||
}
|
||||
|
||||
make_request() {
|
||||
|
||||
frappe.model.with_doctype(this.options.selected_doctype, ()=> {
|
||||
this.get_leaderboard(this.get_leaderboard_data);
|
||||
});
|
||||
}
|
||||
|
||||
get_leaderboard(notify) {
|
||||
if (!this.options.selected_company) {
|
||||
frappe.throw(__("Please select Company"));
|
||||
}
|
||||
frappe.call(
|
||||
this.leaderboard_config[this.options.selected_doctype].method,
|
||||
{
|
||||
'from_date': this.get_from_date(),
|
||||
'timespan': this.options.selected_timespan,
|
||||
'company': this.options.selected_company,
|
||||
'field': this.options.selected_filter_item,
|
||||
'limit': this.leaderboard_limit,
|
||||
}
|
||||
).then(r => {
|
||||
let results = r.message || [];
|
||||
|
||||
let graph_items = results.slice(0, 10);
|
||||
|
||||
this.$graph_area.show().empty();
|
||||
let args = {
|
||||
data: {
|
||||
datasets: [
|
||||
{
|
||||
values: graph_items.map(d => d.value)
|
||||
}
|
||||
],
|
||||
labels: graph_items.map(d => d.name)
|
||||
},
|
||||
colors: ["light-green"],
|
||||
format_tooltip_x: d => d[this.options.selected_filter_item],
|
||||
type: "bar",
|
||||
height: 140
|
||||
};
|
||||
new frappe.Chart(".leaderboard-graph", args);
|
||||
|
||||
notify(this, r);
|
||||
});
|
||||
}
|
||||
|
||||
get_leaderboard_data(me, res) {
|
||||
if (res && res.message.length) {
|
||||
me.message = null;
|
||||
me.$container.find(".leaderboard-list").html(me.render_list_view(res.message));
|
||||
me.setup_search($(me.parent).find('.list-item-container'));
|
||||
} else {
|
||||
me.$graph_area.hide();
|
||||
me.message = __("No items found.");
|
||||
me.$container.find(".leaderboard-list").html(me.render_list_view());
|
||||
}
|
||||
}
|
||||
|
||||
render_list_view(items = []) {
|
||||
|
||||
var html =
|
||||
`${this.render_message()}
|
||||
<div class="result" style="${this.message ? "display: none;" : ""}">
|
||||
${this.render_result(items)}
|
||||
</div>`;
|
||||
|
||||
return $(html);
|
||||
}
|
||||
|
||||
render_result(items) {
|
||||
|
||||
var html =
|
||||
`${this.render_list_header()}
|
||||
${this.render_list_result(items)}`;
|
||||
return html;
|
||||
}
|
||||
|
||||
render_list_header() {
|
||||
const _selected_filter = this.options.selected_filter
|
||||
.map(i => frappe.model.unscrub(i));
|
||||
const fields = ["rank", "name", this.options.selected_filter_item];
|
||||
const filters = fields.map(filter => {
|
||||
const col = frappe.model.unscrub(filter);
|
||||
return (
|
||||
`<div class="leaderboard-item list-item_content ellipsis text-muted list-item__content--flex-2
|
||||
header-btn-base ${filter}
|
||||
${(col && _selected_filter.indexOf(col) !== -1) ? "text-right" : ""}">
|
||||
<span class="list-col-title ellipsis">
|
||||
${col}
|
||||
</span>
|
||||
</div>`
|
||||
);
|
||||
}).join("");
|
||||
|
||||
const html =
|
||||
`<div class="list-headers">
|
||||
<div class="list-item list-item--head" data-list-renderer="List">${filters}</div>
|
||||
</div>`;
|
||||
return html;
|
||||
}
|
||||
|
||||
render_list_result(items) {
|
||||
|
||||
let _html = items.map((item, index) => {
|
||||
const $value = $(this.get_item_html(item, index+1));
|
||||
const $item_container = $(`<div class="list-item-container">`).append($value);
|
||||
return $item_container[0].outerHTML;
|
||||
}).join("");
|
||||
|
||||
let html =
|
||||
`<div class="result-list">
|
||||
<div class="list-items">
|
||||
${_html}
|
||||
</div>
|
||||
</div>`;
|
||||
|
||||
return html;
|
||||
}
|
||||
|
||||
render_message() {
|
||||
|
||||
let html =
|
||||
`<div class="no-result text-center" style="${this.message ? "" : "display: none;"}">
|
||||
<div class="msg-box no-border">
|
||||
<p>No Item found</p>
|
||||
</div>
|
||||
</div>`;
|
||||
|
||||
return html;
|
||||
}
|
||||
|
||||
get_item_html(item, index) {
|
||||
const fields = this.leaderboard_config[this.options.selected_doctype].fields;
|
||||
const value = frappe.format(item.value, fields.find(field => {
|
||||
let fieldname = field.fieldname || field;
|
||||
return fieldname === this.options.selected_filter_item;
|
||||
}));
|
||||
|
||||
const link = `#Form/${this.options.selected_doctype}/${item.name}`;
|
||||
const name_html = item.formatted_name ?
|
||||
`<span class="text-muted ellipsis">${item.formatted_name}</span>`
|
||||
: `<a class="grey list-id ellipsis" href="${link}"> ${item.name} </a>`;
|
||||
const html =
|
||||
`<div class="list-item">
|
||||
<div class="list-item_content ellipsis list-item__content--flex-2 rank">
|
||||
<span class="text-muted ellipsis">${index}</span>
|
||||
</div>
|
||||
<div class="list-item_content ellipsis list-item__content--flex-2 name">
|
||||
${name_html}
|
||||
</div>
|
||||
<div class="list-item_content ellipsis list-item__content--flex-2 value text-right">
|
||||
<span class="text-muted ellipsis">${value}</span>
|
||||
</div>
|
||||
</div>`;
|
||||
|
||||
return html;
|
||||
}
|
||||
|
||||
get_sidebar_item(item) {
|
||||
return $(`<li class="strong module-sidebar-item">
|
||||
<a class="module-link">
|
||||
<span doctype-value="${item}">${ __(item) }</span></a>
|
||||
</li>`);
|
||||
}
|
||||
|
||||
get_from_date() {
|
||||
let timespan = this.options.selected_timespan.toLowerCase();
|
||||
let current_date = frappe.datetime.now_date();
|
||||
let date = '';
|
||||
if (timespan === "month") {
|
||||
date = frappe.datetime.add_months(current_date, -1);
|
||||
} else if (timespan === "quarter") {
|
||||
date = frappe.datetime.add_months(current_date, -3);
|
||||
} else if (timespan === "year") {
|
||||
date = frappe.datetime.add_months(current_date, -12);
|
||||
} else if (timespan === "week") {
|
||||
date = frappe.datetime.add_days(current_date, -7);
|
||||
}
|
||||
return date;
|
||||
}
|
||||
|
||||
}
|
||||
19
frappe/desk/page/leaderboard/leaderboard.json
Normal file
@@ -0,0 +1,19 @@
{
 "content": null,
 "creation": "2017-06-06 02:54:24.785360",
 "docstatus": 0,
 "doctype": "Page",
 "idx": 0,
 "modified": "2019-09-27 17:44:51.909947",
 "modified_by": "Administrator",
 "module": "Desk",
 "name": "leaderboard",
 "owner": "Administrator",
 "page_name": "leaderboard",
 "roles": [],
 "script": null,
 "standard": "Yes",
 "style": null,
 "system_page": 0,
 "title": "Leaderboard"
}
14
frappe/desk/page/leaderboard/leaderboard.py
Normal file
@@ -0,0 +1,14 @@
# Copyright (c) 2017, Frappe Technologies Pvt. Ltd. and Contributors
# MIT License. See license.txt

from __future__ import unicode_literals, print_function
import frappe

@frappe.whitelist()
def get_leaderboard_config():
	leaderboard_config = frappe._dict()
	leaderboard_hooks = frappe.get_hooks('leaderboards')
	for hook in leaderboard_hooks:
		leaderboard_config.update(frappe.get_attr(hook)())

	return leaderboard_config
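`get_leaderboard_config` above merges the dicts returned by every `leaderboards` hook target. A hedged sketch of what one such hook function might return follows; the doctype name, field names, and method path here are illustrative assumptions, not taken from this commit, but the shape (`fields`, `method`, `company_disabled`) mirrors what `leaderboard.js` reads.

```python
# Illustrative `leaderboards` hook target, as aggregated by
# get_leaderboard_config() above. The doctype name, fields, and
# method path are hypothetical examples, not part of this commit.
def get_leaderboards():
    return {
        "User": {
            # plain strings or {"fieldname": ..., "label": ...} dicts,
            # matching what leaderboard.js accepts for `fields`
            "fields": [
                "energy_points",
                {"fieldname": "review_points", "label": "Review Points"},
            ],
            # whitelisted server method called by get_leaderboard()
            "method": "myapp.leaderboard.get_user_leaderboard",
            # hides the Company filter for this doctype
            "company_disabled": True,
        }
    }
```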
@@ -4,11 +4,12 @@
from __future__ import unicode_literals

import frappe

from frappe import _
from frappe.desk.doctype.global_search_settings.global_search_settings import update_global_search_doctypes

def install():
	update_genders_and_salutations()
	update_global_search_doctypes()

@frappe.whitelist()
def update_genders_and_salutations():
@@ -54,7 +54,7 @@ def setup_complete(args):

	# Setup complete: do not throw an exception, let the user continue to desk
	if cint(frappe.db.get_single_value('System Settings', 'setup_complete')):
		return
		return {'status': 'ok'}

	args = parse_args(args)
@@ -18,6 +18,6 @@
<div class="profile-links">
	<p><a class="edit-profile-link">{%=__("Edit Profile") %}</a></p>
	<p><a class="user-settings-link">{%=__("User Settings") %}</a></p>
	<p><a class="leaderboard-link" href="#social/users">{%=__("Leaderboard") %}</a></p>
	<p><a class="leaderboard-link" href="#leaderboard/User">{%=__("Leaderboard") %}</a></p>
</div>
</div>
@@ -261,13 +261,17 @@ def delete_bulk(doctype, items):
@frappe.whitelist()
@frappe.read_only()
def get_sidebar_stats(stats, doctype, filters=[]):
	cat_tags = frappe.db.sql("""select `tag`.parent as `category`, `tag`.tag_name as `tag`
		from `tabTag Doc Category` as `docCat`
		INNER JOIN `tabTag` as `tag` on `tag`.parent = `docCat`.parent
		where `docCat`.tagdoc=%s
		ORDER BY `tag`.parent asc, `tag`.idx""", doctype, as_dict=1)

	return {"defined_cat": cat_tags, "stats": get_stats(stats, doctype, filters)}
	if not frappe.cache().hget("tags_count", doctype):
		tags = [tag.name for tag in frappe.get_list("Tag")]
		_user_tags = []
		for tag in tags:
			count = frappe.db.count("Tag Link", filters={"document_type": doctype, "tag": tag})
			if count > 0:
				_user_tags.append([tag, count])
		frappe.cache().hset("tags_count", doctype, _user_tags)

	return {"stats": {"_user_tags": frappe.cache().hget("tags_count", doctype)}}

@frappe.whitelist()
@frappe.read_only()
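The rewritten `get_sidebar_stats` above lazily caches per-doctype tag counts. A minimal runnable sketch of that caching pattern follows; a module-level dict stands in for `frappe.cache().hget`/`hset` and `count_links` is a caller-supplied counter, both of which are stand-ins assumed for illustration.

```python
# Stand-in for the lazy tag-count cache in get_sidebar_stats above:
# a module-level dict plays the role of frappe.cache().hget/hset,
# and count_links is a caller-supplied counter (hypothetical).
_tags_count = {}

def get_tag_stats(doctype, tags, count_links):
    if doctype not in _tags_count:
        # compute once per doctype; keep only tags that are in use
        _tags_count[doctype] = [
            [tag, count_links(doctype, tag)]
            for tag in tags
            if count_links(doctype, tag) > 0
        ]
    return {"stats": {"_user_tags": _tags_count[doctype]}}
```

Once populated, later calls return the cached counts without recounting, which is the behaviour the Frappe code gets from Redis.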
@ -1,127 +0,0 @@
|
|||
# Copyright (c) 2015, Frappe Technologies Pvt. Ltd. and Contributors
# MIT License. See license.txt

from __future__ import unicode_literals, print_function
import json
"""
Server side functions for tagging.

	- Tags can be added to any record (doctype, name) in the system.
	- Items are filtered by tags
	- Top tags are shown in the sidebar (?)
	- Tags are also identified by the tag_fields property of the DocType

Discussion:

Tags are shown in the docbrowser and ideally where-ever items are searched.
There should also be statistics available for tags (like top tags etc)

Design:

- free tags (user_tags) are stored in __user_tags
- doctype tags are set in tag_fields property of the doctype
- top tags merges the tags from both the lists (only refreshes once an hour (max))

"""

import frappe
from frappe.utils.global_search import update_global_search

def check_user_tags(dt):
	"if the user does not have a tags column, then it creates one"
	try:
		frappe.db.sql("select `_user_tags` from `tab%s` limit 1" % dt)
	except Exception as e:
		if frappe.db.is_column_missing(e):
			DocTags(dt).setup()

@frappe.whitelist()
def add_tag(tag, dt, dn, color=None):
	"adds a new tag to a record, and creates the Tag master"
	DocTags(dt).add(dn, tag)

	return tag

@frappe.whitelist()
def remove_tag(tag, dt, dn):
	"removes tag from the record"
	DocTags(dt).remove(dn, tag)

@frappe.whitelist()
def get_tagged_docs(doctype, tag):
	frappe.has_permission(doctype, throw=True)

	return frappe.db.sql("""SELECT name
		FROM `tab{0}`
		WHERE _user_tags LIKE '%{1}%'""".format(doctype, tag))

@frappe.whitelist()
def get_tags(doctype, txt, cat_tags):
	tags = json.loads(cat_tags)
	try:
		for _user_tags in frappe.db.sql_list("""select DISTINCT `_user_tags`
			from `tab{0}`
			where _user_tags like {1}
			limit 50""".format(doctype, frappe.db.escape('%' + txt + '%'))):
			tags.extend(_user_tags[1:].split(","))
	except Exception as e:
		if not frappe.db.is_column_missing(e): raise
	return sorted(filter(lambda t: t and txt.lower() in t.lower(), list(set(tags))))

class DocTags:
	"""Tags for a particular doctype"""
	def __init__(self, dt):
		self.dt = dt

	def get_tag_fields(self):
		"""returns tag_fields property"""
		return frappe.db.get_value('DocType', self.dt, 'tag_fields')

	def get_tags(self, dn):
		"""returns tag for a particular item"""
		return (frappe.db.get_value(self.dt, dn, '_user_tags', ignore=1) or '').strip()

	def add(self, dn, tag):
		"""add a new user tag"""
		tl = self.get_tags(dn).split(',')
		if not tag in tl:
			tl.append(tag)
			self.update(dn, tl)

	def remove(self, dn, tag):
		"""remove a user tag"""
		tl = self.get_tags(dn).split(',')
		self.update(dn, filter(lambda x: x.lower() != tag.lower(), tl))

	def remove_all(self, dn):
		"""remove all user tags (call before delete)"""
		self.update(dn, [])

	def update(self, dn, tl):
		"""updates the _user_tag column in the table"""

		if not tl:
			tags = ''
		else:
			tl = list(set(filter(lambda x: x, tl)))
			tags = ',' + ','.join(tl)
		try:
			frappe.db.sql("update `tab%s` set _user_tags=%s where name=%s" % \
				(self.dt, '%s', '%s'), (tags, dn))
			doc = frappe.get_doc(self.dt, dn)
			update_global_search(doc)
		except Exception as e:
			if frappe.db.is_column_missing(e):
				if not tags:
					# no tags, nothing to do
					return

				self.setup()
				self.update(dn, tl)
			else: raise

	def setup(self):
		"""adds the _user_tags column if not exists"""
		from frappe.database.schema import add_column
		add_column(self.dt, "_user_tags", "Data")
@@ -21,6 +21,7 @@
  "use_imap",
  "email_server",
  "use_ssl",
  "incoming_port",
  "attachment_limit",
  "append_to",
  "default_incoming",
@@ -372,8 +373,8 @@
   "read_only": 1
  },
  {
   "fieldname": "section_break_12",
   "fieldtype": "Section Break"
   "fieldname": "section_break_12",
   "fieldtype": "Section Break"
  },
  {
   "default": "0",
@@ -381,10 +382,17 @@
   "fieldname": "enable_automatic_linking",
   "fieldtype": "Check",
   "label": "Enable Automatic Linking in Documents"
  },
  {
   "depends_on": "eval:!doc.domain && doc.enable_incoming",
   "description": "If non-standard port (e.g. POP3: 995/110, IMAP: 993/143)",
   "fieldname": "incoming_port",
   "fieldtype": "Data",
   "label": "Port"
  }
 ],
 "icon": "fa fa-inbox",
 "modified": "2019-06-15 19:03:55.283524",
 "modified": "2019-08-31 18:01:15.568831",
 "modified_by": "Administrator",
 "module": "Email",
 "name": "Email Account",
@@ -23,7 +23,7 @@ from frappe.utils.background_jobs import enqueue, get_jobs
from frappe.core.doctype.communication.email import set_incoming_outgoing_accounts
from frappe.utils.scheduler import log
from frappe.utils.html_utils import clean_email_html

from frappe.email.utils import get_port

class SentEmailInInbox(Exception): pass
@@ -117,7 +117,7 @@ class EmailAccount(Document):
			fields = [
				"name as domain", "use_imap", "email_server",
				"use_ssl", "smtp_server", "use_tls",
				"smtp_port"
				"smtp_port", "incoming_port"
			]
			return frappe.db.get_value("Email Domain", domain[1], fields, as_dict=True)
		except Exception:
@@ -153,6 +153,7 @@ class EmailAccount(Document):
				"use_imap": self.use_imap,
				"email_sync_rule": email_sync_rule,
				"uid_validity": self.uidvalidity,
				"incoming_port": get_port(self),
				"initial_sync_count": self.initial_sync_count or 100
			})
@@ -1,443 +1,130 @@
{
|
||||
"allow_copy": 0,
|
||||
"allow_import": 0,
|
||||
"allow_rename": 0,
|
||||
"autoname": "field:domain_name",
|
||||
"beta": 0,
|
||||
"creation": "2016-03-29 10:50:48.848239",
|
||||
"custom": 0,
|
||||
"docstatus": 0,
|
||||
"doctype": "DocType",
|
||||
"document_type": "Setup",
|
||||
"editable_grid": 0,
|
||||
"autoname": "field:domain_name",
|
||||
"creation": "2016-03-29 10:50:48.848239",
|
||||
"doctype": "DocType",
|
||||
"document_type": "Setup",
|
||||
"engine": "InnoDB",
|
||||
"field_order": [
|
||||
"email_settings",
|
||||
"domain_name",
|
||||
"email_id",
|
||||
"mailbox_settings",
|
||||
"email_server",
|
||||
"use_imap",
|
||||
"use_ssl",
|
||||
"incoming_port",
|
||||
"attachment_limit",
|
||||
"append_to",
|
||||
"outgoing_mail_settings",
|
||||
"smtp_server",
|
||||
"use_tls",
|
||||
"smtp_port"
|
||||
],
|
||||
"fields": [
|
||||
{
|
||||
"allow_on_submit": 0,
|
||||
"bold": 0,
|
||||
"collapsible": 0,
|
||||
"columns": 0,
|
||||
"fieldname": "email_settings",
|
||||
"fieldtype": "Section Break",
|
||||
"hidden": 0,
|
||||
"ignore_user_permissions": 0,
|
||||
"ignore_xss_filter": 0,
|
||||
"in_filter": 0,
|
||||
"in_list_view": 0,
|
||||
"in_standard_filter": 0,
|
||||
"label": "",
|
||||
"length": 0,
|
||||
"no_copy": 0,
|
||||
"permlevel": 0,
|
||||
"precision": "",
|
||||
"print_hide": 0,
|
||||
"print_hide_if_no_value": 0,
|
||||
"read_only": 0,
|
||||
"remember_last_selected_value": 0,
|
||||
"report_hide": 0,
|
||||
"reqd": 0,
|
||||
"search_index": 0,
|
||||
"set_only_once": 0,
|
||||
"fieldname": "email_settings",
|
||||
"fieldtype": "Section Break"
|
||||
},
|
||||
{
|
||||
"fieldname": "domain_name",
|
||||
"fieldtype": "Data",
|
||||
"label": "domain name",
|
||||
"read_only": 1,
|
||||
"unique": 0
|
||||
},
|
||||
{
|
||||
"allow_on_submit": 0,
|
||||
"bold": 0,
|
||||
"collapsible": 0,
|
||||
"columns": 0,
|
||||
"description": "",
|
||||
"fieldname": "domain_name",
|
||||
"fieldtype": "Data",
|
||||
"hidden": 0,
|
||||
"ignore_user_permissions": 0,
|
||||
"ignore_xss_filter": 0,
|
||||
"in_filter": 0,
|
||||
"in_list_view": 0,
|
||||
"in_standard_filter": 0,
|
||||
"label": "domain name",
|
||||
"length": 0,
|
||||
"no_copy": 0,
|
||||
"permlevel": 0,
|
||||
"precision": "",
|
||||
"print_hide": 0,
|
||||
"print_hide_if_no_value": 0,
|
||||
"read_only": 1,
|
||||
"remember_last_selected_value": 0,
|
||||
"report_hide": 0,
|
||||
"reqd": 0,
|
||||
"search_index": 0,
|
||||
"set_only_once": 0,
|
||||
"unique": 0
|
||||
},
|
||||
"fieldname": "email_id",
|
||||
"fieldtype": "Data",
|
||||
"label": "Example Email Address",
|
||||
"options": "Email",
|
||||
"reqd": 1
|
||||
},
|
||||
{
|
||||
"allow_on_submit": 0,
|
||||
"bold": 0,
|
||||
"collapsible": 0,
|
||||
"columns": 0,
|
||||
"description": "",
|
||||
"fieldname": "email_id",
|
||||
"fieldtype": "Data",
|
||||
"hidden": 0,
|
||||
"ignore_user_permissions": 0,
|
||||
"ignore_xss_filter": 0,
|
||||
"in_filter": 0,
|
||||
"in_list_view": 0,
|
||||
"in_standard_filter": 0,
|
||||
"label": "Example Email Address",
|
||||
"length": 0,
|
||||
"no_copy": 0,
|
||||
"options": "Email",
|
||||
"permlevel": 0,
|
||||
"precision": "",
|
||||
"print_hide": 0,
|
||||
"print_hide_if_no_value": 0,
|
||||
"read_only": 0,
|
||||
"remember_last_selected_value": 0,
|
||||
"report_hide": 0,
|
||||
"reqd": 1,
|
||||
"search_index": 0,
|
||||
"set_only_once": 0,
|
||||
"unique": 0
|
||||
},
|
||||
"fieldname": "mailbox_settings",
|
||||
"fieldtype": "Section Break"
|
||||
},
|
||||
{
|
||||
"allow_on_submit": 0,
|
||||
"bold": 0,
|
||||
"collapsible": 0,
|
||||
"columns": 0,
|
||||
"fieldname": "mailbox_settings",
|
||||
"fieldtype": "Section Break",
|
||||
"hidden": 0,
|
||||
"ignore_user_permissions": 0,
|
||||
"ignore_xss_filter": 0,
|
||||
"in_filter": 0,
|
||||
"in_list_view": 0,
|
||||
"in_standard_filter": 0,
|
||||
"label": "",
|
||||
"length": 0,
|
||||
"no_copy": 0,
|
||||
"permlevel": 0,
|
||||
"precision": "",
|
||||
"print_hide": 0,
|
||||
"print_hide_if_no_value": 0,
|
||||
"read_only": 0,
|
||||
"remember_last_selected_value": 0,
|
||||
"report_hide": 0,
|
||||
"reqd": 0,
|
||||
"search_index": 0,
|
||||
"set_only_once": 0,
|
||||
"unique": 0
|
||||
},
|
||||
"description": "e.g. pop.gmail.com / imap.gmail.com",
|
||||
"fieldname": "email_server",
|
||||
"fieldtype": "Data",
|
||||
"label": "Email Server",
|
||||
"reqd": 1
|
||||
},
|
||||
{
|
||||
"allow_on_submit": 0,
|
||||
"bold": 0,
|
||||
"collapsible": 0,
|
||||
"columns": 0,
|
||||
"depends_on": "",
|
||||
"description": "e.g. pop.gmail.com / imap.gmail.com",
|
||||
"fieldname": "email_server",
|
||||
"fieldtype": "Data",
|
||||
"hidden": 0,
|
||||
"ignore_user_permissions": 0,
|
||||
"ignore_xss_filter": 0,
|
||||
"in_filter": 0,
|
||||
"in_list_view": 0,
|
||||
"in_standard_filter": 0,
|
||||
"label": "Email Server",
|
||||
"length": 0,
|
||||
"no_copy": 0,
|
||||
"permlevel": 0,
|
||||
"precision": "",
|
||||
"print_hide": 0,
|
||||
"print_hide_if_no_value": 0,
|
||||
"read_only": 0,
|
||||
"remember_last_selected_value": 0,
|
||||
"report_hide": 0,
|
||||
"reqd": 1,
|
||||
"search_index": 0,
|
||||
"set_only_once": 0,
|
||||
"unique": 0
|
||||
},
|
||||
"default": "0",
|
||||
"fieldname": "use_imap",
|
||||
"fieldtype": "Check",
|
||||
"label": "Use IMAP"
|
||||
},
|
||||
{
|
||||
"allow_on_submit": 0,
|
||||
"bold": 0,
|
||||
"collapsible": 0,
|
||||
"columns": 0,
|
||||
"depends_on": "",
|
||||
"fieldname": "use_imap",
|
||||
"fieldtype": "Check",
|
||||
"hidden": 0,
|
||||
"ignore_user_permissions": 0,
|
||||
"ignore_xss_filter": 0,
|
||||
"in_filter": 0,
|
||||
"in_list_view": 0,
|
||||
"in_standard_filter": 0,
|
||||
"label": "Use IMAP",
|
||||
"length": 0,
|
||||
"no_copy": 0,
|
||||
"permlevel": 0,
|
||||
"precision": "",
|
||||
"print_hide": 0,
|
||||
"print_hide_if_no_value": 0,
|
||||
"read_only": 0,
|
||||
"remember_last_selected_value": 0,
|
||||
"report_hide": 0,
|
||||
"reqd": 0,
|
||||
"search_index": 0,
|
||||
"set_only_once": 0,
|
||||
"unique": 0
|
||||
},
|
||||
"default": "0",
|
||||
"fieldname": "use_ssl",
|
||||
"fieldtype": "Check",
|
||||
"label": "Use SSL"
|
||||
},
|
||||
{
|
||||
"allow_on_submit": 0,
|
||||
"bold": 0,
|
||||
"collapsible": 0,
|
||||
"columns": 0,
|
||||
"depends_on": "",
|
||||
"fieldname": "use_ssl",
|
||||
"fieldtype": "Check",
|
||||
"hidden": 0,
|
||||
"ignore_user_permissions": 0,
|
||||
"ignore_xss_filter": 0,
|
||||
"in_filter": 0,
|
||||
"in_list_view": 0,
|
||||
"in_standard_filter": 0,
|
||||
"label": "Use SSL",
|
||||
"length": 0,
|
||||
"no_copy": 0,
|
||||
"permlevel": 0,
|
||||
"precision": "",
|
||||
"print_hide": 0,
|
||||
"print_hide_if_no_value": 0,
|
||||
"read_only": 0,
|
||||
"remember_last_selected_value": 0,
|
||||
"report_hide": 0,
|
||||
"reqd": 0,
|
||||
"search_index": 0,
|
||||
"set_only_once": 0,
|
||||
"unique": 0
|
||||
},
|
||||
"default": "1",
|
||||
"description": "Ignore attachments over this size",
|
||||
"fieldname": "attachment_limit",
|
||||
"fieldtype": "Int",
|
||||
"label": "Attachment Limit (MB)"
|
||||
},
|
||||
{
|
||||
"allow_on_submit": 0,
|
||||
"bold": 0,
|
||||
"collapsible": 0,
|
||||
"columns": 0,
|
||||
"default": "1",
|
||||
"depends_on": "",
|
||||
"description": "Ignore attachments over this size",
|
||||
"fieldname": "attachment_limit",
|
||||
"fieldtype": "Int",
|
||||
"hidden": 0,
|
||||
"ignore_user_permissions": 0,
|
||||
"ignore_xss_filter": 0,
|
||||
"in_filter": 0,
|
||||
"in_list_view": 0,
|
||||
"in_standard_filter": 0,
|
||||
"label": "Attachment Limit (MB)",
|
||||
"length": 0,
|
||||
"no_copy": 0,
|
||||
"permlevel": 0,
|
||||
"precision": "",
|
||||
"print_hide": 0,
|
||||
"print_hide_if_no_value": 0,
|
||||
"read_only": 0,
|
||||
"remember_last_selected_value": 0,
|
||||
"report_hide": 0,
|
||||
"reqd": 0,
|
||||
"search_index": 0,
|
||||
"set_only_once": 0,
|
||||
"unique": 0
|
||||
},
|
||||
"description": "Append as communication against this DocType (must have fields, \"Status\", \"Subject\")",
|
||||
"fieldname": "append_to",
|
||||
"fieldtype": "Link",
|
||||
"hidden": 1,
|
||||
"in_list_view": 1,
|
||||
"label": "Append To",
|
||||
"options": "DocType"
|
||||
},
|
||||
{
|
||||
"allow_on_submit": 0,
|
||||
"bold": 0,
|
||||
"collapsible": 0,
|
||||
"columns": 0,
|
||||
"depends_on": "",
|
||||
"description": "Append as communication against this DocType (must have fields, \"Status\", \"Subject\")",
|
||||
"fieldname": "append_to",
|
||||
"fieldtype": "Link",
|
||||
"hidden": 1,
|
||||
"ignore_user_permissions": 0,
|
||||
"ignore_xss_filter": 0,
|
||||
"in_filter": 0,
|
||||
"in_list_view": 1,
|
||||
"in_standard_filter": 0,
|
||||
"label": "Append To",
|
||||
"length": 0,
|
||||
"no_copy": 0,
|
||||
"options": "DocType",
|
||||
"permlevel": 0,
|
||||
"precision": "",
|
||||
"print_hide": 0,
|
||||
"print_hide_if_no_value": 0,
|
||||
"read_only": 0,
|
||||
"remember_last_selected_value": 0,
|
||||
"report_hide": 0,
|
||||
"reqd": 0,
|
||||
"search_index": 0,
|
||||
"set_only_once": 0,
|
||||
"unique": 0
|
||||
},
|
||||
"fieldname": "outgoing_mail_settings",
|
||||
"fieldtype": "Section Break"
|
||||
},
|
||||
{
|
||||
"allow_on_submit": 0,
|
||||
"bold": 0,
|
||||
"collapsible": 0,
|
||||
"columns": 0,
|
||||
"fieldname": "outgoing_mail_settings",
|
||||
"fieldtype": "Section Break",
|
||||
"hidden": 0,
|
||||
"ignore_user_permissions": 0,
|
||||
"ignore_xss_filter": 0,
|
||||
"in_filter": 0,
|
||||
"in_list_view": 0,
|
||||
"in_standard_filter": 0,
|
||||
"label": "",
|
||||
"length": 0,
|
||||
"no_copy": 0,
|
||||
"permlevel": 0,
|
||||
"precision": "",
|
||||
"print_hide": 0,
|
||||
"print_hide_if_no_value": 0,
|
||||
"read_only": 0,
|
||||
"remember_last_selected_value": 0,
|
||||
"report_hide": 0,
|
||||
"reqd": 0,
|
||||
"search_index": 0,
|
||||
"set_only_once": 0,
|
||||
"unique": 0
|
||||
},
|
||||
"description": "e.g. smtp.gmail.com",
|
||||
"fieldname": "smtp_server",
|
||||
"fieldtype": "Data",
|
||||
"label": "SMTP Server",
|
||||
"reqd": 1
|
||||
},
|
||||
{
|
||||
"allow_on_submit": 0,
|
||||
"bold": 0,
|
||||
"collapsible": 0,
|
||||
"columns": 0,
|
||||
"depends_on": "",
|
||||
"description": "e.g. smtp.gmail.com",
|
||||
"fieldname": "smtp_server",
|
||||
"fieldtype": "Data",
|
||||
"hidden": 0,
|
||||
"ignore_user_permissions": 0,
|
||||
"ignore_xss_filter": 0,
|
||||
"in_filter": 0,
|
||||
"in_list_view": 0,
|
||||
"in_standard_filter": 0,
|
||||
"label": "SMTP Server",
|
||||
"length": 0,
|
||||
"no_copy": 0,
|
||||
"permlevel": 0,
|
||||
"precision": "",
|
||||
"print_hide": 0,
|
||||
"print_hide_if_no_value": 0,
|
||||
"read_only": 0,
|
||||
"remember_last_selected_value": 0,
|
||||
"report_hide": 0,
|
||||
"reqd": 1,
|
||||
"search_index": 0,
|
||||
"set_only_once": 0,
|
||||
"unique": 0
|
||||
},
|
||||
"default": "0",
|
||||
"fieldname": "use_tls",
|
||||
"fieldtype": "Check",
|
||||
"label": "Use TLS"
|
||||
},
|
||||
{
|
||||
"allow_on_submit": 0,
|
||||
"bold": 0,
|
||||
"collapsible": 0,
|
||||
"columns": 0,
|
||||
"depends_on": "",
|
||||
"fieldname": "use_tls",
|
||||
"fieldtype": "Check",
|
||||
"hidden": 0,
|
||||
"ignore_user_permissions": 0,
|
||||
"ignore_xss_filter": 0,
|
||||
"in_filter": 0,
|
||||
"in_list_view": 0,
|
||||
"in_standard_filter": 0,
|
||||
"label": "Use TLS",
|
||||
"length": 0,
|
||||
"no_copy": 0,
|
||||
"permlevel": 0,
|
||||
"precision": "",
|
||||
"print_hide": 0,
|
||||
"print_hide_if_no_value": 0,
|
||||
"read_only": 0,
|
||||
"remember_last_selected_value": 0,
|
||||
"report_hide": 0,
|
||||
"reqd": 0,
|
||||
"search_index": 0,
|
||||
"set_only_once": 0,
|
||||
"unique": 0
|
||||
},
|
||||
"description": "If non standard port (e.g. 587)",
|
||||
"fieldname": "smtp_port",
|
||||
"fieldtype": "Data",
|
||||
"label": "Port"
|
||||
},
|
||||
{
|
||||
"allow_on_submit": 0,
|
||||
"bold": 0,
|
||||
"collapsible": 0,
|
||||
"columns": 0,
|
||||
"depends_on": "",
|
||||
"description": "If non standard port (e.g. 587)",
|
||||
"fieldname": "smtp_port",
|
||||
"fieldtype": "Data",
|
||||
"hidden": 0,
|
||||
"ignore_user_permissions": 0,
|
||||
"ignore_xss_filter": 0,
|
||||
"in_filter": 0,
|
||||
"in_list_view": 0,
|
||||
"in_standard_filter": 0,
|
||||
"label": "Port",
|
||||
"length": 0,
|
||||
"no_copy": 0,
|
||||
"permlevel": 0,
|
||||
"precision": "",
|
||||
"print_hide": 0,
|
||||
"print_hide_if_no_value": 0,
|
||||
"read_only": 0,
|
||||
"remember_last_selected_value": 0,
|
||||
"report_hide": 0,
|
||||
"reqd": 0,
|
||||
"search_index": 0,
|
||||
"set_only_once": 0,
|
||||
"unique": 0
|
||||
"description": "If non-standard port (e.g. POP3: 995/110, IMAP: 993/143)",
|
||||
"fieldname": "incoming_port",
|
||||
"fieldtype": "Data",
|
||||
"label": "Port"
|
||||
}
|
||||
 ],
 "hide_heading": 0,
 "hide_toolbar": 0,
 "icon": "icon-inbox",
 "idx": 0,
 "image_view": 0,
 "in_create": 0,
 "is_submittable": 0,
 "issingle": 0,
 "istable": 0,
 "max_attachments": 0,
 "modified": "2016-12-23 13:31:58.408528",
 "modified_by": "Administrator",
 "module": "Email",
 "name": "Email Domain",
 "name_case": "",
 "owner": "Administrator",
 ],
 "icon": "icon-inbox",
 "modified": "2019-10-09 17:56:48.834704",
 "modified_by": "Administrator",
 "module": "Email",
 "name": "Email Domain",
 "owner": "Administrator",
 "permissions": [
  {
   "amend": 0,
   "apply_user_permissions": 0,
   "cancel": 0,
   "create": 1,
   "delete": 1,
   "email": 0,
   "export": 0,
   "if_owner": 0,
   "import": 0,
   "is_custom": 0,
   "permlevel": 0,
   "print": 0,
   "read": 1,
   "report": 0,
   "role": "System Manager",
   "set_user_permissions": 1,
   "share": 1,
   "submit": 0,
   "create": 1,
   "delete": 1,
   "read": 1,
   "role": "System Manager",
   "set_user_permissions": 1,
   "share": 1,
   "write": 1
  }
 ],
 "quick_entry": 0,
 "read_only": 0,
 "read_only_onload": 0,
 "sort_field": "modified",
 "sort_order": "DESC",
 "track_seen": 0
}
 ],
 "sort_field": "modified",
 "sort_order": "DESC"
}

@@ -8,13 +8,13 @@ from frappe import _
from frappe.model.document import Document
from frappe.utils import validate_email_address, cint
import imaplib, poplib, smtplib
from frappe.email.utils import get_port

class EmailDomain(Document):
	def autoname(self):
		if self.domain_name:
			self.name = self.domain_name

	def validate(self):
		"""Validate the email id and check that POP3/IMAP and SMTP connections are enabled."""
		if self.email_id:
@@ -27,15 +27,15 @@ class EmailDomain(Document):
		try:
			if self.use_imap:
				if self.use_ssl:
					test = imaplib.IMAP4_SSL(self.email_server)
					test = imaplib.IMAP4_SSL(self.email_server, port=get_port(self))
				else:
					test = imaplib.IMAP4(self.email_server)
					test = imaplib.IMAP4(self.email_server, port=get_port(self))
			else:
				if self.use_ssl:
					test = poplib.POP3_SSL(self.email_server)
					test = poplib.POP3_SSL(self.email_server, port=get_port(self))
				else:
					test = poplib.POP3(self.email_server)
					test = poplib.POP3(self.email_server, port=get_port(self))
		except Exception:
			frappe.throw(_("Incoming email account not correct"))
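The hunk above routes every connection attempt through the new `get_port(self)` helper imported from `frappe.email.utils`, but the helper's body is not shown in this diff. The sketch below is a hypothetical reconstruction, assuming only the field names this diff itself introduces (`incoming_port`, `use_imap`, `use_ssl`); the real implementation may differ.

```python
from types import SimpleNamespace

# Hypothetical sketch of frappe.email.utils.get_port -- its body is not
# shown in this diff. Field names (incoming_port, use_imap, use_ssl)
# come from the Email Domain doctype changes above.
def get_port(doc):
    """Return the explicitly configured incoming port, or the protocol default."""
    port = getattr(doc, "incoming_port", None)
    if port:  # an explicitly configured port wins
        return int(port)
    if getattr(doc, "use_imap", 0):
        return 993 if getattr(doc, "use_ssl", 0) else 143  # IMAP defaults
    return 995 if getattr(doc, "use_ssl", 0) else 110      # POP3 defaults

# Example: an SSL IMAP account with no explicit port falls back to 993.
demo = SimpleNamespace(incoming_port="", use_imap=1, use_ssl=1)
print(get_port(demo))  # 993
```

These defaults match the hint in the `incoming_port` field description ("POP3: 995/110, IMAP: 993/143").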

@@ -78,4 +78,3 @@ class EmailDomain(Document):
			frappe.msgprint(email_account.name)
			frappe.throw(e)
		return None

@@ -143,6 +143,8 @@ def get_context(context):

		attachments = self.get_attachment(doc)
		recipients, cc, bcc = self.get_list_of_recipients(doc, context)
		if not (recipients or cc or bcc):
			return
		sender = None
		if self.sender and self.sender_email:
			sender = formataddr((self.sender, self.sender_email))
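The `formataddr((self.sender, self.sender_email))` call added here builds an RFC 5322 address string from a display name and an email address. A standalone stdlib illustration (nothing Frappe-specific assumed):

```python
from email.utils import formataddr, parseaddr

# formataddr joins a (display_name, email) pair into one address string.
sender = formataddr(("Jane Doe", "jane@example.com"))
print(sender)  # Jane Doe <jane@example.com>

# Display names containing specials are quoted automatically.
quoted = formataddr(("Doe, Jane", "jane@example.com"))
print(quoted)  # "Doe, Jane" <jane@example.com>

# parseaddr is the inverse operation.
assert parseaddr(sender) == ("Jane Doe", "jane@example.com")
```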

@@ -428,6 +428,7 @@ def send_one(email, smtpserver=None, auto_commit=True, now=False, from_test=False):
			smtplib.SMTPConnectError,
			smtplib.SMTPHeloError,
			smtplib.SMTPAuthenticationError,
			smtplib.SMTPRecipientsRefused,
			JobTimeoutException):

		# bad connection/timeout, retry later
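This hunk adds `smtplib.SMTPRecipientsRefused` to the tuple of exceptions that `send_one` treats as transient ("bad connection/timeout, retry later"). The sketch below illustrates that catch-transient-and-retry pattern in isolation; `send_with_retry` is a hypothetical helper for this example, not Frappe's actual queue logic, and `JobTimeoutException` (an RQ-specific class) is omitted to keep it stdlib-only.

```python
import smtplib

# Exception classes treated as retriable, mirroring the tuple in the
# hunk above (minus the RQ-specific JobTimeoutException).
TRANSIENT = (
    smtplib.SMTPConnectError,
    smtplib.SMTPHeloError,
    smtplib.SMTPAuthenticationError,
    smtplib.SMTPRecipientsRefused,
)

def send_with_retry(send, attempts=3):
    """Call send(); on a transient SMTP error, retry up to `attempts` times.

    `send` is a hypothetical zero-argument callable standing in for the
    real delivery step.
    """
    last = None
    for _ in range(attempts):
        try:
            return send()
        except TRANSIENT as exc:
            last = exc  # bad connection/timeout, retry later
    raise last
```

Frappe's real queue does not loop in place like this; it reschedules the email for a later flush, but the set of exceptions that triggers the reschedule is the same.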

Some files were not shown because too many files have changed in this diff.