* [start] postgres
* [wip] started refactoring db_schema
* Add psycopg2 to requirements.txt
* Add support for Postgres SQL - Separate frameworkSQL, database, schema, setup_db file for mariaDB and postgres - WIP
* Remove quotes from sql to make it compatible with postgres as well
* Moved some code from db_schema to database.py
* Move code from db_schema to schema.py - Add other required refactoring
* Add schema changes
* Remove redundant code in file
* Add invalid column name exception class to exceptions.py
* Add backtick in query wherever needed and replace ifnull with coalesce
* Update get_column_description code in database.py file
* Remove a print statement
* Add keys to get on_duplicate query
* Add backtick wherever necessary - Remove db_schema.py file
* Remove DATE_SUB as it is incompatible with postgres - Fix prepare_filter_condition
* Add backtick and quotes wherever necessary - Move get_database_size to frappe.db namespace - Fix some leftover bugs and errors
* Add code to create key and unique index - Added mysql and postgres in their respective database.py
* Add more backticks in queries and fix some errors - Pass keys to on_duplicate_update method - Replace MONTH with EXTRACT function - Remove DATEDIFF and CURDATE usage
* Cast state value to int in toggle_two_factor_auth - since two_factor_auth has the datatype of Int
* Refactor - Replace TIMEDIFF with normal arithmetic operator - Add MAX_COLUMN_LENGTH - Remove redundant code - Add regexp character constant - Move create_help_table to database.py - Add get_full_text_search_condition method - Inherit MariaDBTable from DBTable
* Replace Database instance with get_db method
* Move db_manager to separate file
* Refactor - Remove some unwanted code - Separate alter table code for postgres and mysql - Replace data_type with column_type in database.py
* Make fulltext search changes in global_search.py
* Add empty string check
* Add root_password to site config
* Create cli command for postgres console
* Move setup of help database to setup_db.py
* Add get_database_list method
* Fix exception handling - Replace bad_field handler with missing_column handler
* Fix tests and sql queries
* Fix import error
* Fix typo db -> database
* Fix error with make_table in help.py
* Try test for postgres
* Remove python 2.7 version to try postgres travis test
* Add test fixes
* Add db_type to the config of test_site_postgres
* Enable query debug to check the reason for travis failure
* Add backticks to check if the test passes
* Update travis.yml - Add postgres addon
* Try appending 'd_' to hash for db_name - since postgres does not support a dbname starting with a number
* Try adding db_type for global help to make travis work
* Add print statements to debug travis failure
* Enable transaction and remove debug flag
* Fix help table creation query (postgres)
* Fix import issue
* Add some checks to prevent errors - Some doctypes used to get called even before they were created
* Try fixes
* Update travis config
* Fix create index for help table
* Remove unused code
* Fix queries and update travis config
* Fix ifnull replace logic (regex)
* Add query fixes and code cleanup
* Fix typo - get_column_description -> get_table_columns_description
* Fix tests - Replace double quotes in query with single quotes
* Replace psycopg2 with psycopg2-binary to avoid warnings - http://initd.org/psycopg/docs/install.html#binary-install-from-pypi
* Add multisql api
* Add a few multisql queries
* Remove print statements
* Remove get_fulltext_search_condition method and replace with multi query
* Remove text slicing in create user
* Set default for 'values' argument in multisql
* Fix incorrect queries and remove a few debug flags - Fix multisql bug
* Force delete user to fix test - Fix import error - Fix incorrect query
* Fix query builder bug
* Fix bad query
* Fix query (minor)
* Convert boolean text to int since is_private has datatype of int - Some query changes, e.g. replaced double quotes with an interpolated string to pass multiple values in one of the queries
* Extend database class from object to support python 2
* Fix query - Add quotes around value passed to the query for variable comparison
* Try setting host_name for each test site - To avoid "RemoteDisconnected" error while running the data migration test - Update travis.yml to add hosts - Remove unwanted commit in setup_help_database
* Set site hostname in data migration connector (in test file) - To connect to the same site host
* Fix duplicate entry issue - the problem is in the naming series file. In previous commits I unknowingly changed part of a series query, due to which series were not getting reset
* Replace a few sql queries with orm methods
* Fix codacy
* Fix 'Doctype Sessions not found' issue
* Fix bugs induced during codacy fixes
* Fix Notification Test - Use ORM instead of raw sql
* Set Date fallback value to 0001-01-01 - 0000-00-00 is an invalid date in Postgres - 0001-01-01 works in both
* Fix date filter method
* Replace double quotes with single quotes for literal value
* Remove print statement
* Replace double quotes with single
* Fix tests - Replace a few raw sql queries with ORM
* Separate query for postgres - update_fields_to_fetch_query
* Fix tests - Replace locate with strpos for postgres
* Fix tests - Skip test for datediff - Convert bytes to str in escape method
* Remove TestBot
* Skip fieldname extraction
* Replace docshare raw sql with ORM
* Fix typo
* Fix ancestor query test
* Fix test data migration
* Remove hardcoded hostname
* Add default option and option list for db_type
* Remove frappe.async module
* Remove a debug flag from test
* Fix codacy
* Fix import issue
* Convert classmethod to static method
* Convert a few instance methods to static methods
* Remove some unused imports
* Fix codacy - Add exception type - Replace a few instance methods with static methods - Remove unused import
* Fix codacy
* Remove unused code
* Remove some unused code - Convert some instance methods to static functions
* Fix an issue with query modification
* Fix add_index query
* Fix query
* Fix update_auth patch
* Fix an issue with exception handling
* Add try-catch to a reload_doc
* Add try-catch to file_manager_hook patch
* Import update_gravatar in set_user_gravatar patch
* Undo all the wrong patch fixes
* Fix db_setup code 😪 - previously it was not restoring the db from the source SQL, which is why a few old patches were breaking (because they were getting a different schema structure)
* Fix typo !
* Fix exception (is_missing_column) handling
* Add deleted code back - This code is only used in an erpnext patch. Can be moved to that patch file
* Fix codacy
* Replace a mariadb-specific function in a query used in validate_series
* Remove a debug flag
* Revert changes (rename_parent_and_child)
* Fix validate_one_root method
* Fix date format issue
* Fix codacy - Disable a pylint warning for variable arguments - Convert an instance method to static method
* Add bandit.yml - Codacy seems to use Bandit, which generates a warning for every subprocess import and usage during pytest. Since we have used subprocess carefully (avoided user input), the warnings need to be suppressed. This can be removed if we find an alternative to subprocess.
* Skip start_process_with_partial_path check
* Fix typo
* Add python 2.7 test
* Move python versions in travis.yml
* Add python versions to jobs
* Overwrite python version inheritance for postgres in travis.yml
* Add quotes around python version in .travis.yml
* Add quotes around the name of the job
* Try a travis fix
* Try .travis.yml fix
* Import missing subprocess
* Refactor travis.yml
* Refactor travis.yml - Move install and test commands to separate files - Use matrix to build combinations of python version and db type
* Make install.sh and run-tests.sh executable
* Add sudo required to travis.yml to allow sudo commands in shell files
* Load nvm
* Remove verbose flag from scripts
* Remove command-trace-print flag
* Change to build dir in before script
* Add absolute path for scripts
* Fix tests
* Fix typo
* Fix codacy - Fixes "echo won't expand escape sequences." warning
* Append underscore (_) instead of 'd' for db_name
* Remove printf and use mysql execute flag
309 lines
No EOL
9.2 KiB
Python
from __future__ import unicode_literals

import re

import frappe
import psycopg2
import psycopg2.extensions
from six import string_types
from frappe.utils import cstr
from psycopg2.extensions import ISOLATION_LEVEL_AUTOCOMMIT

from frappe.database.database import Database
from frappe.database.postgres.schema import PostgresTable

# cast decimals as floats
DEC2FLOAT = psycopg2.extensions.new_type(
	psycopg2.extensions.DECIMAL.values,
	'DEC2FLOAT',
	lambda value, curs: float(value) if value is not None else None)

psycopg2.extensions.register_type(DEC2FLOAT)

class PostgresDatabase(Database):
	ProgrammingError = psycopg2.ProgrammingError
	OperationalError = psycopg2.OperationalError
	InternalError = psycopg2.InternalError
	SQLError = psycopg2.ProgrammingError
	DataError = psycopg2.DataError
	InterfaceError = psycopg2.InterfaceError
	REGEX_CHARACTER = '~'
	def setup_type_map(self):
		self.type_map = {
			'Currency': ('decimal', '18,6'),
			'Int': ('bigint', None),
			'Long Int': ('bigint', None), # convert int to bigint if length is more than 11
			'Float': ('decimal', '18,6'),
			'Percent': ('decimal', '18,6'),
			'Check': ('smallint', None),
			'Small Text': ('text', ''),
			'Long Text': ('text', ''),
			'Code': ('text', ''),
			'Text Editor': ('text', ''),
			'Date': ('date', ''),
			'Datetime': ('timestamp', None),
			'Time': ('time', '6'),
			'Text': ('text', ''),
			'Data': ('varchar', self.VARCHAR_LEN),
			'Link': ('varchar', self.VARCHAR_LEN),
			'Dynamic Link': ('varchar', self.VARCHAR_LEN),
			'Password': ('varchar', self.VARCHAR_LEN),
			'Select': ('varchar', self.VARCHAR_LEN),
			'Read Only': ('varchar', self.VARCHAR_LEN),
			'Attach': ('text', ''),
			'Attach Image': ('text', ''),
			'Signature': ('text', ''),
			'Color': ('varchar', self.VARCHAR_LEN),
			'Barcode': ('text', ''),
			'Geolocation': ('text', '')
		}
	def get_connection(self):
		# warnings.filterwarnings('ignore', category=psycopg2.Warning)
		conn = psycopg2.connect('host={} dbname={}'.format(self.host, self.user))
		conn.set_isolation_level(ISOLATION_LEVEL_AUTOCOMMIT) # TODO: Remove this
		# conn = psycopg2.connect('host={} dbname={} user={} password={}'.format(self.host,
		# 	self.user, self.user, self.password))

		return conn

	def escape(self, s, percent=True):
		"""Escape quotes and percent in given string."""
		if isinstance(s, bytes):
			s = s.decode('utf-8')

		if percent:
			s = s.replace("%", "%%")

		s = s.encode('utf-8')

		return str(psycopg2.extensions.QuotedString(s))
	def get_database_size(self):
		'''Returns database size in MB'''
		db_size = self.sql("SELECT (pg_database_size(%s) / 1024 / 1024) as database_size",
			self.db_name, as_dict=True)
		return db_size[0].get('database_size')

	# pylint: disable=W0221
	def sql(self, *args, **kwargs):
		if len(args):
			# since tuple is immutable
			args = list(args)
			args[0] = modify_query(args[0])
			args = tuple(args)
		elif kwargs.get('query'):
			kwargs['query'] = modify_query(kwargs.get('query'))

		return super(PostgresDatabase, self).sql(*args, **kwargs)

	def get_tables(self):
		return [d[0] for d in self.sql("""select table_name
			from information_schema.tables
			where table_catalog='{0}'
				and table_type = 'BASE TABLE'
				and table_schema='public'""".format(frappe.conf.db_name))]

	def format_date(self, date):
		if not date:
			return '0001-01-01::DATE'

		if isinstance(date, string_types):
			if ':' not in date:
				date = date + '::DATE'
		else:
			date = date.strftime('%Y-%m-%d') + '::DATE'

		return date
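	# Usage sketch (illustrative, not part of the original module):
	#   db.format_date(None)                      -> '0001-01-01::DATE'  (0000-00-00 is invalid in Postgres)
	#   db.format_date('2019-01-01')              -> '2019-01-01::DATE'
	#   db.format_date(datetime.date(2019, 1, 1)) -> '2019-01-01::DATE'
	# Note: a string that already contains ':' (e.g. a datetime literal) is returned unchanged.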
	# column type
	@staticmethod
	def is_type_number(code):
		return code == psycopg2.NUMBER

	@staticmethod
	def is_type_datetime(code):
		return code == psycopg2.DATETIME

	# exception type
	@staticmethod
	def is_deadlocked(e):
		return e.pgcode == '40P01'

	@staticmethod
	def is_timedout(e):
		# http://initd.org/psycopg/docs/extensions.html?highlight=datatype#psycopg2.extensions.QueryCanceledError
		return isinstance(e, psycopg2.extensions.QueryCanceledError)

	@staticmethod
	def is_table_missing(e):
		return e.pgcode == '42P01'

	@staticmethod
	def is_missing_column(e):
		return e.pgcode == '42703'

	@staticmethod
	def is_access_denied(e):
		return e.pgcode == '42501'

	@staticmethod
	def cant_drop_field_or_key(e):
		return e.pgcode.startswith('23')

	@staticmethod
	def is_duplicate_entry(e):
		return e.pgcode == '23505'

	@staticmethod
	def is_primary_key_violation(e):
		return e.pgcode == '23505' and '_pkey' in cstr(e.args[0])

	@staticmethod
	def is_unique_key_violation(e):
		return e.pgcode == '23505' and '_key' in cstr(e.args[0])

	@staticmethod
	def is_duplicate_fieldname(e):
		return e.pgcode == '42701'
	def create_auth_table(self):
		self.sql_ddl("""create table if not exists "__Auth" (
				"doctype" VARCHAR(140) NOT NULL,
				"name" VARCHAR(255) NOT NULL,
				"fieldname" VARCHAR(140) NOT NULL,
				"password" VARCHAR(255) NOT NULL,
				"encrypted" INT NOT NULL DEFAULT 0,
				PRIMARY KEY ("doctype", "name", "fieldname")
			)""")

	def create_global_search_table(self):
		if '__global_search' not in self.get_tables():
			self.sql('''create table "__global_search"(
				doctype varchar(100),
				name varchar({0}),
				title varchar({0}),
				content text,
				route varchar({0}),
				published int not null default 0,
				unique (doctype, name))'''.format(self.VARCHAR_LEN))

	def create_user_settings_table(self):
		self.sql_ddl("""create table if not exists "__UserSettings" (
			"user" VARCHAR(180) NOT NULL,
			"doctype" VARCHAR(180) NOT NULL,
			"data" TEXT,
			UNIQUE ("user", "doctype")
			)""")

	def create_help_table(self):
		self.sql('''CREATE TABLE "help"(
				"path" varchar(255),
				"content" text,
				"title" text,
				"intro" text,
				"full_path" text)''')
		self.sql('''CREATE INDEX IF NOT EXISTS "help_index" ON "help" ("path")''')
	def updatedb(self, doctype, meta=None):
		"""
		Syncs a `DocType` to the table
		* creates if required
		* updates columns
		* updates indices
		"""
		res = self.sql("select issingle from `tabDocType` where name='{}'".format(doctype))
		if not res:
			raise Exception('Wrong doctype {0} in updatedb'.format(doctype))

		if not res[0][0]:
			db_table = PostgresTable(doctype, meta)
			db_table.validate()

			self.commit()
			db_table.sync()
			self.begin()

	@staticmethod
	def get_on_duplicate_update(key='name'):
		if isinstance(key, list):
			key = '", "'.join(key)
		return 'ON CONFLICT ("{key}") DO UPDATE SET '.format(
			key=key
		)

	def check_transaction_status(self, query):
		pass
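	# Usage sketch (illustrative): the caller appends the SET assignments.
	#   PostgresDatabase.get_on_duplicate_update(['doctype', 'name'])
	#   -> 'ON CONFLICT ("doctype", "name") DO UPDATE SET '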
	def has_index(self, table_name, index_name):
		return self.sql("""SELECT 1 FROM pg_indexes WHERE tablename='{table_name}'
			and indexname='{index_name}' limit 1""".format(table_name=table_name, index_name=index_name))

	def add_index(self, doctype, fields, index_name=None):
		"""Creates an index with given fields if not already created.
		Index name will be `fieldname1_fieldname2_index`"""
		index_name = index_name or self.get_index_name(fields)
		table_name = 'tab' + doctype

		self.commit()
		self.sql("""CREATE INDEX IF NOT EXISTS "{}" ON `{}`("{}")""".format(index_name, table_name, '", "'.join(fields)))

	def add_unique(self, doctype, fields, constraint_name=None):
		if isinstance(fields, string_types):
			fields = [fields]
		if not constraint_name:
			constraint_name = "unique_" + "_".join(fields)

		if not self.sql("""
			SELECT CONSTRAINT_NAME
			FROM information_schema.TABLE_CONSTRAINTS
			WHERE table_name=%s
			AND constraint_type='UNIQUE'
			AND CONSTRAINT_NAME=%s""",
			('tab' + doctype, constraint_name)):
			self.commit()
			self.sql("""ALTER TABLE `tab%s`
				ADD CONSTRAINT %s UNIQUE (%s)""" % (doctype, constraint_name, ", ".join(fields)))
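	# Usage sketch (illustrative; assumes get_index_name, inherited from the
	# Database base class, yields 'email_index' per the docstring above):
	#   db.add_index('User', ['email'])
	#   runs: CREATE INDEX IF NOT EXISTS "email_index" ON `tabUser`("email")
	#   (the backticks are rewritten to double quotes by modify_query below)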
	def get_table_columns_description(self, table_name):
		"""Returns list of columns with their descriptions"""
		# pylint: disable=W1401
		return self.sql('''
			SELECT a.column_name AS name,
				CASE a.data_type
					WHEN 'character varying' THEN CONCAT('varchar(', a.character_maximum_length ,')')
					WHEN 'timestamp without TIME zone' THEN 'timestamp'
					ELSE a.data_type
				END AS type,
				COUNT(b.indexdef) AS Index,
				COALESCE(a.column_default, NULL) AS default,
				BOOL_OR(b.unique) AS unique
			FROM information_schema.columns a
			LEFT JOIN
				(SELECT indexdef, tablename, indexdef LIKE '%UNIQUE INDEX%' AS unique
					FROM pg_indexes
					WHERE tablename='{table_name}') b
				ON SUBSTRING(b.indexdef, '\(.*\)') LIKE CONCAT('%', a.column_name, '%')
			WHERE a.table_name = '{table_name}'
			GROUP BY a.column_name, a.data_type, a.column_default, a.character_maximum_length;'''
			.format(table_name=table_name), as_dict=1)

	def get_database_list(self, target):
		# `target` is unused here; kept to match the Database interface
		return [d[0] for d in self.sql("SELECT datname FROM pg_database;")]
def modify_query(query):
	"""Modifies query according to the requirements of postgres"""
	# replace ` with " for definitions
	query = query.replace('`', '"')
	query = replace_locate_with_strpos(query)
	# select from requires ""
	if re.search('from tab', query, flags=re.IGNORECASE):
		query = re.sub('from tab([a-zA-Z]*)', r'from "tab\1"', query, flags=re.IGNORECASE)

	return query

def replace_locate_with_strpos(query):
	# strpos is the locate equivalent in postgres
	if re.search(r'locate\(', query, flags=re.IGNORECASE):
		query = re.sub(r'locate\(([^,]+),([^)]+)\)', r'strpos(\2, \1)', query, flags=re.IGNORECASE)
	return query
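
# Usage sketch (illustrative; hypothetical query):
#   modify_query("select `name` from tabUser where locate('a', name) > 0")
#   -> select "name" from "tabUser" where strpos( name, 'a') > 0
# (backticks become double quotes, locate(x, y) becomes strpos(y, x) with the
# arguments swapped, and bare tab-prefixed table names are double-quoted)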