Dec 19, 2019

mysql joins group by

Find all records whose (name, state, city) combination appears more than once

select v.venue_id, v.name, v.city, v.state
from raw_data v
join
  (select count(concat(name, state, city)), name, state, city
   from raw_data
   where source_id = 2
   group by name, state, city
   having count(concat(name, state, city)) > 1) a
on v.name = a.name and v.state = a.state and v.city = a.city and v.source_id = 2
order by v.name, v.state, v.city

Dec 16, 2019

PyCharm Save Actions plugin

PyCharm Google Doc Strings

PyCharm -> Tools -> Python Integrated Tools -> Doc Strings -> Doc String Format -> Google
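Once that setting is active, PyCharm generates docstring stubs in the Google style. For reference, here is what such a docstring looks like (the `divide` function is a made-up example, not from the original note):

```python
def divide(a, b):
    """Divide a by b.

    Args:
        a (float): Numerator.
        b (float): Denominator, must be non-zero.

    Returns:
        float: The quotient a / b.

    Raises:
        ZeroDivisionError: If b is 0.
    """
    return a / b


print(divide(6, 3))  # 2.0
```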

Nov 18, 2019

Nohup is not writing log to output file

nohup python long_running_task.py &

Running Python with the '-u' (unbuffered output) flag worked for me. Everything is then saved in the "nohup.out" file:

nohup python -u long_running_task.py &

Nov 16, 2019

Python record linkage


Python fuzzywuzzy

#Ref: https://www.datacamp.com/community/tutorials/fuzzy-string-python

#pip install fuzzywuzzy
#pip install python-Levenshtein

from fuzzywuzzy import fuzz
from fuzzywuzzy import process

print('-------------string matching')
Str1 = "Apple Inc."
Str2 = "apple Inc"
Ratio = fuzz.ratio(Str1.lower(),Str2.lower())
print(Ratio)            #95

print('-------------substring matching')
Str1 = "Los Angeles Lakers"
Str2 = "Lakers"
Ratio = fuzz.ratio(Str1.lower(),Str2.lower())
Partial_Ratio = fuzz.partial_ratio(Str1.lower(),Str2.lower())
print(Ratio)            #50
print(Partial_Ratio)    #100

print('-------------string different order match - same length')
#token_sort_ratio tokenizes the strings, lowercases them, strips punctuation, then compares the sorted tokens
Str1 = "united states v. nixon"
Str2 = "Nixon v. United States"
Ratio = fuzz.ratio(Str1.lower(),Str2.lower())
Partial_Ratio = fuzz.partial_ratio(Str1.lower(),Str2.lower())
Token_Sort_Ratio = fuzz.token_sort_ratio(Str1,Str2)
print(Ratio)            #59
print(Partial_Ratio)    #74
print(Token_Sort_Ratio) #100

print('-------------string different order match - different length')
Str1 = "The supreme court case of Nixon vs The United States"
Str2 = "Nixon v. United States"
Ratio = fuzz.ratio(Str1.lower(),Str2.lower())
Partial_Ratio = fuzz.partial_ratio(Str1.lower(),Str2.lower())
Token_Sort_Ratio = fuzz.token_sort_ratio(Str1,Str2)
Token_Set_Ratio = fuzz.token_set_ratio(Str1,Str2)
print(Ratio)            #57
print(Partial_Ratio)    #77
print(Token_Sort_Ratio) #58 
print(Token_Set_Ratio)  #95

print('-------------search string in a list of strings with score/ratio')
str2Match = "apple inc"
strOptions = ["Apple Inc.","apple park","apple incorporated","iphone"]
Ratios = process.extract(str2Match,strOptions)
print(Ratios)
#[('Apple Inc.', 100), ('apple incorporated', 90), ('apple park', 67), ('iphone', 40)]
# You can also select the string with the highest matching percentage
highest = process.extractOne(str2Match,strOptions)
print(highest)
#('Apple Inc.', 100)


Nov 14, 2019

ElasticSearch


Python Black


Newman tool Postman


Newman
  • Newman is a command-line collection runner for Postman
  • It allows you to effortlessly run and test a Postman collection directly from the command-line.

Nov 11, 2019

Python AsyncIO

Ref: https://realpython.com/async-io-python/
  • Threading vs Multiprocessing
    • Threading is better for I/O-bound tasks
    • Multiprocessing is better for CPU-bound tasks
  • Concurrency vs Parallelism
    • Concurrency is when two tasks can start, run, and complete in overlapping time periods, e.g., threading, AsyncIO
    • Parallelism is when tasks literally run at the same time, e.g., multiprocessing
  • While a CPU-bound task is characterised by the computer’s cores continually working hard from start to finish, an IO-bound job is dominated by a lot of waiting on input/output to complete.
  • Preemptive multitasking vs Cooperative multitasking
    • In preemptive multitasking, the OS preempts a thread, forcing it to give up the CPU (e.g., threading)
    • In cooperative multitasking, the running task voluntarily gives up the CPU to other tasks (e.g., AsyncIO)
  • Coroutine vs Method/Function/Subroutine
    • A method or function returns a value and doesn't remember state between invocations
    • A coroutine is a special function that can give up control to its caller without losing its state
  • Coroutine vs Generator
    • A generator yields values back to its invoker
    • A coroutine yields control to another coroutine and can resume execution from the point where it gave up control
    • A generator can't accept arguments once it is started, whereas a coroutine can
  • AsyncIO is a single-threaded, single-process design: it uses cooperative multitasking
  • AsyncIO gives a feeling of concurrency despite using a single thread in a single process
  • Coroutines (a central feature of async IO) can be scheduled concurrently, but they are not inherently concurrent.
  • Asynchronous routines are able to “pause” while waiting on their ultimate result and let other routines run in the meantime.
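The coroutine-vs-generator bullets can be illustrated with a classic send()-based coroutine: it keeps state (`total`, `count`) across invocations and accepts new arguments after it has started (an illustrative sketch, not from the original reference):

```python
def running_avg():
    # A "classic" coroutine: receives values via send() and keeps state
    total, count = 0.0, 0
    avg = None
    while True:
        value = yield avg   # pause here; resume when send() delivers a value
        total += value
        count += 1
        avg = total / count


avg = running_avg()
next(avg)            # prime the coroutine (runs to the first yield)
print(avg.send(10))  # 10.0
print(avg.send(20))  # 15.0
```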


import asyncio
import time

async def count_func():
    print("Line One")
    await asyncio.sleep(1) # await non-blocking call
    print("Line Two")


async def main():
    await asyncio.gather(count_func(), count_func(), count_func())


if __name__ == "__main__":
    t1 = time.time()
    asyncio.run(main())
    elapsed = time.time() - t1

    #Sequentially this would take ~3 secs; the sleeps overlap, so it finishes in ~1 sec
    print(f"{__file__} executed in {elapsed:0.2f} seconds.")


Output:
#####
Line One
Line One
Line One
Line Two
Line Two
Line Two
main.py executed in 1.10 seconds.



Python Disadvantages

Python Disadvantages
  • It's an interpreted language, so it's not as fast as a compiled language
  • Slower than C and C++; being a high-level language, Python isn't close to the hardware
  • Not a good fit for gaming, mobile development, or desktop UI applications
  • Not good for memory-intensive work; due to the flexibility of its data types, Python's memory consumption is high
  • Python's database access layer is considered a bit underdeveloped and primitive compared to JDBC/ODBC
  • GIL
    • Global Interpreter Lock: only one thread can execute Python bytecode at a time within one interpreter
    • Creating, managing and tearing down processes (multiprocessing; processes are heavier than threads) is more expensive than doing the same for threads. Furthermore, inter-process communication is relatively slower than inter-thread communication.
    • Both these drawbacks may make Python an impractical choice for super-critical or time-sensitive use cases.
    • The Python community is working to remove the GIL from CPython. One such attempt is known as the Gilectomy.
    • The GIL exists only in the original Python implementation, CPython.
    • Python has multiple interpreter implementations. CPython, Jython, IronPython and PyPy, written in C, Java, C# and Python respectively, are the most popular ones.
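A minimal sketch of the GIL's effect: a pure-Python CPU-bound function gains nothing from threads, because only one thread holds the GIL at a time (function name and iteration count are illustrative; exact timings vary by machine):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def cpu_task(n):
    # Pure-Python CPU-bound loop; holds the GIL while computing
    total = 0
    for i in range(n):
        total += i * i
    return total

N = 500_000

t0 = time.time()
for _ in range(4):
    cpu_task(N)
serial = time.time() - t0

t0 = time.time()
with ThreadPoolExecutor(max_workers=4) as ex:
    list(ex.map(cpu_task, [N] * 4))
threaded = time.time() - t0

# Under the GIL, the threaded run is roughly as slow as (or slower than) serial
print(f"serial: {serial:.2f}s, threaded: {threaded:.2f}s")
```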


Python fstrings

elem = 10
elem_index = 5

def to_lowercase(st):
    return st.lower()

#Formatted string literals (f-strings)
#The idea behind f-strings is to make string interpolation simpler.
#f-strings are expressions evaluated at runtime rather than constant values
print(f'Index of element {elem} is {elem_index}')

#call functions from f-strings
print(f'String to lowercase: {to_lowercase("TEST_STRING")}')
print(f'String to lowercase: {"TEST_STRING".lower()}')

#Format - used in previous versions of python
print('Index of element {} is {}'.format(elem, elem_index))
elapsed = 23.5678
print(f"{__file__} executed in {elapsed:0.2f} seconds.")


Output:
Index of element 10 is 5
String to lowercase: test_string
String to lowercase: test_string
Index of element 10 is 5
main.py executed in 23.57 seconds.


Sep 25, 2019

Python Prime number

import math

def is_prime(n):
    if n < 2:
        return False

    sqrt_n = int(math.floor(math.sqrt(n)))
    for i in range(2, sqrt_n + 1): 
        if n % i == 0:
            return False
    return True

foo = [2, 18, 9, 22, 17, 24, 8, 12, 27]
print("All list :", list(foo))
print("Prime list :", list(filter(is_prime, foo)))

Output:
All list : [2, 18, 9, 22, 17, 24, 8, 12, 27]
Prime list : [2, 17]


Sep 24, 2019

Python concurrent.futures ProcessPoolExecutor

"""
The ProcessPoolExecutor class is an Executor subclass that uses a pool of processes to execute calls asynchronously. 

ProcessPoolExecutor uses the multiprocessing module
"""

from concurrent.futures import ProcessPoolExecutor
import math
import multiprocessing
import os
import sys
import time

PRIMES = [
    112272535095293,
    112582705942171,
    112272535095293,
    115280095190773,
    115797848077099,
    109972689928541]

def is_prime(n):
    if n == 2:
        return True

    if n % 2 == 0:
        return False

    sqrt_n = int(math.floor(math.sqrt(n)))
    for i in range(3, sqrt_n + 1, 2):
        if n % i == 0:
            return False
    return True

def main():
    print('No of CPUs/Processors: {}'.format(multiprocessing.cpu_count()))
    a = time.time()
    #default max_workers is number of processors on the machine
    with ProcessPoolExecutor() as executor:
        for number, prime in zip(PRIMES, executor.map(is_prime, PRIMES)):
            print('%d is prime: %s' % (number, prime))
    b = time.time()
    print('Time taken: {:.2f} secs'.format(b-a))

if __name__ == '__main__':
    main()


Output:
No of CPUs/Processors: 4
112272535095293 is prime: True
112582705942171 is prime: True
112272535095293 is prime: True
115280095190773 is prime: True
115797848077099 is prime: True
109972689928541 is prime: False
Time taken: 12.67 secs


Python concurrent.futures ThreadPoolExecutor as_completed

import urllib.request 
from concurrent.futures import ThreadPoolExecutor, as_completed

URLS = ['https://www.google.com',
        'http://www.cnn.com/',
        'http://europe.wsj.com/',
        'http://www.bbc.co.uk/',
        'http://abc.abc.com'   #invalid
       ]

def load_url(url, timeout):
  with urllib.request.urlopen(url, timeout=timeout) as conn:
    txt = conn.read()
    return txt

with ThreadPoolExecutor(max_workers = 5) as executor:
  #Forming Key-Value pairs
  future_to_url = {executor.submit(load_url, url, 50): url for url in URLS}
  print(future_to_url)
  print('----')
  for future in as_completed(future_to_url):
    url = future_to_url[future]
    try:
      data = future.result()
      print('%s length is %d' % (url, len(data)))
    except Exception as e:
      print('Error in URL: %s is %s' % (url, e))


Output:
{<Future at 0x7f40d262d2d0 state=running>: 'https://www.google.com', <Future at 0x7f40cadc4ad0 state=running>: 'http://www.cnn.com/', <Future at 0x7f40cadcd710 state=running>: 'http://europe.wsj.com/', <Future at 0x7f40cadcd410 state=running>: 'http://www.bbc.co.uk/', <Future at 0x7f40cade0a10 state=running>: 'http://abc.abc.com'}
----
Error in URL: http://abc.abc.com is <urlopen error [Errno -2] Name or service not known>
https://www.google.com length is 12571
http://www.cnn.com/ length is 1134562
http://europe.wsj.com/ length is 1006417
http://www.bbc.co.uk/ length is 311008

Python sort by val

myd = { "Peter": 40, "John": 2, "Bob": 1, "Danny": 3, }

#sort by val
s = sorted(myd.items(), key=lambda x: x[1])
print(s)

Output:
[('Bob', 1), ('John', 2), ('Danny', 3), ('Peter', 40)]
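The same sort can be written without a lambda, using operator.itemgetter (an alternative sketch, not in the original note):

```python
from operator import itemgetter

myd = {"Peter": 40, "John": 2, "Bob": 1, "Danny": 3}

# itemgetter(1) picks the value out of each (key, value) pair
s = sorted(myd.items(), key=itemgetter(1))
print(s)  # [('Bob', 1), ('John', 2), ('Danny', 3), ('Peter', 40)]
```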

Python Sorted 2 Vs 3 versions

employees = {1000: {'name': 'Sahasra','country': 'India', 'age': 25},
   1001: {'name': 'Peter','country': 'US', 'age': 21},
   1002: {'name': 'John','country': 'US', 'age': 36},
   1003: {'name': 'Sarayu','country': 'India', 'age': 30},
   1004: {'name': 'Akio','country': 'Japan', 'age': 60},
   1005: {'name': 'Anand','country': 'India', 'age': 50},
   1006: {'name': 'Vidya','country': 'India', 'age': 32},
   1007: {'name': 'Salma','country': 'Bangladesh', 'age': 23},}

# Works in Python 2.7 only
ss = sorted(employees.items(), key=lambda(x, y): y['age'])
print(ss)

# Works in Python 2.7 and 3.7
# Tuple parameter unpacking in a lambda is not allowed in Python 3
ss = sorted(employees.items(), key=lambda x: x[1]['age'])
print(ss)


Output:

[(1001, {'country': 'US', 'age': 21, 'name': 'Peter'}), (1007, {'country': 'Bangladesh', 'age': 23, 'name': 'Salma'}), (1000, {'country': 'India', 'age': 25, 'name': 'Sahasra'}), (1003, {'country': 'India', 'age': 30, 'name': 'Sarayu'}), (1006, {'country': 'India', 'age': 32, 'name': 'Vidya'}), (1002, {'country': 'US', 'age': 36, 'name': 'John'}), (1005, {'country': 'India', 'age': 50, 'name': 'Anand'}), (1004, {'country': 'Japan', 'age': 60, 'name': 'Akio'})]

[(1001, {'country': 'US', 'age': 21, 'name': 'Peter'}), (1007, {'country': 'Bangladesh', 'age': 23, 'name': 'Salma'}), (1000, {'country': 'India', 'age': 25, 'name': 'Sahasra'}), (1003, {'country': 'India', 'age': 30, 'name': 'Sarayu'}), (1006, {'country': 'India', 'age': 32, 'name': 'Vidya'}), (1002, {'country': 'US', 'age': 36, 'name': 'John'}), (1005, {'country': 'India', 'age': 50, 'name': 'Anand'}), (1004, {'country': 'Japan', 'age': 60, 'name': 'Akio'})]

Sep 19, 2019

Python ThreadPoolExecutor submit

from concurrent.futures import ThreadPoolExecutor
import threading

def task(n):
    print("Processing {} - {}".format(n, threading.current_thread()))

def main():
    print("Starting ThreadPoolExecutor")
    with ThreadPoolExecutor(max_workers=3) as executor:
        future = executor.submit(task, 2)
        future = executor.submit(task, 3)
        future = executor.submit(task, 4)
    print("All tasks complete")

if __name__ == '__main__':
    main()

Output:
Starting ThreadPoolExecutor
Processing 2 - <Thread(ThreadPoolExecutor-0_0, started daemon 140052642395904)>
Processing 3 - <Thread(ThreadPoolExecutor-0_1, started daemon 140052634003200)>
Processing 4 - <Thread(ThreadPoolExecutor-0_2, started daemon 140052625610496)>
All tasks complete


Python ThreadPoolExecutor, map

import urllib.request 
from concurrent.futures import ThreadPoolExecutor
import threading

urls = [
  'http://www.python.org', 
  'http://www.python.org/about/',
  'http://www.onlamp.com/pub/a/python/2003/04/17/metaclasses.html',
  'http://www.python.org/doc/',
  'http://www.python.org/download/',
  'http://www.python.org/getit/',
  'http://www.python.org/community/',
  'https://wiki.python.org/moin/',
]

def fun(url):
  print(url, threading.current_thread())
  r = urllib.request.urlopen(url)
  return r

# make the Pool of workers
pool = ThreadPoolExecutor(4) 

results = pool.map(fun, urls)
real_results = list(results)
print('----')
print(real_results)


Output:
http://www.python.org <Thread(ThreadPoolExecutor-0_0, started daemon 139989036611328)>
http://www.python.org/about/ <Thread(ThreadPoolExecutor-0_1, started daemon 139988956083968)>
http://www.onlamp.com/pub/a/python/2003/04/17/metaclasses.html <Thread(ThreadPoolExecutor-0_2, started daemon 139988947691264)>
http://www.python.org/doc/ <Thread(ThreadPoolExecutor-0_3, started daemon 139988939298560)>
http://www.python.org/download/ <Thread(ThreadPoolExecutor-0_1, started daemon 139988956083968)>
http://www.python.org/getit/ <Thread(ThreadPoolExecutor-0_3, started daemon 139988939298560)>
http://www.python.org/community/ <Thread(ThreadPoolExecutor-0_0, started daemon 139989036611328)>
https://wiki.python.org/moin/ <Thread(ThreadPoolExecutor-0_3, started daemon 139988939298560)>
----
[<http.client.HTTPResponse object at 0x7f51be3d7910>, <http.client.HTTPResponse object at 0x7f51be3c5f90>, <http.client.HTTPResponse object at 0x7f51be3d73d0>, <http.client.HTTPResponse object at 0x7f51be3d77d0>, <http.client.HTTPResponse object at 0x7f51be3b3b90>, <http.client.HTTPResponse object at 0x7f51be3c5610>, <http.client.HTTPResponse object at 0x7f51be3e1c90>, <http.client.HTTPResponse object at 0x7f51be3d7250>]

Python threads using Queue

#Ref:
#https://stackoverflow.com/questions/47900922/split-list-into-n-lists-and-assign-each-list-to-a-worker-in-multithreading
#https://pymotw.com/2/Queue/

#The Queue module provides a FIFO implementation suitable for multi-threaded programming. 
#It can be used to pass messages or other data between producer and consumer threads safely. 
#Locking is handled for the caller, so it is simple to have as many threads as you want working with the same Queue instance. 
#A Queue’s size (number of elements) may be restricted to throttle memory usage or processing.

#### 3 types of queues
# Basic FIFO Queue
# LIFO Queue
# Priority Queue

from queue import Queue, LifoQueue
from threading import Thread, current_thread
from time import sleep
first_names = ['Steve','Jane','Sara','Mary','Jack','tara','bobby']

q = Queue() #FIFO
lq = LifoQueue() #LifoQueue

num_threads = 3

def do_stuff(q):
    while True:
        print(q.get(), current_thread())
        sleep(1)
        q.task_done()

if __name__ == '__main__':
    print('------ FIFO - Basic ------')

    for x in first_names:
        q.put(x)
    
    for i in range(num_threads):
        #daemon=True so the program can exit once the queue is drained
        worker = Thread(target=do_stuff, args=(q,), daemon=True)
        worker.start()

    q.join()
    
    print('------ LIFO - reverse order -----')

    for x in first_names:
        lq.put(x)
    
    for i in range(num_threads):
        #daemon=True so the program can exit once the queue is drained
        worker = Thread(target=do_stuff, args=(lq,), daemon=True)
        worker.start()

    lq.join()



Output:

------ FIFO - Basic ------
Steve <Thread(Thread-1, started 140202362271488)>
Jane <Thread(Thread-2, started 140202353878784)>
Sara <Thread(Thread-3, started 140202345486080)>
Mary <Thread(Thread-1, started 140202362271488)>
Jack <Thread(Thread-2, started 140202353878784)>
tara <Thread(Thread-3, started 140202345486080)>
bobby <Thread(Thread-1, started 140202362271488)>
------ LIFO - reverse order -----
bobby <Thread(Thread-4, started 140202337093376)>
tara <Thread(Thread-5, started 140202328700672)>
Jack <Thread(Thread-6, started 140202320307968)>
Mary <Thread(Thread-4, started 140202337093376)>
Sara <Thread(Thread-5, started 140202328700672)>
Jane <Thread(Thread-6, started 140202320307968)>
Steve <Thread(Thread-4, started 140202337093376)>


Python multiple threads - how to join

from threading import Thread, active_count, current_thread
import time

def fun(val):
  for _ in range(5):
    print(val, current_thread())
    time.sleep(3)

threads = []
for i in range(1, 11):
   t = Thread(target=fun, args=(i*i,))
   threads.append(t)
   t.start()
   print("Current Threads count: %i." % active_count())

#Join threads
for t in threads:
    t.join()

print('bye')


Output:
######
1 <Thread(Thread-1, started 140645849839360)>
Current Threads count: 2.
4 <Thread(Thread-2, started 140645841446656)>
Current Threads count: 3.
9 <Thread(Thread-3, started 140645833053952)>
Current Threads count: 4.
16 <Thread(Thread-4, started 140645616318208)>
Current Threads count: 5.
25 <Thread(Thread-5, started 140645607925504)>
Current Threads count: 6.
36 <Thread(Thread-6, started 140645599532800)>
Current Threads count: 7.
49 <Thread(Thread-7, started 140645591140096)>
Current Threads count: 8.
64 <Thread(Thread-8, started 140645582747392)>
Current Threads count: 9.
81 <Thread(Thread-9, started 140645574354688)>
Current Threads count: 10.
100 <Thread(Thread-10, started 140645565961984)>
Current Threads count: 11.
4 <Thread(Thread-2, started 140645841446656)>
9 <Thread(Thread-3, started 140645833053952)>
16 <Thread(Thread-4, started 140645616318208)>
1 <Thread(Thread-1, started 140645849839360)>
100 <Thread(Thread-10, started 140645565961984)>
25 <Thread(Thread-5, started 140645607925504)>
36 <Thread(Thread-6, started 140645599532800)>
64 <Thread(Thread-8, started 140645582747392)>
49 <Thread(Thread-7, started 140645591140096)>
81 <Thread(Thread-9, started 140645574354688)>
4 <Thread(Thread-2, started 140645841446656)>
9 <Thread(Thread-3, started 140645833053952)>
16 <Thread(Thread-4, started 140645616318208)>
1 <Thread(Thread-1, started 140645849839360)>
100 <Thread(Thread-10, started 140645565961984)>
25 <Thread(Thread-5, started 140645607925504)>
36 <Thread(Thread-6, started 140645599532800)>
64 <Thread(Thread-8, started 140645582747392)>
81 <Thread(Thread-9, started 140645574354688)>
49 <Thread(Thread-7, started 140645591140096)>
4 <Thread(Thread-2, started 140645841446656)>
9 <Thread(Thread-3, started 140645833053952)>
16 <Thread(Thread-4, started 140645616318208)>
100 <Thread(Thread-10, started 140645565961984)>
1 <Thread(Thread-1, started 140645849839360)>
25 <Thread(Thread-5, started 140645607925504)>
36 <Thread(Thread-6, started 140645599532800)>
64 <Thread(Thread-8, started 140645582747392)>
81 <Thread(Thread-9, started 140645574354688)>
49 <Thread(Thread-7, started 140645591140096)>
4 <Thread(Thread-2, started 140645841446656)>
16 <Thread(Thread-4, started 140645616318208)>
9 <Thread(Thread-3, started 140645833053952)>
1 <Thread(Thread-1, started 140645849839360)>
100 <Thread(Thread-10, started 140645565961984)>
64 <Thread(Thread-8, started 140645582747392)>
25 <Thread(Thread-5, started 140645607925504)>
36 <Thread(Thread-6, started 140645599532800)>
81 <Thread(Thread-9, started 140645574354688)>
49 <Thread(Thread-7, started 140645591140096)>
bye

Python glob vs glob recursive - loop directory

import glob

p = glob.glob('*.py')
print(p)
print(len(p)) #17

#single star - all files in current dir
p = glob.glob('*', recursive=True)
print(p)
print(len(p)) #20

#double star - all folders and files recursively in current dir
p = glob.glob('**', recursive=True)
print(p)
print(len(p)) #22


"""
Output:
['timeit_test.py', 'args_kwargs.py', 'fibonacci.py', 'shallow_vs_deep_copy.py', 'inheritance_example.py', 'python_closure.py', 'super_test.py', 'date_example.py', 'contextlib_example.py', 're_compile_vs_match.py', 'iterator_example.py', 'str_repr_eval.py', 'generator_example.py', 'init_vs_call.py', 'main.py', 'filter_map_reduce.py', '_test_runner.py']
17

['timeit_test.py', 'args_kwargs.py', 'fibonacci.py', 'shallow_vs_deep_copy.py', 'inheritance_example.py', 'python_closure.py', 'super_test.py', 'utils', 'date_example.py', 'contextlib_example.py', 'test1.txt', 're_compile_vs_match.py', 'test.txt', 'iterator_example.py', 'str_repr_eval.py', 'generator_example.py','init_vs_call.py', 'main.py', 'filter_map_reduce.py', '_test_runner.py']
20

['timeit_test.py', 'args_kwargs.py', 'fibonacci.py', 'shallow_vs_deep_copy.py', 'inheritance_example.py', 'python_closure.py', 'super_test.py', 'utils', 'utils/__init__.py', 'utils/utils.py', 'utils/utils1', 'date_example.py', 'contextlib_example.py','test1.txt', 're_compile_vs_match.py', 'test.txt', 'iterator_example.py', 'str_repr_eval.py', 'generator_example.py', 'init_vs_call.py', 'main.py', 'filter_map_reduce.py', '_test_runner.py']
22
"""

python reverse vs reversed

"""
reverse() modifies the list itself, whereas
reversed() returns an iterator ready to traverse the list in reversed order.
"""

#string reverse (best way for string reverse using slicing)
s = 'string'
print(s[::-1]) #gnirts
print(s) #string

#string reversed
rs = reversed(s)
print(''.join(rs)) #gnirts
print(s) #string

#reverse list
l = [1,2,3]
l.reverse()
print(l) #[3,2,1]

#reversed list
ll = reversed(l)
print(ll) #<list_reverseiterator object at 0x7fa572312790>
print(list(ll)) #[1,2,3]


Sep 18, 2019

python timeit re vs compiled re

import re
import timeit

s = 'strings are strings'
compiled_regex = re.compile(r'(str)in(gs)')

def not_compiled_func():
   r = re.match(r'(str)in(gs)', s) 
   #print(r.group())

def compiled_func():
  r = compiled_regex.match(s)
  #print(r.group()) 

t1 = timeit.timeit(stmt=not_compiled_func, number=1000000)
print('%0.2f' % t1)  #5.65 secs
t2 = timeit.timeit(stmt=compiled_func, number=1000000)
print('%0.2f' % t2)  #1.73 secs



Sep 17, 2019

Python remove all occurrences a given element from the list

#Method1
#Use the same list (don't use another list)
lista = [10, 10, 20, 10, 30, 10, 40, 10, 50]
print(lista)   #[10, 10, 20, 10, 30, 10, 40, 10, 50]
#Goal after removal: [20, 30, 40, 50]

#Note:
#Every time when you remove item, list index changes

n = 10 #element to remove

print('------Method1-----')
i = 0
listlen = len(lista)
while i < listlen:
  if lista[i] == n:
    lista.remove(n)
    listlen -= 1
    continue
  i += 1

print(lista)   #[20, 30, 40, 50]


#Method2
print('-----Method2-----')
#Remove element and copy to another list
lista = [10, 10, 20, 10, 30, 10, 40, 10, 50]
listb = list(filter(lambda x: x != 10, lista))
print(listb)   #[20, 30, 40, 50]
listb = [i for i in lista if i != 10]
print(listb)   #[20, 30, 40, 50]

Python Super method

class A:
  def __init__(self):
    print('A')

class AA(A):
  def __init__(self):
    print('AA')
    super().__init__() #No need to pass self, only args if required

a = A()
aa = AA()


class B:
  def __init__(self):
    print('B')

class BB(B):
  def __init__(self):
    print('BB')
    B.__init__(self) #need to pass self and args if required

b = B()
bb = BB()


Output:
A
AA
A
B
BB
B


python str vs repr vs eval

import datetime 

foo = 100
var = 'foo'

print(var) #foo

#with quotes
print(repr(var)) #'foo'

#evaluates any variables
print(eval(var)) #100

today = datetime.datetime.now()

# Prints readable format for date-time object
print (str(today)) #2019-09-17 11:54:13.675979

# prints the official format of date-time object
print (repr(today)) #datetime.datetime(2019, 9, 17, 11, 54, 13, 675979)

Python Shallow vs Deep Copy

"""
1. Copy by Reference
2. Shallow Copy
3. Deep Copy
"""

import copy

#Copy Ref
old_list = [[1, 2, 3], [4, 5, 6], [7, 8, 'a']]
new_list = old_list
new_list[2][2] = 9

print('----------')
print('ID of Old List:', id(old_list))
print('ID of New List:', id(new_list))
print('Copy Ref - Old List:', old_list)
print('Copy Ref - New List:', new_list)
print('----------')

#Output
#ID of Old List: 140672073880512
#ID of New List: 140672073880512
#Copy Ref - Old List: [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
#Copy Ref - New List: [[1, 2, 3], [4, 5, 6], [7, 8, 9]]


#Shallow Copy
##############
#A shallow copy creates a new object which stores the reference of the original elements.

old_list = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
new_list = copy.copy(old_list)

print("Shallow Copy Old list:", old_list)
print("Shallow Copy New list:", new_list)
print('----------')

#Output:
#Shallow Copy Old list: [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
#Shallow Copy New list: [[1, 2, 3], [4, 5, 6], [7, 8, 9]]

#Shallow Copy - append
######################
old_list = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
new_list = copy.copy(old_list)
new_list.append([10,11,12])
print("Shallow Copy add Old list:", old_list)
print("Shallow Copy add New list:", new_list)
print('----------')

#Output:
#Shallow Copy add Old list: [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
#Shallow Copy add New list: [[1, 2, 3], [4, 5, 6], [7, 8, 9], [10, 11, 12]]


#Shallow Copy - nested update
###########################
#Existing elements will get updated - since it has reference to original elements

old_list = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
new_list = copy.copy(old_list)
new_list[1][1] = 400
print("Shallow Copy nested Old list:", old_list)
print("Shallow Copy nested New list:", new_list)
print('----------')

#Output:
#Shallow Copy nested Old list: [[1, 2, 3], [4, 400, 6], [7, 8, 9]]
#Shallow Copy nested New list: [[1, 2, 3], [4, 400, 6], [7, 8, 9]]


#Deep Copy
###########
#It makes a complete copy of the elements
old_list = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
new_list = copy.deepcopy(old_list)
new_list[1][1] = 400
print('ID of Old List:', id(old_list))
print('ID of New List:', id(new_list))

print("Deep Copy Old list:", old_list)
print("Deep Copy New list:", new_list)

#Output:
#ID of Old List: 140672073879792
#ID of New List: 140672073880992
#Deep Copy Old list: [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
#Deep Copy New list: [[1, 2, 3], [4, 400, 6], [7, 8, 9]]



Python args kwargs

def argsTest(*args, **kwargs):
     print(args)
     print(kwargs)

print('-----------')
argsTest(10)
print('-----------')
argsTest(10, 20)
print('-----------')
argsTest(10, 20, 30, a=10, b=20)
print('-----------')

Output:
-----------
(10,)
{}
-----------
(10, 20)
{}
-----------
(10, 20, 30)
{'a': 10, 'b': 20}
-----------

Sep 16, 2019

python fibonacci


FibList = [0,1] 
  
def fibonacci_func(n): 
    if n<0: 
        print("Entered Incorrect input") 
    elif n<=len(FibList): 
        return FibList[n-1] 
    else: 
        temp_fib = fibonacci_func(n-1)+fibonacci_func(n-2) 
        FibList.append(temp_fib) 
        return temp_fib 


print('#######')
print(fibonacci_func(14))  #233
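An alternative sketch that memoizes the recursion with functools.lru_cache instead of a module-level list (0-indexed, so fib(13) corresponds to fibonacci_func(14) above):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    # Memoized recursion: each fib(n) is computed once, then cached
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(13))  # 233
```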


Sep 13, 2019

python contextlib vs context manager

import contextlib

class contextManagerExample:
  def __init__(self):
    print('inside __init__')
  def __enter__(self):
    print('inside __enter__')
    return 'returning from contextManagerExample enter'
  def __exit__(self, exc_type, exc_val, exc_tb):
    print('inside __exit__')

#No need to write __enter__, __exit__ separately
#yield instead of return
@contextlib.contextmanager
def context_lib_test():
  try:
    yield 'returning from context_lib_test enter'
  except Exception as e:
    raise

if __name__ == '__main__':
  print('-'*20)
  with contextManagerExample() as cm:
    print('inside ContextManagerExample scope')
    print(cm)
  print('-'*20)
  with context_lib_test() as cm:
    print('inside context_lib_test scope')
    print(cm)
  print('-'*20)



Output:
--------------------
inside __init__
inside __enter__
inside ContextManagerExample scope
returning from contextManagerExample enter
inside __exit__
--------------------
inside context_lib_test scope
returning from context_lib_test enter
--------------------



Jun 5, 2019

python context managers using contextlib

"""
# Using contextlib you don't have to explicitly write __enter__, __exit__
# yield instead of return
"""
import contextlib
import sys
import time

@contextlib.contextmanager
def context_manager_def_test():
    print('context_manager_def_test: ENTER')
    try:
        yield 'You are in with-block'
        print('context_manager_def_test: NORMAL EXIT')
    except Exception:
        print('context_manager_def_test: EXCEPTION EXIT', sys.exc_info())
        raise

print('*'*75)

with context_manager_def_test() as cm:
    print('Inside ContextManagerTest')
    print(cm)

print('*'*75)
time.sleep(1)

with context_manager_def_test() as cm:
    print('Inside ContextManagerTest')
    print(cm)
    raise ValueError('something is wrong')

print('*'*75)


"""
***************************************************************************
context_manager_def_test: ENTER
Inside ContextManagerTest
You are in with-block
context_manager_def_test: NORMAL EXIT
***************************************************************************
context_manager_def_test: ENTER
Inside ContextManagerTest
You are in with-block
context_manager_def_test: EXCEPTION EXIT (<class 'ValueError'>, ValueError('something is wrong'), <traceback object at 0x1023ed0c8>)
Traceback (most recent call last):
  File "/Users/prabhathkota/Workspace/prabhath/personal/Python_Scripts/context_managers/contextlib_example.py", line 31, in <module>
    raise ValueError('something is wrong')
ValueError: something is wrong
***************************************************************************
"""

Python context manager with exceptions

###################
# __enter__
# __enter__ is called before executing with-statement body
# __exit__
# __exit__ called after with-statement body
# File opening is a context manager
###################


class ContextManagerTest:
    def __init__(self):
        print('Inside __init__')

    def __enter__(self):
        print('Inside __enter__')
        return 'returning ... Inside with block'
        # return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        if exc_type is None:
            print('Inside __exit__ without exception')
        else:
            print('Inside __exit__ with exception ({} - {} - {})' .format(exc_type, exc_val, exc_tb))


with ContextManagerTest() as cm:
    print('Inside ContextManagerTest')
    print(cm)
    raise ValueError('something is wrong')


"""
Traceback (most recent call last):
Inside __enter__
  File "...../Python_Scripts/context_managers/context_manager_with_exception.py", line 30, in <module>
Inside ContextManagerTest
    raise ValueError('something is wrong')
returning ... Inside with block
ValueError: something is wrong
Inside __exit__ with exception (<class 'ValueError'> - something is wrong - <traceback object at 0x1034ba608>)

"""

Python context manager

###################
# __enter__
# __enter__ is called before the with-statement body executes
# __exit__
# __exit__ is called after the with-statement body (even when it raises)
# File objects are context managers
###################


class ContextManagerTest:
    def __init__(self):
        print('Inside __init__')

    def __enter__(self):
        print('Inside __enter__')
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        if exc_type is None:
            print('Inside __exit__ without exception')
        else:
            print('Inside __exit__ with exception ({} - {} - {})'.format(exc_type, exc_val, exc_tb))
        return


with ContextManagerTest() as cm:
    print('Inside ContextManagerTest')
    print(cm)


"""
Inside __init__
Inside __enter__
Inside ContextManagerTest
<__main__.ContextManagerTest object at 0x10a920160>
Inside __exit__ without exception
"""

Jun 2, 2019

Python decorator to measure the execution time of a function

import functools
import time

def timer(f): # without functools.wraps
    def timer_wrap(*args, **kwargs):
        """timer_wrap documentation """
        print('inside timer_wrap decorator')
        start_time = time.time()
        result = f(*args, **kwargs)
        end_time = time.time()
        print('Total Time Taken by function %s is : %4f secs' % (f.__name__, end_time - start_time))
        # f.__name__ gives function name
        return result  # pass the wrapped function's return value through
    return timer_wrap

def timer_wrap_with_functools(f): # with functools.wraps
    @functools.wraps(f)
    def timer_wrap(*args, **kwargs):
        """timer_wrap_with_functools documentation """
        print('inside timer_wrap_with_functools decorator')
        start_time = time.time()
        result = f(*args, **kwargs)
        end_time = time.time()
        print('Total Time Taken by function %s is : %4f secs' % (f.__name__, end_time - start_time))
        return result  # pass the wrapped function's return value through
    return timer_wrap


@timer
def test_timer_func(num_times):
    """test_timer_func documentation """
    total_sum = 0
    for _ in range(num_times):
        total_sum += sum([i ** 2 for i in range(1000)])
    print('Total Sum: %f ' % total_sum)

@timer_wrap_with_functools
def test_timer_func_functools(num_times):
    """test_timer_func_functools documentation """
    total_sum = 0
    for _ in range(num_times):
        total_sum += sum([i ** 2 for i in range(1000)])
    print('Total Sum: %f ' % total_sum)


if __name__ == '__main__':
    print('------------------------------------')
    test_timer_func(200)
    print(test_timer_func.__name__)  #gives wrapper name
    print(test_timer_func.__doc__)   #gives wrapper docstring
    print('------------------------------------')
    test_timer_func_functools(200)
    print(test_timer_func_functools.__name__) #gives original function name
    print(test_timer_func_functools.__doc__)  #gives original function docstring
    print('------------------------------------')


"""
Output:

------------------------------------
inside timer_wrap decorator
Total Sum: 66566700000.000000 
Total Time Taken by function test_timer_func is : 0.205079 secs
timer_wrap
timer_wrap documentation 
------------------------------------
inside timer_wrap_with_functools decorator
Total Sum: 66566700000.000000 
Total Time Taken by function test_timer_func_functools is : 0.188637 secs
test_timer_func_functools
test_timer_func_functools documentation 
------------------------------------
"""
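A variant of the same idea (my own sketch, not from the code above): time.perf_counter() is monotonic and higher-resolution than time.time(), so it is the better clock for measuring intervals. The names perf_timer and squares_sum are illustrative:

```python
import functools
import time

def perf_timer(f):
    """Same timer pattern, but using time.perf_counter for interval timing."""
    @functools.wraps(f)
    def wrap(*args, **kwargs):
        start = time.perf_counter()
        result = f(*args, **kwargs)      # keep the wrapped function's return value
        elapsed = time.perf_counter() - start
        print('%s took %.6f secs' % (f.__name__, elapsed))
        return result
    return wrap

@perf_timer
def squares_sum(n):
    return sum(i ** 2 for i in range(n))

total = squares_sum(1000)
print(total)   # 332833500
```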




Python Decorator functools.wraps

######################################################
# Decorators
# use of functools.wraps
# The @functools.wraps decorator uses the function functools.update_wrapper() to update special attributes
# like __name__ and __doc__ that are used in the introspection.
######################################################
import functools


def decorator1(f):
    print('inside decorator1')

    def wrap(*args, **kwargs):
        f(*args, **kwargs)
    return wrap


@decorator1
def test_decorator1_func():
    """ test_decorator1_func documentation """
    print('inside test_decorator1_func')


def decorator2(f):
    print('inside decorator2')

    def wrap(*args, **kwargs):
        f(*args, **kwargs)
    wrap.__name__ = f.__name__
    wrap.__doc__ = f.__doc__
    return wrap


@decorator2
def test_decorator2_func():
    """ test_decorator2_func documentation """
    print('inside test_decorator2_func')


def decorator3(f):
    print('inside decorator3')

    @functools.wraps(f)
    def wrap(*args, **kwargs):
        f(*args, **kwargs)
    return wrap


@decorator3
def test_decorator3_func():
    """ test_decorator3_func documentation """
    print('inside test_decorator3_func')


if __name__ == '__main__':
    print('------------------------------------')
    # print(help(test_decorator1_func))
    print(test_decorator1_func.__name__)     # wrap
    print(test_decorator1_func.__doc__)      # None
    print(test_decorator1_func.__closure__)  # (<cell at 0x1064d0b28: function object at 0x1064ebe18>,)
    print('------------------------------------')
    # Manually copying __name__ and __doc__ (no functools)
    # print(help(test_decorator2_func))
    print(test_decorator2_func.__name__)     # test_decorator2_func
    print(test_decorator2_func.__doc__)      # test_decorator2_func documentation
    print(test_decorator2_func.__closure__)  # (<cell at 0x1064d0b28: function object at 0x10654bc80>,)
    print('------------------------------------')
    # Using functools.wraps - achieves the same as decorator2 with less code
    # print(help(test_decorator3_func))
    print(test_decorator3_func.__name__)     # test_decorator3_func
    print(test_decorator3_func.__doc__)      # test_decorator3_func documentation
    print(test_decorator3_func.__closure__)  # (<cell at 0x10344df18: function object at 0x1034e0d08>,)
    print('------------------------------------')
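Besides __name__ and __doc__, functools.wraps also stores a reference to the original function in __wrapped__, which lets you bypass the wrapper when needed. A small sketch (noisy and add are my own illustrative names):

```python
import functools

def noisy(f):
    @functools.wraps(f)
    def wrap(*args, **kwargs):
        print('calling %s' % f.__name__)
        return f(*args, **kwargs)
    return wrap

@noisy
def add(a, b):
    """add documentation"""
    return a + b

print(add.__name__)           # add
print(add.__doc__)            # add documentation
print(add.__wrapped__(2, 3))  # 5 -- calls the original, bypassing the wrapper
```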

May 30, 2019

Python __init__ vs __call__

######################################
# __init__ vs __call__
# x = Foo(1, 2, 3) # __init__
# x = Foo()
# x(1, 2, 3) # __call__
######################################


class Foo:
    def __init__(self, a, b, c):
        self.a = a
        self.b = b
        self.c = c
        print('In Foo __init__ {}, {}, {}'.format(self.a, self.b, self.c))


class Bar:
    def __init__(self, a, b, c):
        self.a = a
        self.b = b
        self.c = c

    def __call__(self):
        print('In Bar __call__ {}, {}, {}' .format(self.a, self.b, self.c))


if __name__ == '__main__':
    print('--------------------------------------------------')
    f = Foo(100, 200, 300)
    print(f.a)
    #f()    #'Foo' object is not callable
    print('--------------------------------------------------')
    b = Bar(10, 20, 30)
    print(b.a)
    b()
    print('--------------------------------------------------')


"""
Output:

--------------------------------------------------
In Foo __init__ 100, 200, 300
100
--------------------------------------------------
10
In Bar __call__ 10, 20, 30
--------------------------------------------------
"""

Python Decorators

#########################
# A decorator is a design pattern in Python that allows a user to add new functionality to an existing object without modifying its structure.
# Decorators are usually called before the definition of a function you want to decorate.
#########################


# Decorator: escapes non-ASCII text returned by the wrapped function
def escape_unicode(f):
    def wrap(*args, **kwargs):
        text = f(*args, **kwargs)
        return ascii(text)
    return wrap


def display_text(text):
    return ascii(text)


@escape_unicode
def display_text_ascii(text):
    return text


if __name__ == '__main__':
    # print('మీరు ఎలా ఉన్నారు?')
    # print(ascii('మీరు ఎలా ఉన్నారు?'))
    print(display_text('మీరు ఎలా ఉన్నారు?'))
    print(display_text_ascii('మీరు ఎలా ఉన్నారు?'))


"""
Output:

'\u0c2e\u0c40\u0c30\u0c41 \u0c0e\u0c32\u0c3e \u0c09\u0c28\u0c4d\u0c28\u0c3e\u0c30\u0c41?'
'\u0c2e\u0c40\u0c30\u0c41 \u0c0e\u0c32\u0c3e \u0c09\u0c28\u0c4d\u0c28\u0c3e\u0c30\u0c41?'
"""


Apr 3, 2019

Docker for Beginner


Docker Vs Virtual Machine
A container runs natively on Linux and shares the kernel of the host machine with other containers. It runs a discrete process, taking no more memory than any other executable, making it lightweight.

By contrast, a virtual machine (VM) runs a full-blown “guest” operating system with virtual access to host resources through a hypervisor. In general, VMs provide an environment with more resources than most applications need.

Docker commands:
docker info   
docker --version
docker-compose --version
docker-machine --version

#Execute Docker image
docker run hello-world

#list running containers
docker ps

#list docker images
docker image ls 

#list Docker containers
docker container ls
docker container ls --all

docker images
docker ps -l


You create a Dockerfile
###################
From this Dockerfile, you will create/build an image
docker build  --tag=friendlyhello  .

When you run a Docker image, Docker creates a container from it (when you kill the running container, the process is gone, but the stopped container stays on disk until you remove it)

#Mapping container port to host/local_machine port
docker run -p 4000:80 friendlyhello
#4000 is the host port on your machine - http://localhost:4000/  (that's why we browse to port 4000 locally)
#80 is the container port

docker run -d -p 4000:80 friendlyhello   #run as a background daemon
docker run -d -p 4000:80 prabhathkota/test-docker:tag1

docker container ls
docker container stop 1fa4ab2cf395

docker stats <container_id>
docker logs <container_id>
docker cp <container_id>:/path/to/useful/file /local-path

Get inside the container:
####################
docker exec -it <containerid> bash


Docker Hub is like a GitHub repository for sharing images
####################################
Share your image
docker login
docker tag friendlyhello prabhathkota/test-docker:tag1
docker image ls

Remove image
#############
docker rmi -f 8b810fbdcf2d   #(forcefully remove image)

Publish the image:
##############
docker push prabhathkota/test-docker:tag1

Run image from remote repository
##########################
docker run -p 4000:80 prabhathkota/test-docker:tag1

docker build -t friendlyhello .  # Create image using this directory's Dockerfile

Run local repository
################
docker run -p 4000:80 friendlyhello  # Run "friendlyname" mapping port 4000 to 80
docker run -d -p 4000:80 friendlyhello         # Same thing, but in detached mode

Docker commands:
################
docker container ls                                # List all running containers
docker container ls -a             # List all containers, even those not running
docker container stop <hash>           # Gracefully stop the specified container
docker container kill <hash>         # Force shutdown of the specified container
docker container rm <hash>        # Remove specified container from this machine
docker container rm $(docker container ls -a -q)         # Remove all containers
docker image ls -a                             # List all images on this machine
docker image rm <image id>            # Remove specified image from this machine
docker image rm $(docker image ls -a -q)   # Remove all images from this machine
docker login             # Log in this CLI session using your Docker credentials
docker tag <image> username/repository:tag  # Tag <image> for upload to registry
docker push username/repository:tag            # Upload tagged image to registry
docker run username/repository:tag                   # Run image from a registry