Thursday, 6 March 2025

How to create DML dynamically in Ab Initio

$[
begin
let int i = 0;
let string(int) complete_list = "emp_nm,billable,age,designation,location";
let string(int) file_content = read_file(AI_SERIAL + '/' + 'meta_dml.dat');
let string(int) first_line = string_split(file_content, '\n')[0];
let int len = length_of(string_split(first_line, "|"));
let string(int)[int] current_list = vector_slice(string_split(complete_list, ","), 0, len - 1);
let int curr_len = length_of(current_list);
let string("") fld_nm = "";
let string(int) dml_data = "";
let dml_field_info curr_rec_info;
let dml_field_info[int] fld_vec = [vector];

/* Build one field definition per column: the last field is newline-delimited,
   every other field is pipe-delimited. */
for (i, i < curr_len)
begin
    fld_nm = current_list[i];
    curr_rec_info = if (i == len - 1) make_field(fld_nm, "string('\n')") else make_field(fld_nm, "string('|')");
    fld_vec = vector_append(fld_vec, curr_rec_info);
end

/* Attach the generated fields to an empty record type to form the final DML. */
dml_data = add_fields(source_type = "record\nend", field_vec = fld_vec);
result :: dml_data;
end]
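
For illustration, assuming the first line of meta_dml.dat contains three pipe-delimited columns, the PDL above would resolve to a DML like this (field names are taken from complete_list):

record
string('|') emp_nm;
string('|') billable;
string('\n') age;
end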

---------------------------------------------------------------------------------------------------------------------

How to Create a Dynamic DML for an Input File?


My INPUT FILE : 

cust_name|cust_id|acct_number 

XYZ|001|19874578 

LMN|002|123456


PDL PARAMETER INPUT_DML; 


$[begin
let string("")[int] data = string_split(string_split(read_file("/home/vj30954/test_ganesha1.dat"), "\n")[0], "|");
let decimal("") l = 0;
let string("") r;
let string("") e = "record\n";
for (l, l < length_of(data) - 1)
begin
    /* every field except the last is pipe-delimited */
    e = string_concat(e, 'string("|") ', data[l], ";", "\n");
end
/* the last field is newline-delimited, then the record is closed with "end" */
r = string_concat(e, 'string("\\n") ', data[length_of(data) - 1], ";", "\n");
result :: string_concat(r, "end");
end]


The above PDL will generate output like the following:

record
string("|") cust_name;
string("|") cust_id;
string("\n") acct_number;
end


Sunday, 2 March 2025

Total Count of Records using Reformat

let decimal(15) tot_rec_count = 0;

/* Reformat operation: increment the counter for every incoming record */
out :: reformat(in) =
begin
    if (next_in_sequence() > 0)
    begin
        tot_rec_count = tot_rec_count + 1;
    end
    out.count :: tot_rec_count;
end;

/* Write the final total to the log port when the component finishes */
log_event_t log :: final_log_output() =
begin
    log.event_type :: "Summary";
    log.event_text :: string_concat("Total No of Records = ", tot_rec_count);
end;

After the Reformat, send the flow to a Trash component and collect the record count from the log port.

Wednesday, 5 February 2025

Difference Between Primary Key and Foreign Key

Here’s a comparison between a Primary Key and a Foreign Key:

| Feature | Primary Key | Foreign Key |
|---|---|---|
| Definition | Uniquely identifies a record in a table | Establishes a relationship between two tables |
| Uniqueness | Must be unique for each row | Can have duplicate values |
| NULL Values | Cannot be NULL | Can have NULL values |
| Number of Keys in a Table | Only one primary key per table | A table can have multiple foreign keys |
| Relation | Defines the main identifier for a record | Refers to the primary key of another table |
| Dependency | Independent, does not depend on any other table | Always refers to a primary key in another table |
| Indexing | Automatically indexed | Not automatically indexed |
| Modification | Should not be modified once assigned | Can be updated, but should match a valid primary key |
| Example | In Employees table: Emp_ID (Primary Key) | In Salaries table: Emp_ID (Foreign Key referencing Employees.Emp_ID) |

Example in SQL

1. Primary Key Example (Employees Table)

CREATE TABLE Employees (
    Emp_ID INT PRIMARY KEY,
    Name VARCHAR(50)
);
  • Emp_ID is the Primary Key, ensuring each employee has a unique identifier.

2. Foreign Key Example (Salaries Table)

CREATE TABLE Salaries (
    Salary_ID INT PRIMARY KEY,
    Emp_ID INT,
    Amount DECIMAL(10,2),
    FOREIGN KEY (Emp_ID) REFERENCES Employees(Emp_ID)
);
  • Emp_ID in Salaries is a Foreign Key linking to Emp_ID in Employees, ensuring data consistency.
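
For example, the Emp_ID foreign key is what lets the two tables above be joined reliably (a minimal sketch using the same table and column names):

SELECT e.Emp_ID,
       e.Name,
       s.Amount
FROM Employees e
JOIN Salaries s
  ON s.Emp_ID = e.Emp_ID;   -- the FK ensures s.Emp_ID values reference existing employees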

Saturday, 25 January 2025

Tar command

 The tar command in Linux/Unix is used to create, extract, and manage archive files. It’s commonly used to combine multiple files or directories into a single file, often for backup or distribution purposes.


Key Concepts:

Tar archives are often compressed to save space (e.g., .tar.gz or .tgz files).

Think of it like “zipping” and “unzipping” files, but specifically for Unix/Linux systems.


Basic Syntax:


tar [options] [archive_name] [file_or_directory]


options: What you want to do (e.g., create, extract, list, etc.).

archive_name: The name of the archive file (e.g., backup.tar).

file_or_directory: The files or directories to include in the archive.


Common Examples:

1. Create a tar archive:


tar -cvf archive.tar file1 file2 directory/


-c: Create an archive.

-v: Verbose (show what’s happening).

-f: Specify the file name of the archive (archive.tar).


2. Extract a tar archive:


tar -xvf archive.tar


-x: Extract files from an archive.

-v: Verbose (list files while extracting).

-f: Specify the archive file name.


3. Compress a tar archive (gzip):


tar -cvzf archive.tar.gz file1 file2 directory/


-z: Compress the archive using gzip.

Creates a .tar.gz file.


4. Extract a compressed tar archive:


tar -xvzf archive.tar.gz


-z: Decompress the gzip-compressed archive.


5. List contents of a tar archive:


tar -tvf archive.tar


-t: List the files in the archive.


6. Add files to an existing tar archive:


tar -rvf archive.tar newfile


-r: Append files to an archive.


Real-World Examples:

Backup a directory:


tar -cvzf backup.tar.gz /home/user/documents


This creates a compressed backup of the documents directory.


Extract a downloaded file:


tar -xvzf software.tar.gz


Extracts the contents of a compressed .tar.gz file.


Transfer files as one package:

Combine files into a single tar file, send it over, and extract it on another machine.


Quick Reference for Options:


| Option | Description |
|---|---|
| -c | Create an archive |
| -x | Extract files from an archive |
| -v | Verbose (show progress) |
| -f | Specify archive file name |
| -z | Compress or decompress using gzip |
| -t | List contents of an archive |
| -r | Append files to an existing archive |


In short, tar is a versatile tool for packing, unpacking, and compressing files!


Friday, 24 January 2025

egrep (Extended grep) in Unix

 You use egrep (Extended grep) in Unix when you need to search for patterns using extended regular expressions (ERE). It supports additional features like +, ?, |, and {} directly without escaping.

Example Use Cases:

  1. Multiple Patterns:

    egrep 'apple|orange|banana' file.txt
    

    Matches any line containing "apple", "orange", or "banana".

  2. Repetition:

    egrep 'a{2,}' file.txt
    

    Matches lines with "a" repeated 2 or more times.

  3. Optional Characters:

    egrep 'colou?r' file.txt
    

    Matches "color" or "colour".

  4. Find Lines with a Specific Range of Characters:

    egrep '[0-9]{3,5}' file.txt
    

    Matches lines containing a number with 3 to 5 digits (e.g., 123, 12345).
-------------------------------------------------------------------------------------------------------------

Find Lines with Repeated Patterns:
egrep '(ab)+c' file.txt

Matches lines containing "ab" repeated one or more times, followed by "c" (e.g., "abc", "abababc").

-------------------------------------------------------------------------------------------------------------

Match Words with Optional Characters:

egrep 'gr(e|a)y' file.txt

Matches "grey" or "gray".

-------------------------------------------------------------------------------------------------------------

Search for Lines with a Specific Word or Another (Logical OR):

egrep 'cat|dog' file.txt
or
grep -E 'cat|dog' file.txt

Matches lines containing either "cat" or "dog".

Why Use egrep?

egrep is simply a shorthand for grep -E, so using it is more convenient for extended regular expressions. However, newer systems recommend grep -E because egrep is considered deprecated in some environments.

Use egrep when complex pattern matching is needed!

Tuesday, 21 January 2025

DROP, DELETE, and TRUNCATE

Here’s a comparison of DROP, DELETE, and TRUNCATE in SQL:


| Aspect | DROP | DELETE | TRUNCATE |
|---|---|---|---|
| Purpose | Removes an entire table or database. | Removes specific rows from a table. | Removes all rows from a table. |
| DML/DDL | DDL (Data Definition Language). | DML (Data Manipulation Language). | DDL (Data Definition Language). |
| Deletes Data Only? | Deletes both structure and data. | Deletes data, keeps table structure. | Deletes data, keeps table structure. |
| WHERE Clause | Not applicable. | Can use WHERE to filter rows. | Cannot use WHERE. |
| Performance | Faster for dropping structure. | Slower for large data (fully logged). | Faster (minimal logging). |
| Rollback Support | Cannot be rolled back. | Can be rolled back if in a transaction. | Cannot be rolled back in most databases. |
| Resets Identity? | Yes, as the table is removed. | No, retains identity sequence. | Yes, resets identity sequence. |

Examples:

  1. DROP:

    DROP TABLE employees;
    
  2. DELETE:

    DELETE FROM employees WHERE department_id = 101;
    
  3. TRUNCATE:

    TRUNCATE TABLE employees;
    
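
A minimal sketch of the rollback difference, assuming a transactional database (e.g., PostgreSQL or SQL Server) and the employees table above:

BEGIN TRANSACTION;

DELETE FROM employees WHERE department_id = 101;
-- The deleted rows are still recoverable at this point.

ROLLBACK;
-- The department 101 rows are back.

-- DROP and TRUNCATE are DDL: in many databases (e.g., Oracle, MySQL) DDL
-- commits implicitly, so the same ROLLBACK would not bring the data back.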


**SQL query performance optimization techniques**

 1. **Use Indexing**:

   - Add indexes on frequently filtered or joined columns.


2. **Avoid SELECT ***:

   - Retrieve only required columns.


3. **Use Joins Efficiently**:

   - Prefer inner joins over outer joins when possible.

   - Join on indexed columns.


4. **Optimize WHERE Clauses**:

   - Avoid functions on columns in `WHERE` (e.g., `WHERE YEAR(date) = 2023` → `WHERE date >= '2023-01-01'`).


5. **Use EXISTS Instead of IN**:

   - `EXISTS` is often faster for subqueries.


6. **Limit the Rows**:

   - Use `LIMIT` or `TOP` to fetch only needed rows.


7. **Partition Large Tables**:

   - Use table partitioning for large datasets.


8. **Avoid Correlated Subqueries**:

   - Replace them with joins or CTEs.


9. **Use Query Execution Plans**:

   - Analyze and tune queries based on execution plans.


10. **Batch Updates and Inserts**:

    - Split large operations into smaller batches.


These techniques improve query performance in real-world scenarios.
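
As a small illustration of techniques 4 and 5, here is a sketch against hypothetical customers and orders tables (table and column names are illustrative only):

-- Technique 4: keep the indexed column free of functions
-- instead of: WHERE YEAR(order_date) = 2023
SELECT order_id, customer_id
FROM orders
WHERE order_date >= '2023-01-01'
  AND order_date <  '2024-01-01';

-- Technique 5: EXISTS instead of IN for the lookup
SELECT c.customer_id, c.customer_name
FROM customers c
WHERE EXISTS (
    SELECT 1
    FROM orders o
    WHERE o.customer_id = c.customer_id
);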

