Monday, October 23, 2023

Oracle: Restore a Single Table to Another Name

 Note: remap_table alone only renames the table; it does not move it to another schema.



impdp dav4635/Wolver1n323$    tables=PPM.EP_PPM_PROJECT_ALLOCATIONS remap_table=PPM.EP_PPM_PROJECT_ALLOCATIONS:MYTABLE1 dumpfile=16restore.dmp logfile=dx.log
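
If the copy also needs to land in a different schema, Data Pump's remap_schema parameter can be combined with remap_table. A sketch under that assumption (the SCRATCH target schema is a placeholder; this exact combination was not tested here):

impdp dav4635 tables=PPM.EP_PPM_PROJECT_ALLOCATIONS remap_schema=PPM:SCRATCH remap_table=PPM.EP_PPM_PROJECT_ALLOCATIONS:MYTABLE1 dumpfile=16restore.dmp logfile=dx.log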

Tuesday, September 26, 2023

Troubleshoot SQL Server Email

EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'CVTMSsQLT02 Mailer'
   ,@recipients = 'dexter.velasco@aei.com'
   ,@subject = 'Email from SQL Server'
   ,@body = 'This is my First Email sent from SQL Server :)'
   ,@importance = 'HIGH';
GO


-- Queued but not yet sent
SELECT * FROM msdb.dbo.sysmail_unsentitems;

-- Successfully sent
SELECT * FROM msdb.dbo.sysmail_sentitems;

-- Failed to send
SELECT * FROM msdb.dbo.sysmail_faileditems;

-- Review the configured Database Mail profiles
EXEC msdb.dbo.sysmail_help_profile_sp;
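
If nothing is being delivered at all, it also helps to confirm the Database Mail queue is actually running; both of these are standard msdb procedures:

-- Shows whether the mail queue is STARTED or STOPPED
EXEC msdb.dbo.sysmail_help_status_sp;

-- Starts the queue if it is stopped
EXEC msdb.dbo.sysmail_start_sp;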


-- Match failed items with their error descriptions
SELECT err.[description],
       fail.*
FROM msdb.dbo.sysmail_event_log err
INNER JOIN msdb.dbo.sysmail_faileditems fail
    ON err.mailitem_id = fail.mailitem_id;



List Tables with a Column Name Like

SELECT COLUMN_NAME AS ColumnName,
       TABLE_NAME  AS TableName
FROM   INFORMATION_SCHEMA.COLUMNS
WHERE  COLUMN_NAME LIKE '%whatever%'
ORDER BY TableName, ColumnName;
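
The same INFORMATION_SCHEMA pattern works for finding tables by name ('%whatever%' is a placeholder):

SELECT TABLE_SCHEMA,
       TABLE_NAME
FROM   INFORMATION_SCHEMA.TABLES
WHERE  TABLE_NAME LIKE '%whatever%'
  AND  TABLE_TYPE = 'BASE TABLE'
ORDER BY TABLE_SCHEMA, TABLE_NAME;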

Thursday, August 24, 2023

DOCKER TUTORIAL: Build an image from an existing application


 This is a Docker tutorial for beginners and is part of a series.


In this module, we will tackle the following:

1. Download a sample application from GitHub.

2. Create a Docker image of the sample application.

3. Run the application on your laptop.


You will need a GitHub account and Docker Desktop installed on your machine.


Steps

1. Go to https://github.com/nigelpoulton/gsd

2. Open either a command prompt or a PowerShell window on your machine.

3. Create a directory D:\dxdockerlabs and clone the source code from GitHub into this directory:

4. git clone https://github.com/nigelpoulton/gsd.git



D:\>mkdir dxdockerlabs

D:\>cd dxdockerlabs

D:\dxdockerlabs>dir
 Volume in drive D is DATA
 Volume Serial Number is 94A8-535C

 Directory of D:\dxdockerlabs

08/25/2023  10:32 AM    <DIR>          .
08/25/2023  10:32 AM    <DIR>          ..
               0 File(s)              0 bytes
               2 Dir(s)  53,273,477,120 bytes free

D:\dxdockerlabs>git clone https://github.com/nigelpoulton/gsd.git
Cloning into 'gsd'...
remote: Enumerating objects: 125, done.
remote: Counting objects: 100% (125/125), done.
remote: Compressing objects: 100% (89/89), done.
remote: Total 125 (delta 47), reused 88 (delta 33), pack-reused 0
Receiving objects: 100% (125/125), 310.31 KiB | 2.67 MiB/s, done.
Resolving deltas: 100% (47/47), done.

D:\dxdockerlabs>dir
 Volume in drive D is DATA
 Volume Serial Number is 94A8-535C

 Directory of D:\dxdockerlabs

08/25/2023  10:32 AM    <DIR>          .
08/25/2023  10:32 AM    <DIR>          ..
08/25/2023  10:32 AM    <DIR>          gsd
               0 File(s)              0 bytes
               3 Dir(s)  53,272,608,768 bytes free

D:\dxdockerlabs>

5. The gsd folder you just cloned contains the source code of the application. It could be any application's source code; the only difference is that this folder includes a Dockerfile that tells Docker what to do. Writing the Dockerfile itself will not be discussed in this series, since it is aimed at beginners.

6. Create the Docker image for this application. Go to the directory where the Dockerfile is located and run:

docker image build -t dextervelascoaei/gsd:first-ctr .

where:
Docker Hub account: dextervelascoaei
repository: gsd
image tag: first-ctr
(the trailing dot tells Docker to use the current directory as the build context)

D:\dxdockerlabs>cd gsd

D:\dxdockerlabs\gsd>dir
 Volume in drive D is DATA
 Volume Serial Number is 94A8-535C

 Directory of D:\dxdockerlabs\gsd

08/25/2023  10:32 AM    <DIR>          .
08/25/2023  10:32 AM    <DIR>          ..
08/25/2023  10:32 AM    <DIR>          compose
08/25/2023  10:32 AM    <DIR>          container
08/25/2023  10:32 AM    <DIR>          first-container
08/25/2023  10:32 AM    <DIR>          multi-container
08/25/2023  10:32 AM             1,077 README.md
08/25/2023  10:32 AM    <DIR>          swarm
08/25/2023  10:32 AM    <DIR>          swarm-stack
               1 File(s)          1,077 bytes
               8 Dir(s)  53,272,608,768 bytes free

D:\dxdockerlabs\gsd>cd first-container

D:\dxdockerlabs\gsd\first-container>dir
 Volume in drive D is DATA
 Volume Serial Number is 94A8-535C

 Directory of D:\dxdockerlabs\gsd\first-container

08/25/2023  10:32 AM    <DIR>          .
08/25/2023  10:32 AM    <DIR>          ..
08/25/2023  10:32 AM               678 app.js
08/25/2023  10:32 AM               675 Dockerfile
08/25/2023  10:32 AM               361 package.json
08/25/2023  10:32 AM                62 readme.md
08/25/2023  10:32 AM    <DIR>          static
08/25/2023  10:32 AM    <DIR>          views
               4 File(s)          1,776 bytes
               4 Dir(s)  53,272,608,768 bytes free

[+] Building 3.4s (11/11) FINISHED
 => [internal] load .dockerignore                                                                               0.0s
 => => transferring context: 2B                                                                                 0.0s
 => [internal] load build definition from Dockerfile                                                            0.0s
 => => transferring dockerfile: 714B                                                                            0.0s
 => [internal] load metadata for docker.io/library/node:current-alpine                                          3.2s
 => [auth] library/node:pull token for registry-1.docker.io                                                     0.0s
 => [1/5] FROM docker.io/library/node:current-alpine@sha256:f62abc08fe1004555c4f28b6793af8345a76230b21d2d24997  0.0s
 => [internal] load build context                                                                               0.0s
 => => transferring context: 106.31kB                                                                           0.0s
 => CACHED [2/5] RUN mkdir -p /usr/src/app                                                                      0.0s
 => CACHED [3/5] COPY . /usr/src/app                                                                            0.0s
 => CACHED [4/5] WORKDIR /usr/src/app                                                                           0.0s
 => CACHED [5/5] RUN npm install                                                                                0.0s
 => exporting to image                                                                                          0.0s
 => => exporting layers                                                                                         0.0s
 => => writing image sha256:602dc8e5704e61c5c69716a600377a3c5e5ad1650b93da1bb7eb5d7151443d46                    0.0s
 => => naming to docker.io/dextervelascoaei/gsd:first-ctr                                                       0.0s

D:\dxdockerlabs\gsd\first-container>
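
Once the build finishes, the new image should be visible locally (standard Docker CLI):

docker image ls dextervelascoaei/gsd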

7. Now we can run the container:

docker container run -d --name ironman -p 8000:8080 dexteravelascoaei/gsd:first-ctr

(Note the image name here has an extra 'a' compared to the tag built above, so Docker cannot find it locally and pulls it from Docker Hub instead, as the output below shows.)

D:\dxdockerlabs\gsd\first-container>docker container run -d --name ironman -p 8000:8080 dexteravelascoaei/gsd:first-ctr
Unable to find image 'dexteravelascoaei/gsd:first-ctr' locally
first-ctr: Pulling from dexteravelascoaei/gsd
8921db27df28: Pull complete
361849a0fe60: Pull complete
4769d5fe84bb: Pull complete
1a33b52da743: Pull complete
5a34b913c9b2: Pull complete
1f6ea1c63c6e: Pull complete
142e7c5f35df: Pull complete
Digest: sha256:e6fa6e4beffb094719b24bfd48f38059f62d6d65f699504955a3f618d1159384
Status: Downloaded newer image for dexteravelascoaei/gsd:first-ctr
352c22010c8402dee23e2cf76af7f2978dce170081c8f1917508b3afaa3f5bad

8. Go to http://localhost:8000 in your browser to see the running application.
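
You can also sanity-check the container from the command line with standard Docker CLI commands:

docker container ls --filter name=ironman
docker container logs ironman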



9. Finished

Docker for Beginners


 This tutorial is Docker for beginners.


Outline

  1. Environment set up
    1. Install Docker Desktop
    2. Install Git
  2. Build an image from an existing application
  3. Push an image to a repository
  4. Start a container with the image
  5. Stop the container
  6. Delete the image


Monday, June 19, 2023

ERROR: failed to solve: process "/bin/sh -c dotnet restore \"./AzureNamingTool.csproj\"" did not complete successfully: exit code: 1

 C:\Users\Dexter.Velasco\Downloads\CloudAdoptionFramework-master\CloudAdoptionFramework-master\ready\AzNamingTool>dotnet restore --interactive

Could not execute because the application was not found or a compatible .NET SDK is not installed.

Possible reasons for this include:

  * You intended to execute a .NET program:

      The application 'restore' does not exist.

  * You intended to execute a .NET SDK command:

      It was not possible to find any installed .NET SDKs.

      Install a .NET SDK from:

        https://aka.ms/dotnet-download


Fix: install .NET SDK version 7 from the link above.
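
After installing, the SDK should show up and the restore can be re-run (standard dotnet CLI; the project file name is taken from the error above):

dotnet --list-sdks
dotnet restore "./AzureNamingTool.csproj"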



Monday, May 29, 2023

When VPN cannot connect to the internet

 Open Control Panel\Network and Internet\Network Connections, go to the adapter's Properties, and uncheck Internet Protocol Version 6 (TCP/IPv6).

Thursday, March 30, 2023

Oracle import table only

 HKGDBAPPT01|epwebd:/dbapps/oracle/admin/epwebd/dpdump>impdp dav4635 tables=COMPASS.EC_CAR_AR_HISTORY directory=DATA_PUMP_DIR dumpfile=expdpWEBP-2903202317.dmp logfile=impdx.log table_exists_action=truncate



HKGDBAPPT01|epwebd:/dbapps/oracle/admin/epwebd/dpdump>impdp dav4635 tables=COMPASS.EC_CAR_AR_HISTORY,COMPASS.EC_CAR_ATTACHMENT,COMPASS.EC_CAR_PERSON,COMPASS.EC_CAR_PERSON_ROLES directory=DATA_PUMP_DIR dumpfile=expdpWEBP-2903202317.dmp logfile=impdx.log table_exists_action=truncate


Import: Release 18.0.0.0.0 - Production on Thu Mar 30 02:02:39 2023

Version 18.9.0.0.0


Copyright (c) 1982, 2018, Oracle and/or its affiliates.  All rights reserved.

Password:


Connected to: Oracle Database 18c Standard Edition 2 Release 18.0.0.0.0 - Production

Master table "DAV4635"."SYS_IMPORT_TABLE_01" successfully loaded/unloaded

Starting "DAV4635"."SYS_IMPORT_TABLE_01":  dav4635/******** tables=COMPASS.EC_CAR_AR_HISTORY,COMPASS.EC_CAR_ATTACHMENT,COMPASS.EC_CAR_PERSON,COMPASS.EC_CAR_PERSON_ROLES directory=DATA_PUMP_DIR dumpfile=expdpWEBP-2903202317.dmp logfile=impdx.log table_exists_action=truncate

Processing object type DATABASE_EXPORT/SCHEMA/TABLE/TABLE

Table "COMPASS"."EC_CAR_AR_HISTORY" exists and has been truncated. Data will be loaded but all dependent metadata will be skipped due to table_exists_action of truncate

Table "COMPASS"."EC_CAR_PERSON" exists and has been truncated. Data will be loaded but all dependent metadata will be skipped due to table_exists_action of truncate

Table "COMPASS"."EC_CAR_ATTACHMENT" exists and has been truncated. Data will be loaded but all dependent metadata will be skipped due to table_exists_action of truncate

Table "COMPASS"."EC_CAR_PERSON_ROLES" exists and has been truncated. Data will be loaded but all dependent metadata will be skipped due to table_exists_action of truncate

Processing object type DATABASE_EXPORT/SCHEMA/TABLE/TABLE_DATA

. . imported "COMPASS"."EC_CAR_AR_HISTORY"               10.12 MB   56971 rows

. . imported "COMPASS"."EC_CAR_PERSON_ROLES"             5.554 KB       4 rows

. . imported "COMPASS"."EC_CAR_ATTACHMENT"               1.063 MB    5657 rows

. . imported "COMPASS"."EC_CAR_PERSON"                   9.679 KB      19 rows

Processing object type DATABASE_EXPORT/SCHEMA/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT

Processing object type DATABASE_EXPORT/SCHEMA/TABLE/INDEX/INDEX

Processing object type DATABASE_EXPORT/SCHEMA/TABLE/CONSTRAINT/CONSTRAINT

Processing object type DATABASE_EXPORT/SCHEMA/TABLE/INDEX/STATISTICS/INDEX_STATISTICS

Processing object type DATABASE_EXPORT/SCHEMA/TABLE/STATISTICS/TABLE_STATISTICS

Processing object type DATABASE_EXPORT/STATISTICS/MARKER

Job "DAV4635"."SYS_IMPORT_TABLE_01" successfully completed at Thu Mar 30 02:02:54 2023 elapsed 0 00:00:07


HKGDBAPPT01|epwebd:/dbapps/oracle/admin/epwebd/dpdump>
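
If you are unsure what a dump file contains before importing, impdp can write the DDL it would run to a script instead of executing it (sqlfile is a standard Data Pump parameter; the output file name here is a placeholder):

impdp dav4635 directory=DATA_PUMP_DIR dumpfile=expdpWEBP-2903202317.dmp sqlfile=preview_ddl.sql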


Tuesday, March 7, 2023

SQL Server Performance Tuning

 Hello Arvin,

 

Good question.

 

Firstly, for access to the system we can talk with AE to see if they have spare licenses for you.

 

Secondly, a good database design will always have primary keys on all tables to protect data integrity, and the indexes that back them inherently perform well.

This makes primary and foreign keys perform really well when your queries have joins.

 

Thirdly, looking at HKDOAMSP you will see the query below, which has the most waits. On this instance it was 4473599451.

 



 

 

From here we now know it is caused by HKGDOAMSP.dbo.fn_GetEmployeewithNickname, and the major wait is MEMORY/CPU, taking 26 minutes of the 1-hour span.

 

Investigate the query used in the function. Check for:

  1. Primary Keys
  2. Foreign Keys
  3. Indexes
  4. Cartesian Products

 

Quick note: primary and foreign keys are designed for business reasons. They should not be created purely for performance gains.

 

 

Tuning Proper

 

Query 1:

select @result = trim(first_name) + ' ' + trim(last_name)
       + case when NickName is null then '' else ' (' + NickName + ')' end
from doams_SAPFeed
where SAPID = @empid;

 

Upon investigation I saw that doams_SAPFeed has no PK, so I asked Lex what should be unique in this table (hence the business-reason part).

 

I have experienced that sometimes ETL loads from the source are wrong, or the data was not properly transformed prior to loading; in this case I don't know which.

This is where PKs, FKs, constraints, and triggers come into play during database architecture. The database architecture is the foundation for the business process, followed by the application(s).

 

Moving on.

 

Take a baseline of the query in SSMS: get the timing, the IO, and the execution plan.
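
In SSMS those numbers come from the standard session options below (Include Actual Execution Plan is toggled separately); the query shown is a placeholder for the one under test:

SET STATISTICS IO ON;
SET STATISTICS TIME ON;

select * from doams_SAPFeed where SAPID = '36859';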

 

(1 row affected)

Table 'doams_SAPFeed'. Scan count 1, logical reads 2683, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.

 

(1 row affected)

 

 SQL Server Execution Times:

   CPU time = 0 ms,  elapsed time = 17 ms.

 

 



 

This tells us it is reading 2,683 pages from memory (which is what we want; what we don't want is too many physical reads from disk).

We also see a table scan, but since the table is small it should not be that big of a deal.

 

Let’s see if we can improve it.

 

Next, form a hypothesis on how to reduce the overhead of the query.

 

Hypothesis: adding a PRIMARY KEY on SAPID to doams_SAPFeed.

 

In the HKDOAMSP environment we have to clean the data first, since SAPID values '33905', '34006', and '34038' are duplicates.
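
A quick way to spot those duplicates (a minimal sketch against the same table):

SELECT SAPID, COUNT(*) AS dup_count
FROM doams_SAPFeed
GROUP BY SAPID
HAVING COUNT(*) > 1;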

 

So I created my own test tables: dxfeed, with no PK, and dxfeed2, which has a PK.
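
The exact DDL was not captured in this note; a minimal sketch of the difference between the two test tables (the constraint name is illustrative, and SAPID must be unique and NOT NULL for the constraint to succeed):

-- dxfeed: same data, no primary key (heap)
-- dxfeed2: same data plus a primary key on SAPID
ALTER TABLE dxfeed2 ADD CONSTRAINT PK_dxfeed2 PRIMARY KEY (SAPID);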

 

Test the hypothesis

 

Now run them side by side in SSMS:

 

select * from dxfeed where SAPID='36859';

select * from dxfeed2 where SAPID='36859' ;

 

Results

Table 'dxfeed'. Scan count 1, logical reads 2161, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.

 

(1 row affected)

 

 SQL Server Execution Times:

   CPU time = 0 ms,  elapsed time = 12 ms.

SQL Server parse and compile time:

   CPU time = 0 ms, elapsed time = 0 ms.

 

(1 row affected)

Table 'dxfeed2'. Scan count 0, logical reads 3, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.

 

(1 row affected)

 

 SQL Server Execution Times:

   CPU time = 0 ms,  elapsed time = 0 ms.

SQL Server parse and compile time:

   CPU time = 0 ms, elapsed time = 0 ms.

 

Execution Plan

 



 

This shows that Query 2, which has the PK, has nearly zero cost compared to Query 1.

 

Conclusion:

 

Adding a primary key on SAPID in doams_SAPFeed is more efficient because:

  1. Query 2 does 2,158 fewer logical reads than Query 1.
  2. Query 2 used an index seek as opposed to a table scan.

 

Recommendation:

 

  1. Test it in a dev environment and check your application for unforeseen effects.
  2. Validate with the business owners that SAPID should be unique in doams_SAPFeed.
  3. Deploy to all applicable DOAMS environments.

 

This function is heavily used; sample execution counts:

5:00 PM-6:00 PM (7,798 execs)
6:00 PM-7:00 PM (87,515 execs)
8:00 PM-9:00 PM (92,800 execs)

Thursday, February 9, 2023

Ping with timestamp

 Run this in PowerShell:


Ping.exe -t spiceworks.com | ForEach {"{0} - {1}" -f (Get-Date),$_}

Tuesday, January 24, 2023

List Resumable Index SQL SERVER

-- Rebuild an index as a resumable online operation
ALTER INDEX IX_ProdDate_v6 ON T_TDHEADER REBUILD
WITH (ONLINE = ON, RESUMABLE = ON, MAXDOP = 8);

ALTER INDEX IX_ProdDate_v6 ON T_TDHEADER REBUILD
WITH (ONLINE = ON, RESUMABLE = ON);


-- List current resumable index operations and estimate time remaining
SELECT name AS index_name,
       percent_complete,
       state_desc,
       start_time,
       last_pause_time,
       total_execution_time AS ExecutionMin,
       -- execution time multiplied by the ratio of
       -- percent remaining to percent completed
       total_execution_time * (100.0 - percent_complete) / percent_complete
           AS ApproxExecutionMinLeft
FROM sys.index_resumable_operations;


-- Pause, resume, or abandon the resumable rebuild
ALTER INDEX IX_ProdDate_v6 ON T_TDHEADER PAUSE;

ALTER INDEX IX_ProdDate_v6 ON T_TDHEADER RESUME;

ALTER INDEX IX_ProdDate_v6 ON T_TDHEADER ABORT;


List Partition SQL SERVER

-- List Partition by DAV
SELECT DISTINCT
    f.name AS file_group_name,
    SCHEMA_NAME(t.schema_id) AS table_schema,
    t.name AS table_name,
    p.partition_number,
    ISNULL(CAST(left_prv.value AS VARCHAR(MAX))
           + CASE WHEN pf.boundary_value_on_right = 0 THEN ' < ' ELSE ' <= ' END,
           '-INF < ')
    + 'X'
    + ISNULL(CASE WHEN pf.boundary_value_on_right = 0 THEN ' <= ' ELSE ' < ' END
             + CAST(right_prv.value AS NVARCHAR(MAX)),
             ' < INF') AS range_desc,
    p.rows AS NumberOfRows,
    pf.boundary_value_on_right,
    ps.name AS partition_scheme_name,
    pf.name AS partition_function_name,
    left_prv.value AS left_boundary,
    right_prv.value AS right_boundary
FROM sys.partitions p
JOIN sys.tables t ON p.object_id = t.object_id
JOIN sys.indexes i ON p.object_id = i.object_id AND p.index_id = i.index_id
JOIN sys.allocation_units au ON p.hobt_id = au.container_id
JOIN sys.filegroups f ON au.data_space_id = f.data_space_id
LEFT JOIN sys.partition_schemes ps ON ps.data_space_id = i.data_space_id
LEFT JOIN sys.partition_functions pf ON ps.function_id = pf.function_id
LEFT JOIN sys.partition_range_values left_prv
    ON left_prv.function_id = ps.function_id
    AND left_prv.boundary_id + 1 = p.partition_number
LEFT JOIN sys.partition_range_values right_prv
    ON right_prv.function_id = ps.function_id
    AND right_prv.boundary_id = p.partition_number
WHERE t.name IN ('Partitioned_TShippment', 'STAGE_TShippment');

List Tables on SQL Server

SELECT name,
       crdate
FROM   SYSOBJECTS
WHERE  xtype = 'U'
  AND  name LIKE '%Shi%';
GO


-- Review ALTER INDEX history from the maintenance CommandLog table
select * from CommandLog where command like 'ALTER INDEX%';