Getting random LCK_M_SCH_M on convert and bulk insert task

I started getting random LCK_M_SCH_M locks with huge wait times, which hung my ETL process.
The SSIS package runs like this:
I have 4 containers that run in parallel and do the same thing:
-Convert a tab-delimited file from Unicode to UTF-8
-Truncate the table (within a Foreach Loop)
-Bulk insert the data
Also, TransactionOption is set to NotSupported.
What could be causing the lock?
The foreach loops do not overlap regarding tables/files.
Do they contend somehow?
Elias

The TRUNCATE TABLE command takes a schema modification (Sch-M) lock, so you will have to stop running these tasks in parallel.
Arthur
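To confirm that a schema lock is the blocker, you can watch the waiting sessions while the package runs. This is a minimal diagnostic sketch using the standard DMVs, not something from the original thread:

-- List sessions currently waiting on a schema-modification lock
SELECT r.session_id,
       r.wait_type,
       r.wait_time,            -- milliseconds spent waiting so far
       r.blocking_session_id,  -- the session holding the conflicting lock
       t.text AS running_sql
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.wait_type = 'LCK_M_SCH_M';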

Similar Messages

  • How to get current month from filename and bulk insert from text file into table?

    I set up some dynamic SQL to help me bulk-copy data from a text file to a table.  This works fine for files that come in every day; I get the previous day’s data, based on the file name that’s placed
    in the folder.  That’s why I’m using the ‘-1’.  The dates will look like this: '20140131', so I'm using type 112.
    declare @fullpath1 varchar(1000)
    select @fullpath1 = '''\\system.local\ms\london\FTP\' + convert(varchar, getdate()-1, 112) + '_INDEXPRICES_EOM.SPC'''
    declare @cmd1 nvarchar(1000)
    select @cmd1 = 'bulk insert [dbo].[SB_Monthly] from ' + @fullpath1 + ' with (FIELDTERMINATOR = ''\t'', FIRSTROW = 5, LASTROW = 675, ROWTERMINATOR=''0x0a'')'
    print(@cmd1)
    exec (@cmd1)
    I think the syntax will be somewhat similar to this:
    YEAR(date_column)=YEAR(getdate()) AND MONTH(date_column)=MONTH(getdate())
    I’m not totally sure how to incorporate that into my current syntax.
    Knowledge is the only thing that I can give you, and still retain, and we are both better off for it.

    I tried a couple of versions of this.
    Declare @StartDate Date, @EndDate Date
    Select @StartDate = convert(varchar, getdate()-28, 112), @EndDate = convert(varchar, getdate()-1, 112)
    BEGIN
    declare @fullpath1 varchar(1000)
    select @fullpath1 = '''\\ms\london\FTP\' + ''' between ''' + Convert(Varchar(10), @StartDate, 101) + ''' and ''' + Convert(Varchar(10), @EndDate, 101) + '''_SP.SPC'''
    declare @cmd1 nvarchar(1000)
    select @cmd1 = 'bulk insert [dbo].[SPBMI_Monthly] from ' + @fullpath1 + ' with (FIELDTERMINATOR = ''\t'', FIRSTROW = 5, LASTROW = 675, ROWTERMINATOR=''0x0a'')'
    print(@cmd1)
    exec (@cmd1)
    END
    Here’s the string:
    bulk insert [dbo].[SPBMI_Monthly] from '\\ms\london\FTP\' between '02/03/2014' and '03/02/2014'_SP.SPC' with (FIELDTERMINATOR = '\t', FIRSTROW = 5, LASTROW = 675, ROWTERMINATOR='0x0a')
    The error message I keep getting is:
    Msg 156, Level 15, State 1, Line 1
    Incorrect syntax near the keyword 'between'.
    Msg 319, Level 15, State 1, Line 1
    Incorrect syntax near the keyword 'with'. If this statement is a common table expression, an xmlnamespaces clause or a change tracking context clause, the previous statement must be terminated with a semicolon.
    I feel like I’m already pushing this thing to the limit. 
    Maybe this last part isn’t possible.
    Knowledge is the only thing that I can give you, and still retain, and we are both better off for it.
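    For what it's worth, BULK INSERT reads exactly one file per statement, so a BETWEEN range cannot appear inside a file name; a date range has to be walked day by day, issuing one statement per daily file. A sketch, assuming the daily naming pattern from the first post:

    declare @StartDate date = dateadd(day, -28, getdate())
    declare @EndDate date = dateadd(day, -1, getdate())
    declare @d date = @StartDate
    declare @fullpath1 varchar(1000), @cmd1 nvarchar(1000)
    while @d <= @EndDate
    begin
        -- e.g. \\ms\london\FTP\20140131_SP.SPC
        select @fullpath1 = '''\\ms\london\FTP\' + convert(varchar(8), @d, 112) + '_SP.SPC'''
        select @cmd1 = 'bulk insert [dbo].[SPBMI_Monthly] from ' + @fullpath1 + ' with (FIELDTERMINATOR = ''\t'', FIRSTROW = 5, LASTROW = 675, ROWTERMINATOR=''0x0a'')'
        print(@cmd1)
        exec (@cmd1)
        select @d = dateadd(day, 1, @d)
    end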

  • Blob truncated with DbFactory and Bulk insert

    Hi,
    My platform is a Microsoft Windows Server 2003 R2 Server 5.2 Service Pack 2 (64-bit) with an Oracle Database 11g 11.1.0.6.0.
    I use the Oracle 11g ODAC 11.1.0.7.20 client.
    Some strange behavior appears when using DbFactory and a bulk command with a Blob column and a parameter larger than 65536 bytes. Let me explain.
    First I create a dummy table in my schema:
    create table dummy (a number, b blob)
    To use bulk insert, we can use code A with Oracle objects (this executes successfully):
    byte[] b1 = new byte[65530];
    byte[] b2 = new byte[65540];
    Oracle.DataAccess.Client.OracleConnection conn = new Oracle.DataAccess.Client.OracleConnection("User Id=login;Password=pws;Data Source=orcl;");
    OracleCommand cmd = new OracleCommand("insert into dummy values (:p1,:p2)", conn);
    cmd.ArrayBindCount = 2;
    OracleParameter p1 = new OracleParameter("p1", OracleDbType.Int32);
    p1.Direction = ParameterDirection.Input;
    p1.Value = new int[] { 1, 2 };
    cmd.Parameters.Add(p1);
    OracleParameter p2 = new OracleParameter("p2", OracleDbType.Blob);
    p2.Direction = ParameterDirection.Input;
    p2.Value = new byte[][] { b1, b2 };
    cmd.Parameters.Add(p2);
    conn.Open(); cmd.ExecuteNonQuery(); conn.Close();
    We can write the same thing at an abstract level using the DbProviderFactories (code B):
    var factory = DbProviderFactories.GetFactory("Oracle.DataAccess.Client");
    DbConnection conn = factory.CreateConnection();
    conn.ConnectionString = "User Id=login;Password=pws;Data Source=orcl;";
    DbCommand cmd = conn.CreateCommand();
    cmd.CommandText = "insert into dummy values (:p1,:p2)";
    ((OracleCommand)cmd).ArrayBindCount = 2;
    DbParameter param = cmd.CreateParameter();
    param.ParameterName = "p1";
    param.DbType = DbType.Int32;
    param.Value = new int[] { 3, 4 };
    cmd.Parameters.Add(param);
    DbParameter param2 = cmd.CreateParameter();
    param2.ParameterName = "p2";
    param2.DbType = DbType.Binary;
    param2.Value = new byte[][] { b1, b2 };
    cmd.Parameters.Add(param2);
    conn.Open(); cmd.ExecuteNonQuery(); conn.Close();
    But this second code doesn't work: the second byte array is truncated to 4 bytes. It looks like an Int16 overflow.
    When using DbType.Binary, Oracle maps it to OracleDbType.Raw rather than OracleDbType.Blob, so the problem seems to be with the raw type. BUT if we use the same code without bulk insert, it works! So the problem is somewhere else...
    Why use a DbConnection? To be able to switch easily to another database type.
    So why use "((OracleCommand)cmd).ArrayBindCount"? To be able to use specific functionality of each database.
    I can fix the issue by casting the DbParameter to OracleParameter and setting the OracleDbType to Blob, but why does the second code fail with bulk binding yet work with a simple query?

    BCP and BULK INSERT do not work the way you expect them to. What they do is consume fields in a round-robin fashion. That is, they first look for data for the first field, then for the second field, and so on.
    So in your case, they will first read one byte, then 20 bytes, etc., until they have read the two bytes for field 122. At this point they will consume bytes until they find a sequence of carriage return and line feed.
    You say that some records in the file are incomplete. Say that there are only 60 fields in a row. Field 61 is four bytes. BCP and BULK INSERT will now read the data for field 61 as CR+LF plus the first two bytes of the next row. CR+LF has no special meaning;
    it is just data at this point.
    You will have to write a program to parse the file, or use SSIS. BCP and BULK INSERT are not your friends in this case.
    Erland Sommarskog, SQL Server MVP, [email protected]

  • Hello! Tell me, if after arriving in another country I get a local SIM card and insert it in my iPhone, will that change the settings and the interface language?


    It should automatically adjust, but you may not be able to use it if the country you're going to uses a different network service. The two main services are CDMA and GSM. The iPhone 4S can connect to both, but iPhone 4 units come in separate CDMA and GSM models. The USA, China, Egypt and some other countries use CDMA, so if you bought your iPhone in one of those countries, it will not work in most European countries or Australia, because they use GSM.

  • How do I get a random number between 1 and an int number?

    How can I get a random number between 1 and an int number,
    like between 1 and 10, etc.?
    10x

    Use the nextFloat() method of the Random class; this returns a random number between 0.0 and 1.0. To get an integer range from that, multiply the return value of the method call by the integer range you need (in this case 10), and add the starting number you want returned in the range (1 in this case). The code below works for the numbers 1-10.
    Random r = new Random();
    int x = (int)(r.nextFloat()*10) +1; // 0.0-1.0 scaled to 0-9, then shifted to 1-10

  • Bulk Insert Task: Cannot bulk load because the file could not be opened. Operating system error code 3 (The system cannot find the path specified.)

    I am getting the following error after I changed the path in the config file from
    \\vs01\d$\\Deployment\Files\temp.txt
    to
    C:\Deployment\Files\temp.txt
    [Bulk Insert Task] Error: An error occurred with the following error message: "Cannot bulk load because the file "C:\Deployment\Files\temp.txt" could not be opened. Operating system error code 3(The system cannot find the path specified.).". 

    I think I know what's going on. The Bulk Insert task works by executing a SQL command (BULK INSERT) internally on the target SQL server to load the file. This means that the SQL Server service account on the target server must have permission on the file you are trying to load. It also means that you need to use a UNC path to specify the file location (if the target server is on a different machine).
    Also from BOL (see section Usage Considerations - last bullet point)
    http://msdn.microsoft.com/en-us/library/ms141239.aspx
    * Only members of the sysadmin fixed server role can run a package that contains a Bulk Insert task.
    Make sure you take care of this as well.
    HTH
    ~Mukti
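    Put differently, the statement that ultimately runs on the target server looks something like the sketch below, so the path must resolve from that server's point of view (the share and table names here are illustrative, not from the post):

    BULK INSERT [dbo].[Staging]
    FROM '\\vs01\Deployment\Files\temp.txt'  -- UNC path, reachable from the target server
    WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n');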

  • Bulk insert task issue

    I have a table containing 4 million records, and I want to load the data into a SQL Server table using the Bulk Insert task.
    How can I load the data using the Bulk Insert task? The Bulk Insert task supports only text sources.
    Thanks in Advance.

    If it's a SQL Server table-to-table transfer, you can use a Data Flow Task with an OLE DB Source and Destination. In the OLE DB Destination, use the
    "table or view - fast load" option as the data access mode.
    Also, if the databases are on the same server, you can even use an Execute SQL Task with a statement like
    INSERT INTO DestTable
    SELECT *
    FROM SourceDB.dbo.SourceTable
    which will be set-based.
    Visakh

  • If your database is in Full Recovery mode, can you use the Bulk Insert Task to load data?


    If your database is in Full Recovery mode, can you use the Bulk Insert Task to load data?
    Yes, of course you can, but don't assume that logging will be minimal. Logging will follow the full recovery model: everything will be fully logged. If you are going to use the Bulk Insert task, you can consider switching the recovery model to bulk-logged, but then you will not
    have the option to do a point-in-time recovery.
    PS: please don't create duplicate threads.
    If you read the first Note section in the link below, it clearly states that logging will be full and that you can use the task:
    http://technet.microsoft.com/en-us/library/ms191244(v=sql.105).aspx
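    A sketch of that trade-off (the database name is hypothetical; log backups that span the bulk operation lose point-in-time granularity):

    ALTER DATABASE MyDb SET RECOVERY BULK_LOGGED;
    -- ... run the Bulk Insert task / BULK INSERT here (minimally logged) ...
    ALTER DATABASE MyDb SET RECOVERY FULL;
    BACKUP LOG MyDb TO DISK = 'C:\Backups\MyDb_AfterLoad.trn';  -- resume the point-in-time chain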

  • Cannot fetch a row from OLE DB provider "BULK" with bulk insert task

    Hi, folks:
    I created a simple SSIS package. On the Control Flow, I created a Bulk Insert Task with a destination connection to the local SQL server and a CSV file from a local folder, specifying a comma delimiter. Then I executed the task and got this long error message.
    [Bulk Insert Task] Error: An error occurred with the following error message: "Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.The bulk load failed. The column is too long in the data file for row 1, column 1. Verify that the field terminator and row terminator are specified correctly.".

    I got the same error with some additional error details (below).  All I had to do to fix the problem was to set the Timeout property of the SQL Server Destination to 0.
    I was using the following components:
    SQL Server 2008
    SQL Server Integration Services 10.0
    Data Flow Task
    OLE DB Source – connecting to Oracle 11i
    SQL Server Destination – connecting to the local SQL Server 2008 instance
    Full Error Message:
    Error: SSIS Error Code DTS_E_OLEDBERROR.  An OLE DB error has occurred. Error code: 0x80040E14.
    An OLE DB record is available.  Source: "Microsoft SQL Server Native Client 10.0"  Hresult: 0x80040E14  Description: "Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".".
    An OLE DB record is available.  Source: "Microsoft SQL Server Native Client 10.0"  Hresult: 0x80040E14  Description: "The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.".
    An OLE DB record is available.  Source: "Microsoft SQL Server Native Client 10.0"  Hresult: 0x80040E14  Description: "The Bulk Insert operation of SQL Server Destination has timed out. Please consider increasing the value of Timeout property on the SQL Server Destination in the dataflow.".
    For SQL Server 2005 there is a hot fix available from Microsoft at http://support.microsoft.com/default.aspx/kb/937545

  • EXECUTE IMMEDIATE AND BULK INSERT

    Hi,
    I have this code:
    EXECUTE IMMEDIATE 'INSERT INTO PRC_ExcelDocumentStore_T(object_id,seg_index,segment,value,seg_length) VALUES (:object_id,:seg_index,:segment,:value,:seg_length)'
    USING SELF.object_id,p_index,p_segment,p_value,lengthc(v_value)+2;
    But due to a performance issue, I can't use it like this. Can you please suggest whether I can use bulk inserts instead of the EXECUTE IMMEDIATE statement?
    If I can, please give me an example.
    Thanks in advance

    user10619377 wrote:
    Hi,
    I have this code:
    EXECUTE IMMEDIATE 'INSERT INTO PRC_ExcelDocumentStore_T(object_id,seg_index,segment,value,seg_length) VALUES (:object_id,:seg_index,:segment,:value,:seg_length)'
    USING SELF.object_id,p_index,p_segment,p_value,lengthc(v_value)+2;
    But due to a performance issue, I can't use it like this. Can you please suggest whether I can use bulk inserts instead of the EXECUTE IMMEDIATE statement?
    If I can, please give me an example.
    How can we give an example when we don't know what you have now or what exactly needs to be done?

  • Bug in Bulk Insert?

    So, I'm working with a client today and we discovered some missing records in his file.  I looked into it a bit; the first 5 rows were being truncated.  I thought that was bizarre, because the data starts on row 7.  As such, I set up my Bulk
    Insert like this.
    declare @fullpath1 varchar(1000)
    select @fullpath1 = '''\\london-sql\FTP\' + convert(varchar, getdate()- @intFlag , 112) + '_SPGT.SPL'''
    declare @cmd1 nvarchar(1000)
    select @cmd1 = 'bulk insert [dbo].[SPGT_Daily] from ' + @fullpath1 + ' with (FIELDTERMINATOR = ''\t'', FIRSTROW = 7, ROWTERMINATOR=''0x0a'')'
    exec (@cmd1)
    So, I open this file, which comes from a Unix system, and I get this.
    So, this stock data actually starts on ROW2, not ROW5.  We figured it out pretty quickly, and we're all set now.  I'm just not sure why Excel, WordPad, Notepad, etc., would all show the data starting on ROW7 (field names are in ROW6), while Bulk Insert
    thinks the data is in ROW2 (field names are in ROW1).
    This seems very strange.
    Knowledge is the only thing that I can give you, and still retain, and we are both better off for it.

    BULK INSERT is a tool that reads binary files. In contrast to Notepad etc., it is not predisposed towards "lines", but is completely agnostic on the matter. You tell BULK INSERT to insert data into a six-column table with tab as field delimiter and
    \n as row terminator. BULK INSERT starts reading bytes until it finds a tab. First field, check. It continues reading bytes until the next tab. Check. And when it comes to the sixth field, it reads bytes until it sees a newline. Check. If it happens to see
    a newline while looking for a tab, that is just another byte of the data.
    If you apply this way of thinking, you will find that BULK INSERT considered the first six lines to be a single record. (Which it unfortunately calls a row.)
    Erland Sommarskog, SQL Server MVP, [email protected]
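    A concrete way to see this (a hypothetical two-column table and file, not from the thread):

    -- File bytes a<TAB>b<LF>c<TAB>d<LF> load as two rows.
    -- If one line is missing its tab (a<LF>c<TAB>d<LF>), BULK INSERT reads
    -- "a<LF>c" as column 1 and "d" as column 2: two lines collapse into one row.
    CREATE TABLE dbo.TwoCols (c1 varchar(50), c2 varchar(50));
    BULK INSERT dbo.TwoCols
    FROM 'C:\temp\twocols.txt'
    WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n');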

  • How can I debug a Bulk Insert error?

    I'm loading a bunch of files into SQL Server.  All work fine, but one keeps erroring out on me.  All the files should be exactly the same in structure; they have different dates and different financial metrics, but the structure and field
    names should be identical.  Nevertheless, one keeps conking out and throwing this error.
    Msg 4832, Level 16, State 1, Line 1
    Bulk load: An unexpected end of file was encountered in the data file.
    Msg 7399, Level 16, State 1, Line 1
    The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
    Msg 7330, Level 16, State 2, Line 1
    Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
    The ROWTERMINATOR should be CRLF, and when you look at it in Notepad++ that's what it looks like, but it must be something else, because I keep getting errors here.  I tried the good old:  ROWTERMINATOR='0x0a'
    That works on all files but one, so there's something funky going on here, and I need to see what SQL Server is really doing.
    Is there some way to print out a log, or look at a log somewhere?
    Thanks!!
    Knowledge is the only thing that I can give you, and still retain, and we are both better off for it.

    The first thing to try is to see if BCP likes the file. BCP and BULK INSERT adhere to the same spec, but they are different implementations, and there are subtle differences.
    There is an ERRORFILE option, but it helps more when there is bad data.
    You can also use the BATCHSIZE option to see how many records from the file it swallows before things go bad. FIRSTROW and LASTROW can also help.
    All in all, it can be quite tedious to find that single row where things are different - and where BULK INSERT loses sync entirely. Keep in mind that it reads fields one by one, and if there is one field terminator too few on a line, it will consume the line
    feed at the end of the line as data.
    Erland Sommarskog, SQL Server MVP, [email protected]
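    Those options combine into something like the sketch below (the table name, path, and row numbers are hypothetical; narrow FIRSTROW/LASTROW to bisect your way to the bad row):

    BULK INSERT dbo.Prices
    FROM 'C:\data\problem_file.txt'
    WITH (FIELDTERMINATOR = '\t',
          ROWTERMINATOR = '0x0a',
          ERRORFILE = 'C:\data\problem_file.err',  -- rejected rows land here for inspection
          BATCHSIZE = 1000,  -- committed batches show how far the load got
          FIRSTROW = 1,
          LASTROW = 5000);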

  • Error while running bulk insert in SSIS package

    Hi:
    I get an error when I run a bulk insert in an SSIS package.
    I have implemented an SSIS package to update master data directly from R/3. R/3 produces the file in a specified format; I take it, insert all the records into a temporary table, then update the mbr table and process the dimension.
    This works perfectly well in our development system, where our app server and SQL server are on the same box. But in QAS the two servers are separate, and when I try to run the SSIS package I get the error below.
    We have tested all the connections and are able to access the path and file from both the app server and the SQL server using the shared folder. Our Basis team says that it is a problem with the Bulk Insert task and has nothing to do with authorization.
    Has anyone experienced this sort of problem in a multi-server environment? Is there another way to load all the data from a file into a bespoke table without using bulk insert?
    Thanks,
    Subramania
    Error----
    SSIS package "Package.dtsx" starting.
    Error: 0xC002F304 at Insert Data Into Staging Table (Account), Bulk Insert Task: An error occurred with the following error message: "Cannot bulk load because the file "\\msapbpcapq01\dim\entity.csv" could not be opened. Operating system error code 5(Access is denied.).".
    Task failed: Insert Data Into Staging Table (Account)
    SSIS package "Package.dtsx" finished: Success.
    The program '[2496] Package.dtsx: DTS' has exited with code 0 (0x0).

    Hi Subramania
    From your error:
    Error: 0xC002F304 at Insert Data Into Staging Table (Account), Bulk Insert Task: An error occurred with the following error message: "Cannot bulk load because the file "\\msapbpcapq01\dim\entity.csv" could not be opened. Operating system error code 5(Access is denied.).".
    Let's say server A is where the file entity.csv is located.
    Please check Event Viewer -> Security on server A for the time when the SSIS package ran; there should be an entry with a logon failure showing which user was used to access the shared path.
    If your servers are not in a domain, create a user on server A with the same name and password and grant it read access to the shared folder.
    The other workaround is to grant read access to Everyone on the shared folder.
    Halomoan
    Edited by: Halomoan Zhou on Oct 6, 2008 4:23 AM

  • SQL Server 2012 Express bulk insert flat file, 1 million rows, with "" as delimiter

    Hi,
    I wanted to see if anyone can help me out. I am on SQL Server 2012 Express. I cannot use OPENROWSET because my system is x64 and my Microsoft Office suite is x86 (Microsoft.Jet.OLEDB.4.0).
    So I used the Import Wizard, and it is not working either.
    The only thing that lets me import this large file is:
    CREATE TABLE #LOADLARGEFLATFILE
    (
        Column1 varchar(100),
        Column2 varchar(100),
        Column3 varchar(100),
        Column4 nvarchar(max)
    )

    BULK INSERT #LOADLARGEFLATFILE
    FROM 'C:\FolderBigFile\LARGEFLATFILE.txt'
    WITH
    (
        FIRSTROW = 2,
        FIELDTERMINATOR = '\t',
        ROWTERMINATOR = '\n'
    )
    The problem with CREATE TABLE and BULK INSERT is that my flat file comes with text qualifiers - "". Is there a way to prevent the quotes "" from loading in the bulk insert? Below is the data. 
    Column1             Column2   Column3   Column4
    "Socket Adapter"    8456AB    $4.25     "Item - Square Drive Socket Adapter | For "
    "Butt Splice"       9586CB    $14.51    "Item - Butt Splice"
    "Bleach"            6589TE    $27.30    "Item - Bleach | Size - 96 oz. | Container Type"
    Edwin Lopera

    Hi lgnusLumen,
    According to your description, you use BULK INSERT to import data from a data file into a SQL table. However, to be usable as a data file for bulk import, a CSV file must comply with the following restrictions:
    1. Data fields never contain the field terminator.
    2. Either none or all of the values in a data field are enclosed in quotation marks ("").
    In your data file the quotes aren't consistent. If you want to prevent the quotes from being loaded by the bulk insert, I recommend you use the SQL Server Import and Export Wizard, which is available in the Express edition; it allows you to strip the
    double quotes from columns.
    In other SQL Server versions, we can use SQL Server Integration Services (SSIS) to import data from a flat file (.csv) while removing the double quotes. For more information, you can review the following article:
    http://www.mssqltips.com/sqlservertip/1316/strip-double-quotes-from-an-import-file-in-integration-services-ssis/
    In addition, you can create a function to convert the CSV to a usable format for BULK INSERT. It will replace all field-delimiting commas with a new delimiter, which you can then use as the field terminator instead of a comma. For more information, see:
    http://stackoverflow.com/questions/782353/sql-server-bulk-insert-of-csv-file-with-inconsistent-quotes
    Regards,
    Sofiya Li
    TechNet Community Support
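    As a side note: SQL Server 2017 and later (not the 2012 Express in this thread) let BULK INSERT strip quoted fields natively via FIELDQUOTE; a sketch reusing the table and path from the post:

    BULK INSERT #LOADLARGEFLATFILE
    FROM 'C:\FolderBigFile\LARGEFLATFILE.txt'
    WITH (FORMAT = 'CSV',           -- RFC 4180-style parsing, SQL Server 2017+
          FIELDQUOTE = '"',         -- removes the "" text qualifiers
          FIELDTERMINATOR = '\t',
          FIRSTROW = 2);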

  • SSIS BULK INSERT using UNC inside of ForEach Loop Container failed: could not be opened. Operating system error code 5(Access is denied.)

    Hi,
    I am trying to figure out how to fix my problem
    Error: Could not be opened. Operating system error code 5(Access is denied.)
    Process description:
    The target database server resides on a different server in the network.
    The SSIS package runs from a remote server.
    The SSIS package uses a Foreach Loop Container to loop through a directory to do the bulk insert.
    The SSIS package uses variables to specify the share location of the files using UNC, like this:
    \\server\files
    The database service account under which the database is running has full permission on the share where the files reside.
    The Execution Results tab shows the prepared SQL statement for the BULK INSERT, and I can run the exact same bulk insert in SSMS without errors, both from the database server and from the server where the SSIS package is executed.
    I am at a dead end, and I don't want to rewrite the SSIS package to use a Data Flow Task, because that is not flexible to update when the table's metadata changes.
    The post below describes almost the same situation:
    https://social.msdn.microsoft.com/Forums/sqlserver/en-US/8de13e74-709a-43a5-8be2-034b764ca44f/problem-with-bulk-insert-task-in-foreach-loop?forum=sqlintegrationservices

    Interesting how I fixed the issue: adding the Application Name to the SQL OLAP connection string fixed it. I am not sure why SQL Server wasn't able to open the file remotely without this.
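    For reference, Application Name rides along as an ordinary connection-string keyword (a hypothetical example; the server, database, and name are placeholders):

    Data Source=MyDbServer;Initial Catalog=MyDb;Integrated Security=SSPI;Application Name=MyEtlPackage;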
