Threading in .NET (1)


Whether you are developing for a computer with one processor or several, you want your application to provide the most responsive interaction with the user, even while it is doing other work. Using multiple threads of execution is one of the most powerful ways to keep an application responsive to the user while making use of the processor in between, and even during, user events.

Threads and threading

The operating system uses processes to separate the different applications it is executing. A thread is the basic unit to which the operating system allocates processor time, and more than one thread can be executing code inside a process. Each thread maintains exception handlers, a scheduling priority, and a set of structures the system uses to save the thread context until the thread is scheduled. The thread context includes all the information the thread needs to seamlessly resume execution within the address space of its host process, including the thread's set of CPU registers and its stack.

The .NET Framework further divides an operating system process into lightweight managed subprocesses, called application domains and represented by System.AppDomain. One or more managed threads (represented by System.Threading.Thread) can run in one or any number of application domains within the same managed process. Although each application domain starts with a single thread, code in that application domain can create additional application domains and additional threads. The result is that a managed thread can move freely among the application domains of the same managed process; you might have only one thread moving among several application domains.

An operating system that supports preemptive multitasking creates the effect of simultaneous execution of multiple threads from multiple processes. It does this by dividing the available processor time among the threads that need it, allocating a time slice to each thread in turn. The currently executing thread is suspended when its time slice elapses, and another thread resumes running. When the system switches from one thread to another, it saves the context of the preempted thread and reloads the saved context of the next thread in the thread queue.

The length of a time slice depends on the operating system and the processor. Because each time slice is small, multiple threads appear to execute at the same time even when there is only one processor. On multiprocessor systems this is actually the case: the runnable threads are distributed among the available processors.

When to use multiple threads

Software that involves user interaction must react to the user's activities as quickly as possible to provide a rich experience, yet at the same time it must perform the calculations needed to present data to the user as fast as possible. If your application uses only one thread of execution, you can combine asynchronous programming with .NET remoting, or with XML Web services created using ASP.NET, to use the processing time of other computers in addition to your own, increasing responsiveness to the user and reducing the data-processing time of your application. If you are performing intensive input/output work, you can also use I/O completion ports to improve your application's responsiveness.

Advantages of multiple threads

However, using more than one thread is the most powerful technique available for increasing responsiveness to the user while processing the necessary data at almost the same time. On a computer with one processor, multiple threads can produce this effect by exploiting the small periods of time between user events to process data in the background. For example, a user can edit a spreadsheet while another thread in the same application recalculates other parts of the spreadsheet.

Without any modification, the same application will satisfy users even better when run on a computer with more than one processor. A single application domain can use multiple threads to accomplish the following tasks:

Communicate over a network with a Web server and a database.

Perform operations that consume large amounts of time.

Distinguish tasks of varying priority. For example, a high-priority thread manages time-critical tasks while a low-priority thread performs other tasks.

Allow the user interface to remain responsive while time is allocated to background tasks.

Disadvantages of multiple threads

It is recommended that you use as few threads as possible, thereby minimizing the use of operating-system resources and improving performance. Threading also has resource requirements and potential conflicts that must be considered when designing an application. The resource requirements are as follows: the system consumes memory for the context information required by processes, AppDomain objects, and threads. Therefore, the number of processes, AppDomain objects, and threads that can be created is limited by the available memory.

Keeping track of a large number of threads consumes significant processor time. If there are too many threads, most of them will not make significant progress. If most of the current threads are in one process, threads in other processes are scheduled less frequently.

Controlling code execution with many threads is complex and can be a source of many bugs.

Destroying threads requires knowing what could happen as a result and handling those issues.

Providing shared access to resources can create conflicts. To avoid conflicts, access to shared resources must be synchronized, or controlled. Improper synchronization of access (in the same or different application domains) can lead to problems such as deadlocks (in which two threads stop responding while each waits for the other to finish) and race conditions (in which an anomalous result occurs because of an unexpected critical dependence on the timing of two events). The system provides synchronization objects that can be used to coordinate resource sharing among multiple threads. Reducing the number of threads makes it easier to synchronize resources.

Resources that need to be synchronized include:

System resources (such as communication ports).

The resources shared by multiple processes (such as file handles).

Resources of a single application domain, such as global, static, and instance fields, that are accessed by multiple threads.

Threading and application design

In general, using the ThreadPool class is the easiest way to handle multiple threads for relatively short tasks that will not block other threads, and when you do not require any particular scheduling of the tasks. However, there are several reasons to create your own threads (a short sketch contrasting the two approaches follows the list below):

If you require a task to have a particular priority.

If you have a task that might run for a long time (and therefore block other tasks).

If you need to place a thread into a single-threaded apartment (all ThreadPool threads are in the multithreaded apartment).

If you need a stable identity associated with the thread. For example, you should use a dedicated thread if you want to abort it, suspend it, or discover it by name.
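To make the distinction concrete, here is a minimal C# sketch, not taken from the original text (the class and method names ThreadChoiceDemo and LongRunningWork are purely illustrative): a short task is queued to the ThreadPool, while a long-running task that needs a name and a priority gets its own dedicated thread.

    using System;
    using System.Threading;

    class ThreadChoiceDemo
    {
        static void Main()
        {
            // Short task with no special scheduling needs: let the ThreadPool run it.
            ThreadPool.QueueUserWorkItem(delegate(object state)
            {
                Console.WriteLine("Short task on a pool thread.");
            });

            // Long-running task that needs a stable identity and its own priority:
            // create a dedicated Thread instead of tying up a pool thread.
            Thread worker = new Thread(new ThreadStart(LongRunningWork));
            worker.Name = "Worker";
            worker.Priority = ThreadPriority.BelowNormal;
            worker.Start();

            worker.Join();      // wait for the dedicated thread to finish
        }

        static void LongRunningWork()
        {
            Thread.Sleep(1000); // stands in for a lengthy computation
            Console.WriteLine("Long task on thread '" + Thread.CurrentThread.Name + "'.");
        }
    }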

Managed threading support

The common language infrastructure provides three strategies for synchronizing access to instance and static methods and to instance fields:

Synchronized contexts. You can use SynchronizationAttribute to enable simple, automatic synchronization for ContextBoundObject objects.

Synchronized code regions. You can use the Monitor class, or compiler support for this class, to synchronize only the code blocks that need it.

Manual synchronization. You can use a variety of synchronization objects to create your own synchronization mechanisms.

The common language runtime provides a threading model in which classes fall into a number of categories that can be synchronized in a variety of different ways. The following table shows the synchronization support provided for fields and methods in each synchronization category.

Category                  | Global fields | Static fields | Static methods | Instance fields | Instance methods | Specific code blocks
No synchronization        | No            | No            | No             | No              | No               | No
Synchronized context      | No            | No            | No             | Yes             | Yes              | No
Synchronized code region  | No            | No            | Only if marked | No              | Only if marked   | Only if marked
Manual synchronization    | Manual        | Manual        | Manual         | Manual          | Manual           | Manual

No synchronization

This is the default for objects. Any thread can access any method or field at any time, but only one thread at a time should access these objects.

Synchronized context

You can use SynchronizationAttribute on any ContextBoundObject to synchronize all of its instance methods and fields. All objects in the same context domain share the same lock. Multiple threads are allowed to access the methods and fields, but only a single thread is allowed at any one time.
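As a hedged illustration (the Account class below is hypothetical and not part of the original text), a context-bound class marked with SynchronizationAttribute gets this automatic serialization of access without any explicit locking code:

    using System;
    using System.Runtime.Remoting.Contexts;

    [Synchronization]   // System.Runtime.Remoting.Contexts.SynchronizationAttribute
    public class Account : ContextBoundObject
    {
        private int balance;

        // Only one thread at a time can be inside any member of this instance;
        // the synchronized context, not the programmer, acquires and releases the lock.
        public void Deposit(int amount)
        {
            balance += amount;
        }

        public int GetBalance()
        {
            return balance;
        }
    }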

Synchronized code regions

You can use the Monitor class or a compiler keyword to synchronize blocks of code, instance methods, and static methods. There is no support for synchronized static fields.

Both Visual Basic and C# support marking blocks of code with a particular language keyword. The generated code tries to acquire a lock when it executes; if the lock has already been acquired, the executing code waits until the lock becomes available, and when the code exits the synchronized block the lock is released. You can also decorate a method with MethodImplAttribute, passing MethodImplOptions.Synchronized, which has the same effect as wrapping the whole method body with the compiler keyword or with Monitor calls. Thread.Interrupt can be used to break a thread out of blocking operations, such as waiting for access to a synchronized code region; Thread.Interrupt also breaks threads out of operations such as Thread.Sleep.
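A small sketch of the two equivalent forms described above, assuming nothing beyond MethodImplAttribute and the C# lock keyword (the Counter class is illustrative):

    using System.Runtime.CompilerServices;

    public class Counter
    {
        private int count;

        // The runtime holds the lock on the current instance for the whole call.
        [MethodImpl(MethodImplOptions.Synchronized)]
        public void IncrementSynchronized()
        {
            count++;
        }

        // The same effect written explicitly with the compiler keyword.
        public void IncrementWithLock()
        {
            lock (this)
            {
                count++;
            }
        }
    }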

The Monitor class supports the following kinds of code-block synchronization:

Synchronized instance methods. The current object (this in C#, Me in Visual Basic) is used for synchronization.

Synchronized static methods. The class itself is used for synchronization.

Compiler support

Both Visual Basic and C# support a language keyword that uses Monitor.Enter and Monitor.Exit to lock an object: Visual Basic provides the SyncLock statement and C# provides the lock statement.

In both cases, if an exception is thrown inside the code block, the lock acquired by lock or SyncLock is released automatically. The C# and Visual Basic compilers emit a try/finally block, calling Monitor.Enter at the start of the try and Monitor.Exit inside the finally block. If an exception is thrown inside the lock or SyncLock block, the finally handler runs so that you can perform any clean-up work.
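The expansion the compilers perform looks roughly like the following sketch (the MessageLog class is hypothetical, and the exact code a given compiler emits may differ):

    using System.Collections.Generic;
    using System.Threading;

    public class MessageLog
    {
        private readonly object syncRoot = new object();
        private readonly List<string> messages = new List<string>();

        // What you write with the C# lock keyword (SyncLock in Visual Basic)...
        public void AddWithLock(string message)
        {
            lock (syncRoot)
            {
                messages.Add(message);
            }
        }

        // ...is emitted as roughly this pattern: Monitor.Enter, the protected code in
        // a try block, and Monitor.Exit in the finally block, so the lock is released
        // even if messages.Add throws.
        public void AddExpanded(string message)
        {
            Monitor.Enter(syncRoot);
            try
            {
                messages.Add(message);
            }
            finally
            {
                Monitor.Exit(syncRoot);
            }
        }
    }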

Manual synchronization

The synchronization classes Interlocked, Monitor, ReaderWriterLock, ManualResetEvent, and AutoResetEvent can be used to acquire and release the locks that protect global, static, and instance fields, and global, static, and instance methods.
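A brief, illustrative sketch (the program and all of its names are hypothetical) using two of the classes listed above: Interlocked for an atomic counter and AutoResetEvent to signal a waiting thread.

    using System;
    using System.Threading;

    class ManualSyncDemo
    {
        static int requestCount;
        static readonly AutoResetEvent workReady = new AutoResetEvent(false);

        static void Main()
        {
            Thread consumer = new Thread(delegate()
            {
                workReady.WaitOne();                    // block until the producer signals
                Console.WriteLine("Handled " + requestCount + " request(s).");
            });
            consumer.Start();

            Interlocked.Increment(ref requestCount);    // atomic update, no lock required
            workReady.Set();                            // release the waiting consumer

            consumer.Join();
        }
    }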

Thread.Suspend, garbage collection, and safe points

When Thread.Suspend is called on a thread, the system does not suspend it immediately. Instead, it notes that a suspension has been requested and waits until the thread reaches a safe point before suspending it. A safe point for a thread is a point in its execution at which garbage collection can be performed.

There are two ways to reach a garbage-collection safe point:

If some code triggers a garbage collection and the common language runtime can successfully modify the return address on the stack, so that the code returns into the runtime instead of to its caller, the runtime suspends the thread at that point.

If for some reason a loop contains no calls at all (if it did contain calls, the runtime could modify a return address and bring the thread to a garbage-collection safe point), the just-in-time (JIT) compiler detects this condition and makes the entire method containing the loop interruptible. Every instruction in that method is then safe for garbage collection, because a garbage-collection table is created for each instruction.

To perform a garbage collection, all threads except the thread performing the collection must be suspended, and each thread must be brought to a safe point before it can be suspended. Bringing a thread to a safe point is not limited to garbage collection; for example, the runtime may take control of a thread in order to abort it.

Managed and unmanaged threading in Microsoft Windows

Management of all threads is done through the Thread class, including threads created by the common language runtime and threads created outside the runtime that enter the managed environment to execute code. The runtime monitors all the threads in its process that have ever executed code within the managed execution environment; it does not track any other threads. Threads can enter the managed execution environment through COM interop (because the runtime exposes managed objects to the unmanaged world as COM objects), through the COM DllGetClassObject() function, and through platform invoke. When an unmanaged thread enters the runtime, for example through a COM callable wrapper, the system checks the thread-local store of that thread to find an internal managed Thread object. If one is found, the runtime already knows about this thread; if one is not found, the runtime builds a new Thread object and installs it in the thread-local store of that thread.

In managed threading, Thread.GetHashCode is the stable managed thread identification. For the lifetime of the thread, its value will not collide with the value from any other thread, regardless of the application domain from which you obtain it.
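For example (a trivial sketch, not from the original text):

    using System;
    using System.Threading;

    class ThreadIdDemo
    {
        static void Main()
        {
            int id = Thread.CurrentThread.GetHashCode();
            Console.WriteLine("Managed thread id (hash code): " + id);

            // Reading it again later on the same thread yields the same value.
            Console.WriteLine(Thread.CurrentThread.GetHashCode() == id);   // True
        }
    }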

Note: Because an unmanaged host can control the relationship between managed and unmanaged threads, there is no fixed relationship between an operating-system thread ID and a managed thread. Specifically, a sophisticated host can use the Fiber API to schedule many managed threads against the same operating-system thread, or to move a managed thread among different operating-system threads.

Mapping Win32 threading to managed threading

The following table maps Win32 threading elements to their approximate runtime equivalents. Note that the mapping does not represent identical functionality. For example, TerminateThread does not execute finally clauses or free resources, and it cannot be prevented; Thread.Abort, by contrast, executes all of your rollback code, reclaims all resources, and can be refused by using ResetAbort. Be sure to read the documentation closely before making assumptions about functionality.

In Win32                                  | In the common language runtime
CreateThread                              | Combination of Thread and ThreadStart
TerminateThread                           | Thread.Abort
SuspendThread                             | Thread.Suspend
ResumeThread                              | Thread.Resume
Sleep                                     | Thread.Sleep
WaitForSingleObject on the thread handle  | Thread.Join
ExitThread                                | No equivalent
GetCurrentThread                          | Thread.CurrentThread
SetThreadPriority                         | Thread.Priority
No equivalent                             | Thread.Name
No equivalent                             | Thread.IsBackground
Close to CoInitializeEx (OLE32.DLL)       | Thread.ApartmentState

Managed threads and COM apartments

A managed thread can be marked to indicate that it will host a single-threaded or a multithreaded apartment. The Thread.ApartmentState property returns and assigns the apartment state of the thread. If the property has not been set, it returns ApartmentState.Unknown.

The property can be set only when the thread is in the ThreadState.Unstarted or ThreadState.Running state, and it can be set only once for a thread. The two valid property states are single-threaded apartment (STA) and multithreaded apartment (MTA). The apartment state has little effect on the managed portion of an application. The finalizer thread and all threads controlled by the ThreadPool are MTA.
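A short sketch using the ApartmentState property exactly as described above (later versions of the framework expose Thread.SetApartmentState instead; the demo class is illustrative):

    using System;
    using System.Threading;

    class ApartmentDemo
    {
        static void Main()
        {
            Thread staThread = new Thread(delegate()
            {
                Console.WriteLine("Apartment: " + Thread.CurrentThread.ApartmentState);
            });

            // The apartment can be assigned only while the thread is Unstarted or
            // Running, and only once; ThreadPool threads are always MTA.
            staThread.ApartmentState = ApartmentState.STA;
            staThread.Start();
            staThread.Join();
        }
    }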

Managed objects that are exposed to COM behave as if they had aggregated the free-threaded marshaler; in other words, they can be called from any COM apartment in a free-threaded manner. The only managed objects that do not exhibit this free-threaded behavior are those derived from ServicedComponent.

In the managed world there is no support for SynchronizationAttribute unless you use contexts and context-bound managed instances. If you use EnterpriseServices, your object must derive from ServicedComponent (which is itself derived from ContextBoundObject). When managed code calls out to COM objects, it always follows COM rules; in other words, it calls through COM apartment proxies and COM+ 1.0 context wrappers as dictated by OLE32.

Blocking issues

If a thread makes a call out to the operating system that blocks it in unmanaged code, the runtime cannot take control of it for Thread.Interrupt or Thread.Abort. In the Thread.Abort case, the runtime marks the thread for abort and takes control of it when it re-enters managed code. Whenever possible, you should use managed blocking rather than unmanaged blocking: WaitHandle.WaitOne, WaitAny, and WaitAll, Monitor.Enter, Monitor.Wait, Thread.Join, GC.WaitForPendingFinalizers, and so on all respond to Thread.Interrupt and Thread.Abort. Also, if the thread is in a single-threaded apartment, all of these managed blocking operations correctly pump messages in the apartment while the thread is blocked.
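To illustrate the difference, here is a hedged sketch in which a thread blocked in managed Thread.Sleep is released by Thread.Interrupt (the class name is illustrative):

    using System;
    using System.Threading;

    class InterruptDemo
    {
        static void Main()
        {
            Thread sleeper = new Thread(delegate()
            {
                try
                {
                    Thread.Sleep(Timeout.Infinite);   // managed blocking: responds to Interrupt
                }
                catch (ThreadInterruptedException)
                {
                    Console.WriteLine("Woken by Thread.Interrupt.");
                }
            });

            sleeper.Start();
            Thread.Sleep(100);       // give the sleeper time to block
            sleeper.Interrupt();     // would not take effect while blocked in unmanaged code
            sleeper.Join();
        }
    }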

Thread states

The Thread.ThreadState property provides a bit mask that indicates the thread's current state. A thread is always in at least one of the possible states of the ThreadState enumeration, and it can be in more than one state at the same time.

When a managed thread is created, it is in the Unstarted state. It remains in the Unstarted state until it is moved into the started state by a call to Thread.Start. Unmanaged threads that enter the managed environment are already in the started state. Once in the started state, a number of actions can cause the thread to change states. The following table lists the actions that cause a change of state, together with the corresponding new state.

Action                                                   | New state
Another thread calls Thread.Start                        | Unstarted (unchanged until the thread responds)
The thread responds to Thread.Start and begins running   | Running
The thread calls Thread.Sleep                            | WaitSleepJoin
The thread calls Monitor.Wait on another object          | WaitSleepJoin
The thread calls Thread.Join on another thread           | WaitSleepJoin
Another thread calls Thread.Suspend                      | SuspendRequested
The thread responds to Thread.Suspend                    | Suspended
Another thread calls Thread.Resume                       | Running
Another thread calls Thread.Interrupt                    | Running
Another thread calls Thread.Abort                        | AbortRequested
The thread responds to Thread.Abort                      | Aborted

Because the value of the Running state is 0, it is not possible to perform a bit test to detect it. Instead, the following test (shown in C#-style pseudocode) can be used:

    if ((state & (ThreadState.Unstarted | ThreadState.Stopped)) == 0)   // implies Running

A thread is often in more than one state at any given time. For example, if a thread is blocked in a call to Monitor.Wait and another thread calls Abort on it, the blocked thread will be in both the WaitSleepJoin and the AbortRequested states at the same time. In that case, as soon as the thread returns from the Wait call or is interrupted, it receives a ThreadAbortException and begins aborting.

Once a thread leaves the Unstarted state as the result of a call to Thread.Start, it can never return to the Unstarted state. Likewise, a thread can never leave the Stopped state.

Foreground and background threads

A managed thread is either a background thread or a foreground thread. Background threads are identical to foreground threads with one exception: a background thread does not keep the managed execution environment running. Once all foreground threads have stopped in a managed process (where the .exe file is a managed assembly), the system stops all background threads and shuts down. You specify whether a thread is a background or a foreground thread by setting the Thread.IsBackground property: setting Thread.IsBackground to true makes the thread a background thread, and setting IsBackground to false makes it a foreground thread. All threads that enter the managed execution environment from unmanaged code are marked as background threads. All threads generated by creating and starting a new Thread object are foreground threads. If you create a thread to listen for some activity, such as a socket connection, set Thread.IsBackground to true so that the thread does not prevent your process from terminating.
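A minimal sketch of that last recommendation (the endless loop below only stands in for real work such as waiting on a socket):

    using System;
    using System.Threading;

    class BackgroundDemo
    {
        static void Main()
        {
            Thread listener = new Thread(delegate()
            {
                while (true)
                {
                    Thread.Sleep(1000);   // stands in for waiting on a connection
                }
            });

            listener.IsBackground = true; // the process can exit once the foreground threads finish
            listener.Start();

            Console.WriteLine("Main (foreground) thread exiting; the background listener dies with the process.");
        }
    }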

Thread local storage and thread-relative static fields

You can store data that is unique to a thread and an application domain by using either managed thread local storage (TLS) or thread-relative static fields. Use thread-relative static fields if you can anticipate your exact needs at compile time; use managed thread local storage if you discover your actual needs only at run time.

In unmanaged C++, TlsAlloc is used to allocate slots dynamically, and __declspec(thread) is used to declare that a variable should be allocated relative to a thread. Thread local storage and thread-relative static fields provide the managed versions of this behavior.

Thread local storage

Managed thread local storage provides dynamic data slots that are unique to a combination of thread and application domain. There are two kinds of data slots: named slots and unnamed slots. Named slots can be convenient because you can retrieve a slot by a mnemonic identifier when you need it; however, other components can modify your thread-relative data, intentionally or unintentionally, by using the same name for their own slots. By contrast, as long as an unnamed data slot is not exposed to other code, it cannot be used by any other component.

To use managed TLS, create a data slot with Thread.AllocateDataSlot and use the appropriate methods to set or retrieve the information in the slot.
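For example, here is a small sketch using an unnamed slot (the class and the stored strings are illustrative; a named slot would be used the same way, just allocated by name). Each thread sees only the value it stored itself.

    using System;
    using System.Threading;

    class TlsDemo
    {
        // One slot shared by all threads; the data stored in it is per-thread.
        static readonly LocalDataStoreSlot slot = Thread.AllocateDataSlot();

        static void Main()
        {
            Thread a = new Thread(delegate() { Work("value for thread A"); });
            Thread b = new Thread(delegate() { Work("value for thread B"); });
            a.Start(); b.Start();
            a.Join(); b.Join();
        }

        static void Work(string value)
        {
            Thread.SetData(slot, value);
            Console.WriteLine("Thread " + Thread.CurrentThread.GetHashCode()
                + " sees: " + Thread.GetData(slot));
        }
    }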

Thread-relative static fields

If you know that a field of your type should always be unique to a thread and application domain combination, decorate the static field with ThreadStaticAttribute. Note that any class constructor code runs only on the first thread in the first context that accesses the field; in all other threads or contexts, the field is initialized to NULL (Nothing in Visual Basic). Therefore, you should not rely on class constructors to initialize thread-relative static fields, and you should always assume that thread-relative static fields are initialized to NULL (Nothing).
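A short sketch of a thread-relative static field, with the per-thread value built up by the code that runs on each thread rather than by a constructor (the class is illustrative):

    using System;
    using System.Threading;

    class PerThreadCounter
    {
        [ThreadStatic]
        static int callCount;          // each thread gets its own copy, starting at the default value

        static void Work()
        {
            for (int i = 0; i < 3; i++)
            {
                callCount++;           // touches only the current thread's copy
            }
            Console.WriteLine("Thread " + Thread.CurrentThread.GetHashCode()
                + " counted " + callCount);
        }

        static void Main()
        {
            Thread t1 = new Thread(new ThreadStart(Work));
            Thread t2 = new Thread(new ThreadStart(Work));
            t1.Start(); t2.Start();
            t1.Join(); t2.Join();      // each thread prints 3, not a shared total of 6
        }
    }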

